## Entropy

Entropy captures the expected amount of work an attacker must do to guess our secret:

$Entropy(X) = -\sum_{x \in X}{P(X=x)\log_2 P(X=x)}$

This relies on the fact that different values occur with different probabilities. If only one value were possible, the probability of that value would be 1 and the probability of every other value would be 0, so the total entropy would be 0 ($\log_2(1) = 0$).

**Example:** For an $n$-bit random secret, each of the $2^n$ possible values has probability $\frac{1}{2^n}$, so

$Entropy = -2^n \cdot \frac{1}{2^n} \cdot \log_2\left(\frac{1}{2^n}\right) = n$

**Question:** What is the entropy of an 8-character password drawn uniformly from upper-case letters, lower-case letters, digits, and special characters?

$Total \space characters = 26 + 26 + 10 + 6 = 68$

$Entropy = -68^8 \cdot \frac{1}{68^8} \cdot \log_2\left(\frac{1}{68^8}\right) = \log_2(68^8) \approx 48.7 \text{ bits}$

### Entropy and work factor

The work factor is the amount of work an attacker must do to successfully guess the secret. If the distribution of password choices is already known, the attacker's best strategy is to try the most popular passwords first:

$E(Attacker \space Work) = \sum_{i}{i \cdot P(x_i)}$

where $P(x_1) \ge P(x_2) \ge \dots$

The guessing entropy is $\log_2 E(\cdot)$, i.e. $\log_2(\text{Work Factor})$, expressed in bits.
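The calculations above can be checked with a short Python sketch. The function names (`shannon_entropy`, `expected_work`) and the skewed example distribution are illustrative, not part of the notes:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def expected_work(probs):
    """Expected number of guesses when trying values in
    decreasing order of probability (the work factor)."""
    ordered = sorted(probs, reverse=True)
    return sum(i * p for i, p in enumerate(ordered, start=1))

# Uniform n-bit secret: entropy equals n.
n = 8
uniform = [1 / 2**n] * 2**n
print(shannon_entropy(uniform))        # 8.0

# 8-character password over a 68-symbol alphabet, chosen uniformly:
# log2(68^8) = 8 * log2(68).
print(8 * math.log2(68))               # ~48.7 bits

# A skewed (hypothetical) password distribution: the attacker's
# expected work is far below the uniform case, and the guessing
# entropy log2(E[work]) is correspondingly small.
skewed = [0.5, 0.25, 0.15, 0.10]
work = expected_work(skewed)
print(work, math.log2(work))           # work factor, guessing entropy (bits)
```

Note how guessing entropy depends on the ordering of probabilities, not just their values: a highly skewed distribution keeps the attacker's expected work low even when the Shannon entropy is nonzero.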