What is Entropy?
There are many mathematical definitions of entropy. The mental picture I find most useful is to imagine the following:
- You are put in a room, and your job is to label everything in the room with a Sharpie indelible marker and masking tape.
- You are asked to label everything in the room using the binary numbering system. This binary number will be that particular object's ID.
As you go about this, you may want to give the objects you most commonly refer to the lower numbers, since those have fewer digits. That way, because you mention "FORK" much more often than "NUMBER 6 SCREW", you end up saying fewer digits overall.
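This frequency-based labeling scheme can be sketched in a few lines of Python. The objects and mention counts below are made-up illustrative values, not anything from the room described above:

```python
# Assign binary IDs so the most frequently mentioned objects get the shortest ones.
# Mention counts are hypothetical, for illustration only.
mentions = {"FORK": 50, "SPOON": 30, "LAMP": 10, "NUMBER 6 SCREW": 2}

# Sort by frequency, then number objects in that order: frequent objects get
# small numbers, and small numbers have short binary representations.
ordered = sorted(mentions, key=mentions.get, reverse=True)
labels = {name: bin(i)[2:] for i, name in enumerate(ordered)}

for name, label in labels.items():
    print(f"{name}: {label}")
# FORK gets "0" (one digit); NUMBER 6 SCREW gets "11" (two digits).
```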
The entropy of this room is, roughly, the number of binary digits required to number all the objects. The formula for that statement is:
Entropy ≈ log2(N), where N is the number of different types of objects in the room.
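This formula is easy to check numerically. The sketch below assumes a hypothetical room with 100 distinct object types and asks how many bits a fixed-length binary ID needs:

```python
import math

# Hypothetical number of distinct object types in the room.
n_objects = 100

# Bits needed to give every object a distinct fixed-length binary ID:
# we need 2**bits >= n_objects, i.e. bits = ceil(log2(n_objects)).
bits_needed = math.ceil(math.log2(n_objects))

print(bits_needed)  # 7, since 2**7 = 128 >= 100 but 2**6 = 64 < 100
```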
Now consider a probabilistic situation with outcomes x1, x2, …, xn, where P(xi) is the probability of outcome xi.
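For this probabilistic setup, the standard (Shannon) entropy averages the code length over outcomes, weighting each by its probability: H = -Σ P(xi) log2(P(xi)). A minimal Python sketch:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over 8 outcomes recovers the room formula exactly:
# H = log2(8) = 3 bits.
print(shannon_entropy([1 / 8] * 8))  # 3.0
```

Note that when all N outcomes are equally likely, this reduces to log2(N), matching the room-labeling picture above.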
…..more
