Randomness and Entropy

Notation and Assumptions:

  An event $E$ has possible outcomes occurring with probabilities $p_1, \ldots, p_n$, where $\sum_i p_i = 1$. The entropy of $E$, measured in bits, is $H(E) = -\sum_{i=1}^{n} p_i \log_2 p_i$, with the convention that $p \log_2 p = 0$ when $p = 0$.

________________________________________________________________________________________

Examples:

  1. Flipping a coin: $P(\text{heads}) = \frac{1}{2}$, $P(\text{tails}) = \frac{1}{2}$

    $H = -\frac{1}{2}\log_2 \frac{1}{2} - \frac{1}{2}\log_2 \frac{1}{2} = 1$

    When I flip a coin I expect to learn 1 bit of information!

  2. Flipping a two-headed coin: $P(\text{heads}) = 1$, $P(\text{tails}) = 0$

    $H = -1 \cdot \log_2 1 = 0$

    When I flip a two headed coin I expect to learn nothing!

  3. Selecting a colored ball from an urn containing a red, green, yellow, and blue ball:

    $P(\text{red}) = P(\text{green}) = P(\text{yellow}) = P(\text{blue}) = \frac{1}{4}$

    $H = -\frac{1}{4}\log_2 \frac{1}{4} - \frac{1}{4}\log_2 \frac{1}{4} - \frac{1}{4}\log_2 \frac{1}{4} - \frac{1}{4}\log_2 \frac{1}{4}$

    $= -\log_2 \frac{1}{4} = 2$

    The ball is either (red or green) or (yellow or blue). Learning which pair it belongs to gives me one bit of information; learning which member of that pair it is gives me another bit.

  4. Selecting a colored ball from an urn containing one blue and three yellow balls:

    $P(\text{blue}) = \frac{1}{4}$, $P(\text{yellow}) = \frac{3}{4}$

    $H = -\frac{1}{4}\log_2 \frac{1}{4} - \frac{3}{4}\log_2 \frac{3}{4}$

    $= \frac{1}{4} \cdot 2 + \frac{3}{4}\log_2 \frac{4}{3}$

    $\approx 0.811$

    I expect to learn less than 2 bits of information every time I draw a ball.
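The four examples above can be checked numerically. Here is a minimal sketch in Python; the helper `entropy` is my own illustration of the standard formula $H = -\sum_i p_i \log_2 p_i$, not code from the notes:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits; p = 0 terms are skipped since p*log2(p) -> 0."""
    return sum(-p * log2(p) for p in probs if p > 0)

print(entropy([1/2, 1/2]))   # fair coin: 1.0 bit
print(entropy([1.0, 0.0]))   # two-headed coin: 0.0 bits
print(entropy([1/4] * 4))    # uniform four-ball urn: 2.0 bits
print(entropy([1/4, 3/4]))   # one blue, three yellow: about 0.811 bits
```

The last value confirms that the biased urn yields strictly less than the 2 bits of the uniform urn.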

Properties of Entropy:

  1. If I have an event $E$ with one possible outcome, then $H(E) = 0$.

  2. If I have an event $E$ with two equiprobable outcomes, then $H(E) = 1$.

  3. If I have two independent events $E$ and $F$, then $H(E, F) = H(E) + H(F)$.
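Property 3 can be sketched numerically: for independent events the joint probabilities are the products of the marginal ones, and the entropies add. The pairing below (a fair coin with the uniform four-ball urn) is my own choice of illustration:

```python
from math import log2

def entropy(probs):
    # Shannon entropy in bits; p = 0 terms are skipped since p*log2(p) -> 0
    return sum(-p * log2(p) for p in probs if p > 0)

p = [1/2, 1/2]    # event E: fair coin, H(E) = 1 bit
q = [1/4] * 4     # event F: uniform four-ball draw, H(F) = 2 bits

# Independence: each joint outcome (i, j) has probability p[i] * q[j]
joint = [pi * qj for pi in p for qj in q]  # 8 outcomes, each 1/8

print(entropy(joint))             # 3.0
print(entropy(p) + entropy(q))    # 3.0
```

Both prints agree, matching $H(E, F) = H(E) + H(F)$ for independent $E$ and $F$.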