We will work with finite probabilities as a reference. In particular, a probability space is a finite set of "events" $X = \{x_1, \dots, x_n\}$, with a probability $p_i$ assigned to each event $x_i$, such that $p_i \ge 0$ and $\sum_{i=1}^{n} p_i = 1$. We will just let the symbol $X$ stand for the probability space $(X, p)$.
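To make the definition concrete, here is a minimal sketch in Python (the names `ProbSpace` and `validate` are illustrative choices, not notation from these notes) representing a finite probability space as a mapping from events to probabilities and checking the two defining conditions.

```python
from math import isclose

# Illustrative representation (an assumption, not the notes' notation):
# a finite probability space as a dict mapping each event to its probability.
ProbSpace = dict

def validate(space: ProbSpace) -> None:
    """Check the defining conditions: every p_i >= 0 and the p_i sum to 1."""
    assert all(p >= 0 for p in space.values()), "probabilities must be non-negative"
    assert isclose(sum(space.values()), 1.0), "probabilities must sum to 1"

coin = {"heads": 0.5, "tails": 0.5}
validate(coin)  # passes: 0.5 + 0.5 == 1
```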
We want to talk about the entropy $H(X)$ of such a space. Consider some examples:
Flipping a coin:
When I flip a fair coin, I expect to learn 1 bit of information!
Flipping a two-headed coin:
When I flip a two-headed coin, I expect to learn nothing!
Selecting a colored ball from an urn containing a red, green, yellow, and blue ball:
The ball is either (red or green) or (yellow or blue). Learning which of these two pairs it belongs to gives me one bit of information; learning which member of that pair it is gives me another bit, for 2 bits in total.
Selecting a colored ball from an urn containing one blue and three yellow balls:
I expect to learn less than 2 bits of information every time I draw a ball.
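These expectations can be checked numerically. The sketch below uses the standard Shannon entropy formula $H(X) = -\sum_i p_i \log_2 p_i$; this formula is what the examples are meant to motivate and has not been derived yet at this point in the notes.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits; outcomes with probability 0 contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(entropy([1.0]))         # two-headed coin: 0.0 bits
print(entropy([0.25] * 4))    # red/green/yellow/blue urn: 2.0 bits
print(entropy([0.25, 0.75]))  # one blue, three yellow: about 0.811 bits, less than 2
```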
If I have an event $X$ with one possible outcome, then $H(X) = 0$.
If I have an event $X$ with two equiprobable outcomes, then $H(X) = 1$.
If I have two independent events $X$ and $Y$, then $H(X \times Y) = H(X) + H(Y)$.
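These requirements can also be checked numerically. The sketch below (again assuming the Shannon formula $H = -\sum_i p_i \log_2 p_i$) verifies each one, taking the product space of independent events to have probabilities $p_i q_j$.

```python
from math import log2, isclose

def entropy(probs):
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

# One certain outcome carries no information.
assert entropy([1.0]) == 0.0

# Two equiprobable outcomes carry exactly one bit.
assert entropy([0.5, 0.5]) == 1.0

# For independent X and Y, the product space X x Y has probabilities p_i * q_j,
# and its entropy is the sum of the individual entropies.
X = [0.25, 0.75]
Y = [0.5, 0.5]
XY = [p * q for p in X for q in Y]
assert isclose(entropy(XY), entropy(X) + entropy(Y))
```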