Symbol                    Probability
A                         1/2^3
B                         1/2^4
C                         1/2^4
D                         1/2^3
E                         1/2^1
F                         1/2^4
G                         1/2^4

Symbol                    Probability
E                         1/2^1
A                         1/2^3
D                         1/2^3
F                         1/2^4
G                         1/2^4
B                         1/2^4
C                         1/2^4

Symbol                    Probability
E                         1/2^1
A                         1/2^3
D                         1/2^3
[B,C]                     1/2^3
[F,G]                     1/2^3

Symbol                    Probability
E                         1/2^1
[[B,C],[F,G]]             1/2^2
[A,D]                     1/2^2

Symbol                    Probability
E                         1/2^1
[[[B,C],[F,G]],[A,D]]     1/2^1
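
A minimal Python sketch of the merging step carried out above (the letters and probabilities are taken from the first table; ties between equal probabilities are broken arbitrarily, so the intermediate groupings may differ slightly from the tables, though the resulting code lengths come out the same):

from fractions import Fraction

# Starting alphabet: Probability(a_i) = 1/2^{l_i}.
table = {
    "A": Fraction(1, 2**3), "B": Fraction(1, 2**4), "C": Fraction(1, 2**4),
    "D": Fraction(1, 2**3), "E": Fraction(1, 2**1), "F": Fraction(1, 2**4),
    "G": Fraction(1, 2**4),
}

def show(table):
    print("Symbol                    Probability")
    for sym, p in sorted(table.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{sym:<26}{p}")
    print()

show(table)
# Repeatedly merge the two entries of lowest probability into a bracketed
# pair; the loop runs until a single group of probability 1 remains (one
# step past the last table above).
while len(table) > 1:
    (s1, p1), (s2, p2) = sorted(table.items(), key=lambda kv: kv[1])[:2]
    del table[s1], table[s2]
    table[f"[{s1},{s2}]"] = p1 + p2
    show(table)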

Symbol                    Code
E                         0
[[[B,C],[F,G]],[A,D]]     1

Symbol                    Code
E                         0
[[B,C],[F,G]]             10
[A,D]                     11

Symbol                    Code
E                         0
[B,C]                     100
[F,G]                     101
A                         110
D                         111

Symbol                    Code
E                         0
B                         1000
C                         1001
F                         1010
G                         1011
A                         110
D                         111
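
The codeword tables can be reproduced with a standard heap-based Huffman construction. Below is a sketch in plain Python (standard library only); which subtree receives the bit 0 is an arbitrary convention, so the exact bit patterns can differ from the tables above, but the codeword lengths (1 for E, 3 for A and D, 4 for B, C, F, and G) come out the same.

import heapq
from fractions import Fraction

probs = {
    "A": Fraction(1, 8), "B": Fraction(1, 16), "C": Fraction(1, 16),
    "D": Fraction(1, 8), "E": Fraction(1, 2),  "F": Fraction(1, 16),
    "G": Fraction(1, 16),
}

# Heap entries are (probability, tiebreak, subtree); a subtree is either a
# single letter (a leaf) or a pair (left, right) of subtrees.
heap = [(p, i, sym) for i, (sym, p) in enumerate(probs.items())]
heapq.heapify(heap)
counter = len(heap)
while len(heap) > 1:
    p1, _, left = heapq.heappop(heap)    # lowest probability
    p2, _, right = heapq.heappop(heap)   # second lowest
    heapq.heappush(heap, (p1 + p2, counter, (left, right)))
    counter += 1

codes = {}
def assign(tree, prefix=""):
    if isinstance(tree, str):            # leaf: a single letter
        codes[tree] = prefix or "0"
        return
    left, right = tree
    assign(left, prefix + "0")           # which side gets 0 is a convention
    assign(right, prefix + "1")

assign(heap[0][2])
print("Symbol  Code")
for sym, code in sorted(codes.items(), key=lambda kv: (len(kv[1]), kv[1])):
    print(f"{sym:<8}{code}")

Note that each letter's codeword length equals l_i = log_2(1/Probability(a_i)), which is exactly the statement of the theorem below.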

Lemma:
Suppose we have a set of letters {a_1, a_2, ..., a_n} with Probability(a_i) = 1/2^{l_i}, where each l_i is an integer greater than 0. If l = max(l_1, l_2, ..., l_n), then the number of letters with Probability(a_i) = 1/2^l is even.

Proof:
Exercise. Remember that 1/2^{l_1} + 1/2^{l_2} + ... + 1/2^{l_n} = 1.
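
One way to carry out the exercise (sketched here in LaTeX notation): multiply the identity above by 2^l and compare parities.

\sum_{i=1}^{n} \frac{1}{2^{l_i}} = 1
\quad\Longrightarrow\quad
\sum_{i=1}^{n} 2^{\,l - l_i} = 2^{l}.
% Every term with l_i < l is even, and 2^l is even (since l >= 1), so the
% number of terms with l_i = l, each of which contributes 1, must be even.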

Theorem: Suppose we have a set of letters {a_1, a_2, ..., a_n} with Probability(a_i) = 1/2^{l_i}, where each l_i is an integer greater than 0. Then in the associated Huffman code, the length of the code for a_i is l_i.

Proof:
Exercise: This is an induction argument. Start with {a_1, a_2}; then, assuming a_{n-1} and a_n are the letters of lowest probability, look at the Huffman code for {a_1, a_2, ..., a_{n-2}, [a_{n-1}, a_n]}.
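
A sketch of the bookkeeping in the inductive step, in LaTeX notation (by the Lemma, the two letters of lowest probability both have probability 1/2^l, where l = max l_i):

\frac{1}{2^{l}} + \frac{1}{2^{l}} = \frac{1}{2^{\,l-1}},
% so {a_1, a_2, ..., a_{n-2}, [a_{n-1}, a_n]} again has probabilities that are
% powers of 1/2. By the induction hypothesis the code for [a_{n-1}, a_n] has
% length l - 1, and appending one bit gives a_{n-1} and a_n codes of length l.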
_____________________________________________________________________________

Shannon's First Big Theorem: In terms of the expected length of an encoded string of letters, you cannot do any better than the Huffman code.
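
For the seven-letter alphabet above, the expected codeword length works out to exactly the entropy of the source, which is the lower bound Shannon's theorem is about (a worked check in LaTeX notation; the equality is special to probabilities that are powers of 1/2):

E[L] = \sum_i p_i\, l_i
     = \tfrac{1}{2}(1) + 2 \cdot \tfrac{1}{8}(3) + 4 \cdot \tfrac{1}{16}(4)
     = 2.25 \text{ bits per letter},

H = \sum_i p_i \log_2 \tfrac{1}{p_i}
  = \tfrac{1}{2}(1) + 2 \cdot \tfrac{1}{8}(3) + 4 \cdot \tfrac{1}{16}(4)
  = 2.25 \text{ bits per letter}.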