If necessary, use the approximations.
1. Shannon's Noiseless Coding Theorem (20 points):
State Shannon's Noiseless Coding Theorem.
Compute the entropy of the following alphabet and probability distribution to two decimal places.
Symbol  Probability
A       1/4
B       1/6
C       1/12
D       1/6
E       1/4
F       1/12
Construct the associated Huffman Code.
What does Shannon's Noiseless Coding Theorem say about expected code lengths? Confirm this numerically by computing the
expected lengths of encoded symbols in the Huffman Code.
Answer:
Given a set of symbols S with probability distribution p, the entropy is H(p) = -sum_i p_i log2(p_i), and Shannon's Noiseless Coding Theorem says that any binary prefix code C for S satisfies E[length of C] >= H(p) (and an optimal prefix code, such as the Huffman code, achieves E[length of C] < H(p) + 1).
Here H(p) = 2*(1/4)*log2(4) + 2*(1/6)*log2(6) + 2*(1/12)*log2(12) ≈ 1 + 0.86 + 0.60 ≈ 2.46 bits, while the Huffman code constructed in the tables at the end of this document has expected length (1/4)*2 + (1/6)*2 + (1/12)*4 + (1/6)*3 + (1/4)*2 + (1/12)*4 = 2.5 bits >= 2.46 bits, as the theorem requires.
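The two quantities the problem asks to compare can be checked directly. A short Python sketch (the code-word lengths are taken from the Huffman code constructed in the tables at the end of this document):

```python
from fractions import Fraction
from math import log2

# Probability distribution from Problem 1.
probs = {"A": Fraction(1, 4), "B": Fraction(1, 6), "C": Fraction(1, 12),
         "D": Fraction(1, 6), "E": Fraction(1, 4), "F": Fraction(1, 12)}
assert sum(probs.values()) == 1

# Entropy H(p) = -sum_i p_i log2(p_i).
H = -sum(float(p) * log2(float(p)) for p in probs.values())

# Code-word lengths of the Huffman code D=000, F=0010, C=0011, A=01, E=10, B=11.
lengths = {"A": 2, "B": 2, "C": 4, "D": 3, "E": 2, "F": 4}
expected_len = float(sum(probs[s] * lengths[s] for s in probs))

print(round(H, 2), expected_len)  # entropy ≈ 2.46 bits <= expected length 2.5 bits
```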
2. Shannon's Noisy Coding Theorem for a Binary Symmetric Channel (15 points):
Let p be the channel's bit-error probability and let ε > 0; then for sufficiently large integers m and n with m/n < 1 - H(p), we can find:
an (n, m) code with:
an encoding E: {0,1}^m → {0,1}^n,
a decoding D: {0,1}^n → {0,1}^m ∪ {⊥}, where ⊥ is to be read "cannot be unambiguously decoded",
such that: the probability that a transmitted code word is decoded incorrectly (or not at all) is less than ε.
While the Noisy Coding Theorem says little about the size of the code necessary to achieve a low error rate for a given p, it does constrain various ratios, in particular the ratio of code bits to error checking/correcting bits.
What can you say about that ratio for the given p?
Hint: If k = n - m is the number of error checking/correcting bits, then k/n = 1 - m/n.
Answer:
For m/n < 1 - H(p),
m/n < 1 - H(p)
implies 1 - m/n > H(p),
implies (n - m)/n > H(p),
implies k/n > H(p);
thus in this case the fraction of the code bits devoted to error checking/correcting must exceed H(p).
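The constraint turns on the binary entropy function H(p). As a sketch, the snippet below evaluates it; the value p = 1/8 is only an illustrative assumption, since the problem's specific p is not shown here:

```python
from math import log2

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p) for 0 < p < 1."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Illustrative assumption: p = 1/8 (the exam's actual p is not shown here).
p = 1 / 8
h = binary_entropy(p)

# For an encoding of m message bits into n code bits with m/n < 1 - H(p),
# the k = n - m check bits satisfy k/n = 1 - m/n > H(p):
# the fraction of error checking/correcting bits must exceed H(p).
print(h)  # H(1/8) ≈ 0.5436
```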
3. (20 Points): Returning to the four-ball weighing problem (3 balls the same and 1 heavier or lighter) and adopting the following notation:
Each ball is written (X,Y) where X is 1, 2, 3, or 4 and Y is G, H, L, or ? for Good, Heavier, Lighter, or Don't Know.
The result of each experiment is D, U, or B for left side Down, Up, or scale Balances.
First, what is the Shannon Clue?
Next, assume your strategy's first experiment is (1,?) on the left, (2,?) on the right, and (3,?) and (4,?) off the scale.
Solve the problem in two ways: first, lucky, assuming the result of the first experiment is D; and second, unlucky, assuming the result of the first experiment is B. In each case:
How much Shannon Information is obtained from the first experiment?
What are the necessary subsequent experiments that may be required, worst case?
How much Shannon Information is obtained from each of these experiments?
Finally, what is the total Shannon Information that was obtained by each of these two sequences of experiments?
Answer:
Since there are 8 equiprobable outcomes (each of the 4 balls may be Heavier or Lighter), the Shannon Clue is that we will need log2(8) = 3 bits of Information to solve the problem.
First experiment: (1,?) on the left, (2,?) on the right, and (3,?) and (4,?) off the scale.
Result D - (1,?) is heavier or (2,?) is lighter; (3,G) and (4,G). Shannon Information obtained: 2 bits, since probability = 2/8 = 1/4.
  Second experiment: (1,?) on the left, (3,G) on the right.
  Result D - (1,H), (2,G). Shannon Information obtained: 1 bit, since probability = 1/2.
    The odd ball is (1,H). The total Shannon Information obtained by this result sequence is 3 bits.
  Result B - (1,G), (2,L). Shannon Information obtained: 1 bit, since probability = 1/2.
    The odd ball is (2,L). The total Shannon Information obtained by this result sequence is 3 bits.
Result B - (1,G), (2,G), (3,?) and (4,?). Shannon Information obtained: 1 bit, since probability = 4/8 = 1/2.
  Second experiment: (1,G) on the left, (3,?) on the right.
  Result B - (3,G). Shannon Information obtained: 1 bit, since probability = 1/2.
    Third experiment: (1,G) on the left, (4,?) on the right.
    Result D - (4,L), since the left side going down means ball 4 is lighter. Shannon Information obtained: 1 bit, since probability = 1/2.
    Result U - (4,H). Shannon Information obtained: 1 bit, since probability = 1/2.
      The odd ball is (4,H) or (4,L). The total Shannon Information obtained by this result sequence is 3 bits.
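The information bookkeeping in both branches can be verified mechanically; a minimal Python sketch:

```python
from math import log2

def info(p):
    """Shannon information (bits) of an outcome with probability p."""
    return -log2(p)

# 8 equiprobable states: each of balls 1-4 may be Heavier or Lighter.
needed = log2(8)  # the Shannon Clue: 3 bits

# Lucky branch: result D (2 of 8 states), then D or B (each probability 1/2).
lucky = info(2 / 8) + info(1 / 2)

# Unlucky branch: result B (1/2), then B (1/2), then D or U (1/2).
unlucky = info(1 / 2) + info(1 / 2) + info(1 / 2)

print(needed, lucky, unlucky)  # all three equal 3.0 bits
```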
4. (20 Points):
Perform a Hamming Ball calculation to show that we can't find a (4, 2) Hamming Code, that is, a linear transformation {0,1}^2 → {0,1}^4 so that all code vectors are Hamming Distance 3 apart.
We can find a (5, 2) Hamming Code. Define an appropriate 2 × 5 generator matrix G. The rows of the matrix should be the code words for the messages 10 and 01, but we need to be sure that the other two code words, for 00 and 11, are also sufficiently Hamming Distance separated.
Perform a Hamming Ball calculation to show that this (5, 2) Hamming Code is not Perfect.
Answer:
Since {0,1}^2 has 4 vectors, we would need 4 disjoint Hamming Balls of radius 1 in {0,1}^4, each with 1 + 4 = 5 vectors, a total of 20 vectors. However, {0,1}^4 has only 16 vectors, so no such code exists.
Letting the rows of G be two vectors in {0,1}^5, each of weight at least 3 and Hamming Distance at least 3 apart (for example, 10110 and 01011), we have four code words (the zero vector, the two rows, and their sum), all pairwise Hamming Distance at least 3 apart.
Again we need 4 disjoint radius-1 Hamming Balls, now in {0,1}^5, each with 1 + 5 = 6 vectors, a total of 24 vectors. However, {0,1}^5 has 32 vectors, too many for the balls to cover exactly, so the code is not Perfect.
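Both ball-counting arguments, and the minimum distance of one possible (5, 2) code, can be checked in a few lines of Python. The generator rows used here (10110 and 01011) are an assumed example, not necessarily the matrix intended by the exam:

```python
from itertools import combinations
from math import comb

def ball_size(n, r):
    """Number of vectors within Hamming distance r of a point in {0,1}^n."""
    return sum(comb(n, k) for k in range(r + 1))

# (4, 2): four disjoint radius-1 balls need 4 * 5 = 20 vectors, but |{0,1}^4| = 16.
print(4 * ball_size(4, 1), 2 ** 4)   # 20 > 16, so no such code exists

# (5, 2): assumed example generator rows (not necessarily the exam's matrix).
g1 = (1, 0, 1, 1, 0)
g2 = (0, 1, 0, 1, 1)
zero = (0, 0, 0, 0, 0)
g12 = tuple((a + b) % 2 for a, b in zip(g1, g2))  # code word for message 11
code = [zero, g1, g2, g12]

dist = lambda u, v: sum(a != b for a, b in zip(u, v))
min_dist = min(dist(u, v) for u, v in combinations(code, 2))
print(min_dist)  # 3: all code words are Hamming distance >= 3 apart

# Not perfect: 4 disjoint radius-1 balls hold 4 * 6 = 24 < 32 = |{0,1}^5|.
print(4 * ball_size(5, 1), 2 ** 5)
```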
5. (25 Points): Solve the following variant of the Three Doors Problem:
The Host hides a money prize behind one of three randomly chosen doors, each with probability 1/3.
The Player chooses door 1, knowing the Host's preference.
The Host opens either door 2 or door 3 using the following rules:
If the money is behind door 1, open door 2 with probability 2/3 and door 3 with probability 1/3.
If the money is behind door 2, open door 3; and if it is behind door 3, open door 2.
The Question: Suppose the Host opens door 3; should the Player open door 1 or open door 2?
Your answer should consist of filling in the following outline.
The Random Variable: The door that the money is hidden behind: M, which takes on the values 1, 2, 3, each with probability 1/3.
The Conditional Distribution: P(H = h | M = m), for each door h the Host may open.
The Random Variable: The door that the Host opens: H, which takes on the values 2, 3, with probabilities P(H = 2) and P(H = 3).
The Posterior Distribution: P(M = m | H = 3).
The Decoding Strategy: which door the Player should open given H = 3 and given H = 2.
The Marginal Information: I(M; H), using any formula version.
Answer:
The Random Variable: The door that the money is hidden behind: M, which takes on the values 1, 2, 3, each with probability P(M = m) = 1/3.
The Conditional Distribution: P(H = 2 | M = 1) = 2/3, P(H = 3 | M = 1) = 1/3, P(H = 3 | M = 2) = 1, P(H = 2 | M = 3) = 1.
The Random Variable: The door that the Host opens: H, which takes on the values 2, 3 with P(H = 2) = 2/9 + 3/9 = 5/9 and P(H = 3) = 1/9 + 3/9 = 4/9, since P(H = h) = sum over m of P(H = h | M = m) P(M = m).
The Posterior Distribution: given H = 3, P(M = 1 | H = 3) = (1/9)/(4/9) = 1/4, P(M = 2 | H = 3) = (3/9)/(4/9) = 3/4, and P(M = 3 | H = 3) = 0.
The Decoding Strategy: if the Host opens door 3, the Player should open door 2 (which wins with probability 3/4); and if the Host opens door 2, open door 3 (P(M = 3 | H = 2) = 3/5 vs. P(M = 1 | H = 2) = 2/5).
The Marginal Information: I(M; H) = H(M) - H(M | H) = log2(3) - [(5/9) H(2/5) + (4/9) H(1/4)] ≈ 1.585 - 0.900 ≈ 0.68 bits, using any formula version.
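The posterior computation is a direct application of Bayes' rule; a minimal Python sketch using exact fractions:

```python
from fractions import Fraction as F

prior = {1: F(1, 3), 2: F(1, 3), 3: F(1, 3)}   # P(M = m)
# P(H = 3 | M = m): door 3 is opened w.p. 1/3 if the money is behind door 1,
# always if behind door 2, never if behind door 3.
like3 = {1: F(1, 3), 2: F(1), 3: F(0)}

joint = {m: prior[m] * like3[m] for m in prior}  # P(M = m, H = 3)
ph3 = sum(joint.values())                        # P(H = 3)
post = {m: joint[m] / ph3 for m in prior}        # P(M = m | H = 3)

print(ph3, post[1], post[2])  # 4/9, 1/4, 3/4 -> switch to door 2
```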
Huffman Code construction for Problem 1:

Symbol  Probability
A       1/4
B       1/6
C       1/12
D       1/6
E       1/4
F       1/12

Sorted by probability:
Symbol  Probability
A       1/4
E       1/4
B       1/6
D       1/6
F       1/12
C       1/12

Merge F and C:
Symbol  Probability
A       1/4
E       1/4
B       1/6
D       1/6
[F,C]   1/6

Merge D and [F,C]:
Symbol     Probability
[D,[F,C]]  1/3
A          1/4
E          1/4
B          1/6

Merge E and B:
Symbol     Probability
[E,B]      5/12
[D,[F,C]]  1/3
A          1/4

Merge [D,[F,C]] and A:
Symbol         Probability
[[D,[F,C]],A]  7/12
[E,B]          5/12

Assign code bits, splitting back down from the root:
Symbol         Code
[[D,[F,C]],A]  0
[E,B]          1

Symbol     Code
[D,[F,C]]  00
A          01
E          10
B          11

Symbol  Code
D       000
[F,C]   001
A       01
E       10
B       11

Final Huffman Code:
Symbol  Code
D       000
F       0010
C       0011
A       01
E       10
B       11
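The table-by-table construction above can be reproduced with a standard heap-based Huffman algorithm. Tie-breaking among equal weights is arbitrary, so the individual code words (and even some lengths) may differ from the hand construction, but the expected code length comes out the same, 2.5 bits:

```python
import heapq

# Weights in twelfths: A=3/12, B=2/12, C=1/12, D=2/12, E=3/12, F=1/12.
weights = {"A": 3, "B": 2, "C": 1, "D": 2, "E": 3, "F": 1}

# Heap entries: (weight, tie-break counter, symbols in this subtree).
heap = [(w, i, [s]) for i, (s, w) in enumerate(weights.items())]
heapq.heapify(heap)
codes = {s: "" for s in weights}
counter = len(heap)
while len(heap) > 1:
    w1, _, left = heapq.heappop(heap)    # lightest subtree gets bit 0
    w2, _, right = heapq.heappop(heap)   # next lightest gets bit 1
    for s in left:
        codes[s] = "0" + codes[s]
    for s in right:
        codes[s] = "1" + codes[s]
    heapq.heappush(heap, (w1 + w2, counter, left + right))
    counter += 1

expected_len = sum(w * len(codes[s]) for s, w in weights.items()) / 12
print(expected_len)  # 2.5 bits, matching the hand-built code
```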