Test 1: Information Theory, M/CS 4890


If necessary, use the following approximations:

MATH

MATH

MATH

___________________________________________________________________________

1. Shannon's Noiseless Coding Theorem (20 points):

Symbol           Probability
A                1/4
B                1/6
C                1/12
D                1/6
E                1/4
F                1/12

Answer:

The entropy of the source is

$H = 2\cdot\frac{1}{4}\log_{2}4 + 2\cdot\frac{1}{6}\log_{2}6 + 2\cdot\frac{1}{12}\log_{2}12$

$H \approx 1 + 0.862 + 0.597 \approx 2.46$ bits


Huffman construction: at each step, merge the two least probable symbols.

Symbol           Probability
A                1/4
B                1/6
C                1/12
D                1/6
E                1/4
F                1/12

Sort by decreasing probability:

Symbol           Probability
A                1/4
E                1/4
B                1/6
D                1/6
F                1/12
C                1/12

Merge F and C:

Symbol           Probability
A                1/4
E                1/4
B                1/6
D                1/6
[F,C]            1/6

Merge D and [F,C]:

Symbol           Probability
[D,[F,C]]        1/3
A                1/4
E                1/4
B                1/6

Merge E and B:

Symbol           Probability
[E,B]            5/12
[D,[F,C]]        1/3
A                1/4

Merge [D,[F,C]] and A:

Symbol           Probability
[[D,[F,C]],A]    7/12
[E,B]            5/12

Assign codes from the top of the tree, expanding the groups step by step:

Symbol           Code
[[D,[F,C]],A]    0
[E,B]            1

Symbol           Code
[D,[F,C]]        00
A                01
E                10
B                11

Symbol           Code
D                000
[F,C]            001
A                01
E                10
B                11

Symbol           Code
D                000
F                0010
C                0011
A                01
E                10
B                11

The average code length is $\bar{L} = 2\cdot\frac{1}{4}\cdot 2 + \frac{1}{6}\cdot 2 + \frac{1}{6}\cdot 3 + 2\cdot\frac{1}{12}\cdot 4 = 2.5$ bits $\geq H \approx 2.46$ bits.
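For comparison, a minimal Huffman-coding sketch in Python (illustrative only; the variable names are arbitrary). Tie-breaking in the heap may produce a code different from the hand construction above, but any Huffman code for this source has the same average length, 2.5 bits.

    import heapq, itertools
    from fractions import Fraction

    # Source distribution from Problem 1.
    probs = {"A": Fraction(1, 4), "B": Fraction(1, 6), "C": Fraction(1, 12),
             "D": Fraction(1, 6), "E": Fraction(1, 4), "F": Fraction(1, 12)}

    tie = itertools.count()                      # unique tie-breaker for the heap tuples
    heap = [(p, next(tie), [s]) for s, p in probs.items()]
    heapq.heapify(heap)
    codes = {s: "" for s in probs}

    # Standard Huffman construction: repeatedly merge the two least probable groups.
    while len(heap) > 1:
        p0, _, g0 = heapq.heappop(heap)          # one subtree gets bit 0 ...
        p1, _, g1 = heapq.heappop(heap)          # ... the other gets bit 1
        for s in g0:
            codes[s] = "0" + codes[s]
        for s in g1:
            codes[s] = "1" + codes[s]
        heapq.heappush(heap, (p0 + p1, next(tie), g0 + g1))

    avg_len = sum(probs[s] * len(codes[s]) for s in probs)
    print(codes)                                 # a prefix-free code for the six symbols
    print(float(avg_len))                        # 2.5 bits, consistent with H of about 2.46 bits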

2. Shannon's Noisy Coding Theorem for a Binary Symmetric Channel (15 points):

Let MATH and $\QTR{Large}{R < C}$. Then, for sufficiently large integers $\QTR{Large}{n}$ and MATH, we can find:

such that:

While the Noisy Coding Theorem says little about $\QTR{Large}{n}$, the size of the code necessary to achieve a low error rate for a given $\QTR{Large}{p}$, it does constrain various ratios, in particular the ratio of code bits to error-checking/correcting bits.

What can you say about that ratio for MATH?

Hint: If $\QTR{Large}{e}$ is the number of error-checking/correcting bits, then $\QTR{Large}{k+e=n}$.

Answer:

For MATH,

MATH

MATH implies MATH implies MATH implies MATH

Thus, in this case, $\QTR{Large}{k/e}$ MATH.
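A short numerical sketch of this constraint in Python (the crossover probability p = 0.1 below is illustrative only). With the usual rate convention $\QTR{Large}{R = k/n}$, the conditions $\QTR{Large}{R < C}$ and $\QTR{Large}{k+e=n}$ give $\QTR{Large}{k/e < C/(1-C)}$.

    from math import log2

    def bsc_capacity(p):
        """Capacity of a binary symmetric channel: C = 1 - H(p)."""
        h = -p * log2(p) - (1 - p) * log2(1 - p)   # binary entropy H(p)
        return 1 - h

    p = 0.1                        # illustrative crossover probability (assumption)
    C = bsc_capacity(p)
    # k/n < C together with k + e = n gives k/e < C/(1 - C).
    print(C, C / (1 - C))          # capacity, and the resulting bound on code bits per check bit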



3. (20 Points): Returning to the four-ball weighing problem (3 balls the same and 1 heavier or lighter) and adopting the following notation:

Next, assume your strategy's first experiment is (1,?) on the left, (2,?) on the right, and (3,?) and (4,?) off the scale.

Solve the problem in two ways: first, the lucky case, assuming the result of the first experiment is D; and second, the unlucky case, assuming the result of the first experiment is B. In each case, compute the Shannon Information obtained at each step.

Answer:

Since there are $8$ equiprobable outcomes, the Shannon Clue is that we will need 3 bits of Information to solve the problem.

The first experiment is (1,?) on the left, (2,?) on the right, and (3,?) and (4,?) off the scale.

Result D: (1,?) is heavier or (2,?) is lighter; (3,G) and (4,G). Shannon Information obtained: 2 bits, since probability = 1/4.

Second experiment: (1,?) on the left, (3,G) on the right.

Result D: (1,H), (2,G). Shannon Information obtained: 1 bit, since probability = 1/2.

The odd ball is (1,H). The total Shannon Information obtained by this result sequence is 3 bits.

Result B: (1,G), (2,L). Shannon Information obtained: 1 bit, since probability = 1/2.

The odd ball is (2,L). The total Shannon Information obtained by this result sequence is 3 bits.

Result B: (1,G), (2,G); (3,?) and (4,?) remain unknown. Shannon Information obtained: 1 bit, since probability = 1/2.

Second experiment: (1,G) on the left, (3,?) on the right.

Result B: (3,G). Shannon Information obtained: 1 bit, since probability = 1/2.

Third experiment: (1,G) on the left, (4,?) on the right.

Result D: (4,L). Shannon Information obtained: 1 bit, since probability = 1/2.

Result U: (4,H). Shannon Information obtained: 1 bit, since probability = 1/2.

The odd ball is (4,L) or (4,H). The total Shannon Information obtained by this result sequence is 3 bits.
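These counts can be checked by brute force over the 8 equiprobable states; a Python sketch (the state encoding and helper names are illustrative):

    from math import log2
    from fractions import Fraction

    # The 8 equiprobable states: (odd ball, "H" for heavier or "L" for lighter).
    states = [(b, w) for b in (1, 2, 3, 4) for w in ("H", "L")]

    def weigh(left, right, state):
        """Return 'D', 'B', or 'U' as the left pan goes down, balances, or goes up."""
        ball, kind = state
        def pan_weight(pan):
            w = len(pan)                                   # every normal ball weighs 1
            if ball in pan:
                w += Fraction(1, 2) if kind == "H" else -Fraction(1, 2)
            return w
        l, r = pan_weight(left), pan_weight(right)
        return "D" if l > r else ("U" if l < r else "B")

    def observe(possible, left, right, result):
        """Shannon Information (bits) of seeing `result`, plus the states still possible."""
        consistent = [s for s in possible if weigh(left, right, s) == result]
        p = Fraction(len(consistent), len(possible))
        return -log2(p), consistent

    # Lucky path: 1 v 2 gives D, then 1 v 3 gives D  ->  2 + 1 = 3 bits, odd ball (1,H).
    b1, s1 = observe(states, [1], [2], "D")
    b2, s2 = observe(s1, [1], [3], "D")
    print(b1, b2, s2)

    # Unlucky path: 1 v 2 gives B, 1 v 3 gives B, 1 v 4 gives U  ->  1 + 1 + 1 = 3 bits, (4,H).
    b1, s1 = observe(states, [1], [2], "B")
    b2, s2 = observe(s1, [1], [3], "B")
    b3, s3 = observe(s2, [1], [4], "U")
    print(b1, b2, b3, s3)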

4. (20 Points):

Answer:

Since MATH has 4 vectors, we would need 4 disjoint Hamming balls in MATH, each with 5 vectors, for a total of 20 vectors. However, MATH only has 16 vectors.

Letting MATH, we have MATH.

Again we need 20 vectors. However, MATH has 32 vectors, too many!
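The count above is the standard Hamming-ball (sphere-packing) computation; a minimal Python sketch for the length-4 case (the function name is illustrative):

    from math import comb

    def hamming_ball_size(n, r=1):
        """Number of vectors of {0,1}^n within Hamming distance r of a fixed center."""
        return sum(comb(n, i) for i in range(r + 1))

    n = 4
    needed = 4 * hamming_ball_size(n)      # 4 codewords require 4 disjoint radius-1 balls
    print(needed, 2 ** n)                  # 20 vectors needed, but {0,1}^4 has only 16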




5. (25 Points): Solve the following variant of the Three Doors Problem:

  1. The Host hides a money prize behind one of three doors, chosen at random, each with probability 1/3.

  2. The Player chooses door 1, knowing the Host's preference.

  3. The Host opens either door 2 or door 3 using the following rules:

The Question: Suppose the Host opens door 3; should the Player open door 1 or open door 2?

Your answer should consist of filling in the following outline.

  1. The Random Variable: The door that the money is hidden behind: $\QTR{Large}{M}$, which takes on the values MATH

    with MATH

  2. The Conditional Distribution:

  3. The Random Variable: The door that the Host opens, $\QTR{Large}{O}$, which takes on the values MATH

    with MATH

  4. The Posterior Distribution:

  5. The Decoding Strategy: MATH and MATH

  6. The Marginal Information MATH, using any formula version.

Answer:

  1. The Random Variable: The door that the money is hidden behind: $\QTR{Large}{M}$, which takes on the values MATH

    with MATH

  2. The Conditional Distribution:

    MATH

  3. The Random Variable: The door that the Host opens, $\QTR{Large}{O}$, which takes on the values MATH

    with MATH since MATH

  4. The Posterior Distribution:

    MATH

    MATH

  5. The Decoding Strategy: MATH MATH and MATH

  6. The Marginal Information MATH, using any formula version.

    MATH MATH MATH

    MATH

    MATH
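Because the Host's rules in step 3 are not restated here, the following Bayes-rule sketch in Python assumes one standard variant: the Host never opens the Player's door or the money door, and when the money is behind door 1 he opens door 3 with a preference probability q. Both the rule and the value q = 1/2 are assumptions for illustration only.

    from fractions import Fraction
    from math import log2

    q = Fraction(1, 2)                     # assumed Host preference for door 3 (illustrative)
    prior = {1: Fraction(1, 3), 2: Fraction(1, 3), 3: Fraction(1, 3)}   # P(M = m)

    def p_open3_given(m):
        """Assumed conditional probability P(O = 3 | M = m)."""
        if m == 1:
            return q                       # doors 2 and 3 both available; opens 3 with prob q
        if m == 2:
            return Fraction(1)             # door 3 is the only door the Host may open
        return Fraction(0)                 # the Host never reveals the money

    p_open3 = sum(prior[m] * p_open3_given(m) for m in prior)            # P(O = 3)
    posterior = {m: prior[m] * p_open3_given(m) / p_open3 for m in prior}
    marginal_info = -log2(p_open3)         # information in the observation O = 3

    # Under these assumed rules the posterior on door 1 is q/(1+q) and on door 2 is 1/(1+q),
    # so opening door 2 is never worse; with q = 1/2 the posterior is 1/3, 2/3, 0.
    print(posterior, float(marginal_info))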