Algorithmic Information Theory I - Computability

"The fundamental problem of communication is that of reproducing at one point either exactly or

approximately a message selected at another point. Frequently the messages have meaning; that

is they refer to or are correlated according to some system with certain physical or conceptual

entities. These semantic aspects of communication are irrelevant to the engineering problem.

The significant aspect is that the actual message is one selected from a set of possible messages.

The system must be designed to operate for each possible selection, not just the one which will

actually be chosen since this is unknown at the time of design."

Claude Shannon

A Mathematical Theory of Communication (1948)

"Our definition of the quantity of information has the advantage that it refers to individual objects

and not to objects treated as members of a set of objects with a probability distribution given

on it. The probabilistic definition can be convincingly applied to the information contained, for

example, in a stream of congratulatory telegrams. But it would not be clear how to apply it, for

example, to an estimate of the quantity of information contained in a novel or in the translation

of a novel into another language relative to the original. I think that the new definition is capable

of introducing in similar applications of the theory at least clarity of principle.".

Andrey Kolmogorov

Combinatorial Foundations of Information Theory and the Calculus of Probabilities (1983)

"The word `information' has been given different meanings by various writers in the general field of information theory. It is likely that at least a number of these will prove sufficiently useful in certain applications to deserve further study and permanent recognition. It is hardly to be expected that a single concept of information would satisfactorily account for the numerous possible applications of this general field."

Claude Shannon

(1993?)

_________________________________________________________________________________________

The Pitcher and the Pitching Coach:

While the game is going on and the pitcher is on the mound, the pitcher and the pitching coach communicate in two ways.

  1. Using Shannon Information Theory - hand signals, which work because both parties share a previously agreed set of possible messages (a common semantic content set).

  2. Using Kolmogorov Information Theory - trips to the mound by the pitching coach, which have to be brief because of the umpire but require the transfer of semantic content (how to pitch to the next batter) that had not been previously discussed.

__________________________________________________________________________________________

Primes:

Theorem:

There are infinitely many primes.

Proofs:

Euclid:

Suppose not, and let $p_1, p_2, \ldots, p_n$ be the finite list of primes. Let $m = p_1 p_2 \cdots p_n + 1$. Clearly none of the primes in the list divide $m$ (each leaves remainder $1$), but by elementary number theory some prime must divide $m$, and that prime is not on the list.
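As a numerical illustration (a small Python sketch, not part of Euclid's argument; the starting list of primes is arbitrary), the product of a finite list of primes plus one always has a smallest prime factor outside the list:

    # Euclid's construction: if the primes were exactly those in `primes`,
    # then m = (product of the list) + 1 would be divisible by none of them,
    # yet every integer greater than 1 has a prime factor.

    def smallest_prime_factor(m):
        """Smallest prime factor of m >= 2, by trial division."""
        d = 2
        while d * d <= m:
            if m % d == 0:
                return d
            d += 1
        return m  # m itself is prime

    primes = [2, 3, 5, 7, 11, 13]   # suppose this were the complete list of primes
    m = 1
    for p in primes:
        m *= p
    m += 1                          # m = 30031
    q = smallest_prime_factor(m)
    print(m, q, q in primes)        # 30031 59 False: 59 is a prime not on the list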

An Encoding Argument:

A simple case:

$2$ is not the only prime:

Suppose it were; then every number would be of the form $2^j$. There are $16$ numbers in $\{1, 2, \ldots, 16\}$ less than or equal to $2^4 = 16$, but only $5$ of them are of the form $2^j$, namely $1, 2, 4, 8, 16$.
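A quick check of this count (a Python sketch, using nothing beyond the statement above):

    # Of the 16 integers in {1, ..., 16}, count those of the form 2**j.
    limit = 2 ** 4                                    # 16
    powers = [m for m in range(1, limit + 1)
              if m & (m - 1) == 0]                    # 1, 2, 4, 8, 16
    print(limit, len(powers), powers)                 # 16 5 [1, 2, 4, 8, 16]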

In general:

Suppose $p_1 < p_2 < \cdots < p_n$ is the finite list of primes.

Then every positive integer $m$ can be written in the form $p_1^{j_1} p_2^{j_2} \cdots p_n^{j_n}$. Let $E_k$ be the set of positive integers expressible in this form with $j_i \leq k$ for all $i$.

$\hspace{1in}$ $|E_k| \leq (k+1)^n$,

since each of the $n$ exponents takes one of the $k+1$ values $0, 1, \ldots, k$.

Next, let $N_k = \{1, 2, \ldots, 2^k\}$.

$\hspace{1in}$ $|N_k| = 2^k$.

And

$\hspace{1in}$ $N_k \subseteq E_k$,

since, given that $n$ exists (there are only $n$ primes), any $m \in N_k$ has at least one representation $m = p_1^{j_1} \cdots p_n^{j_n}$, and each $p_i \geq 2$ forces $2^{j_i} \leq p_i^{j_i} \leq m \leq 2^k$, so $j_i \leq k$ for $1 \leq i \leq n$.

Completing the proof amounts to showing that for large enough $k$,

$\hspace{1in}$ $(k+1)^n < 2^k$.

Hence, for such $k$, $|E_k| < |N_k|$, contradicting $N_k \subseteq E_k$ and therefore the hypothesis that $p_1, \ldots, p_n$ are all the primes.
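To see the crossover concretely, here is a small Python sketch; the choice $n = 6$ (pretending $2, 3, 5, 7, 11, 13$ were all the primes) is only an illustration, not part of the proof:

    # Compare the number of available encodings (k+1)**n with the number 2**k
    # of integers that must be encoded, for a hypothetical n = 6 primes.
    n = 6
    for k in (10, 20, 30, 40, 60):
        encodings = (k + 1) ** n
        targets = 2 ** k
        print(k, encodings, targets, encodings < targets)
    # For n = 6 the inequality (k+1)**n < 2**k first holds around k = 30
    # and then holds for every larger k.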

To show that for large enough $k$, $(k+1)^n < 2^k$, it suffices to show

$\hspace{1in}$ $\lim_{k \rightarrow \infty} \dfrac{(k+1)^n}{2^k} = 0$,

a fairly straightforward limit argument. $\square$

Some details:

Since $(k+1)^n \leq (2k)^n = 2^n k^n$ for $k \geq 1$, it suffices to show

$\hspace{1in}$ $\lim_{k \rightarrow \infty} \dfrac{k^n}{2^k} = 0$.

Finally, since $\dfrac{k^n}{2^k} = \left( \dfrac{k}{2^{k/n}} \right)^n$, it suffices to show

$\hspace{1in}$ $\lim_{k \rightarrow \infty} \dfrac{k}{2^{k/n}} = 0$,

which holds because the denominator grows exponentially in $k$ while the numerator grows only linearly.
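A numerical illustration of this last limit (a Python sketch; $n = 6$ is again an arbitrary choice): the ratio $k / 2^{k/n}$ falls toward $0$ once $k$ is moderately large.

    # k / 2**(k/n) -> 0: the denominator grows exponentially in k,
    # the numerator only linearly.
    n = 6
    for k in (10, 50, 100, 200, 400):
        print(k, k / 2 ** (k / n))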