My reading notes from 'Recent Contributions to the Mathematical Theory of Communication' by Warren Weaver, the first part of *The Mathematical Theory of Communication*.

## Communication#

Communication is, broadly, all of the procedures by which one mind may affect another. It involves more than written or oral speech (e.g. the arts), and extends to the means by which one mechanism affects another.

## Three Levels of Communication Problems#

- Level A. How accurately are symbols transmitted? The Technical Problem.
- Level B. How precisely do transmitted symbols convey meaning? The Semantic Problem.
- Level C. How effectively is received meaning put into desired action? The Effectiveness Problem.

### Semantic Problem#

The greatest problem in communication is the illusion it has been achieved.

-- William H. Whyte

- Concerned with how accurately a receiver interprets the meaning of a message, compared to the meaning intended by the sender.
- If person X does not understand person Y, it is theoretically impossible to achieve understanding just by having Y keep speaking to X.
- Even if X claims to understand Y, this does not mean understanding was reached.
- This has wide ramifications if we think of communication in general.

### Effectiveness Problem#

- It is implied that the purpose of all communication is to influence the behaviour of the receiver.
- By any definition of behaviour, communication either affects it or has no discernible effect whatsoever.
- In the arts, it involves aesthetic considerations.
- In the case of speech, it involves everything from style to psychological aspects, judgements, etc.
- There is interrelation between the semantic and effectiveness problems, though not always clear how.

### The Influence Of Technical Problems#

What Shannon developed addresses mostly problems at the technical level.

But significantly, the other levels depend on the accuracy of the transmitted signals, and are therefore affected by technical-level (Level A) problems and inefficiencies.

## Communication Problems At Level A#

### Communication System#

The information source *selects* a message from a set of possible messages. The transmitter changes the message into a signal.

The signal is then sent over a communication channel from the transmitter to the receiver.

The receiver changes the signal back into a message and hands it to its destination.

Often, while in transmission, unwanted additions (noise) occur.

Questions the theory addresses at this level:

- How do we measure the amount of information?
- How do we measure the capacity of a communication channel?
- The change of message into signal is called a coding process. What does an efficient coding process look like?
- What are the characteristics of noise?
- How does this work for continuous signals?

### Information#

Within communication theory, information is more about what you *could* say than what you *do* say. Information is defined as the logarithm of the number of choices: the amount of freedom of choice we have in assembling messages.
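A quick worked sketch of this definition (the numbers are my own illustration, not from the text): with N equally likely messages, the information of a selection is log2(N) bits.

```python
import math

def information_bits(num_choices: int) -> float:
    """Information (in bits) of a selection among equally likely choices."""
    return math.log2(num_choices)

# Two equally likely messages carry one bit of information.
print(information_bits(2))   # 1.0
# Each doubling of the number of choices adds one bit.
print(information_bits(16))  # 4.0
```

Note that the measure depends only on how many choices were available, not on what any message means.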

Information should not be confused with meaning. Both meaningful and meaningless messages are equivalent as far as information theory is concerned.

From the point of view of the communication system, all the symbols chosen are governed by probabilities.

A stochastic process is a system that produces symbols according to probabilities (see 10b1b1 Stochastic Process). The special case where the probabilities depend on previous events is called a Markov process, or Markov chain (see 20b1b2 Markov Chain).

Within these Markov processes exist ergodic systems, which exhibit a comforting statistical regularity.
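A minimal sketch of a Markov source, where each symbol's probability depends on the previous symbol (the two-symbol alphabet and transition table are invented for illustration):

```python
import random

# Hypothetical transition probabilities: P(next symbol | current symbol).
TRANSITIONS = {
    "A": {"A": 0.1, "B": 0.9},
    "B": {"A": 0.6, "B": 0.4},
}

def generate(start: str, length: int, seed: int = 0) -> str:
    """Produce a symbol sequence from the Markov source."""
    rng = random.Random(seed)
    symbols = [start]
    for _ in range(length - 1):
        probs = TRANSITIONS[symbols[-1]]
        next_symbol = rng.choices(list(probs), weights=list(probs.values()))[0]
        symbols.append(next_symbol)
    return "".join(symbols)

print(generate("A", 12))
```

If the next-symbol probabilities were the same regardless of the previous symbol, this would reduce to a plain stochastic process; the dependence on `symbols[-1]` is what makes it Markov.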

In terms of communication theory, entropy is identical to information, which is identical to freedom of choice. Entropy is used to quantify information. In physical systems, it measures the degree of randomness, or shuffledness.

It has been argued that the tendency toward increased entropy is what gives time its arrow.

The actual entropy divided by the maximum entropy is the relative entropy of the source; one minus the relative entropy is the redundancy of the source. The larger the entropy, the more freedom of choice exists (largest when each of the N symbols has probability 1/N). The extreme is zero entropy: zero information, zero choice.
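A sketch of these quantities using the standard Shannon entropy formula H = -Σ p·log2(p) (the source probabilities below are invented for illustration):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]   # illustrative source probabilities
h = entropy(probs)
h_max = math.log2(len(probs))       # maximum entropy: all N symbols equally likely
relative = h / h_max                # relative entropy of the source
redundancy = 1 - relative           # redundancy of the source

print(h)           # 1.75 bits per symbol
print(h_max)       # 2.0
print(redundancy)  # 0.125
```

With all four symbols at probability 1/4 the entropy would reach its maximum of 2 bits and the redundancy would drop to zero.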

Channel capacity measures not symbols per second but information per second.

### Coding#

The function of the transmitter is to encode, and that of the receiver to decode, the message.

It is possible to transmit symbols over the channel at an average rate of almost C/H symbols per second (where C is the channel capacity in bits per second and H the source entropy in bits per symbol), but the rate can never be made to exceed C/H.

As the coding is made more and more nearly ideal, longer delays arise in the coding process.
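A back-of-the-envelope sketch of the C/H limit (the capacity and entropy values are invented): with channel capacity C in bits per second and source entropy H in bits per symbol, the best achievable rate approaches C/H symbols per second.

```python
C = 100.0  # channel capacity, bits per second (illustrative)
H = 1.75   # source entropy, bits per symbol (illustrative)

# Upper bound on symbols per second; reachable only in the limit of ideal coding.
max_rate = C / H
print(max_rate)  # ~57.14 symbols per second
```

A more redundant source (lower H per symbol) allows more symbols per second through the same channel, since each symbol carries less information.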

### Noise#

The greater the freedom of choice, the greater the uncertainty: greater uncertainty and greater information go hand in hand. Uncertainty that arises from freedom of choice is desirable; noise is undesirable uncertainty. To get the useful information, we have to subtract the noise.

Entropies calculated when there are two sets of symbols to consider, are called relative entropies.

Equivocation is the entropy of the message relative to the signal. Equivocation measures the average uncertainty in the message when the signal is known.

Capacity C of a noisy channel is defined to be equal to the maximum rate (in bits/second) at which useful information (i.e., total uncertainty minus noise uncertainty) can be transmitted over the channel.
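A sketch of these ideas for the standard binary symmetric channel (my example, not Weaver's): with bit-flip probability p, the noise uncertainty per symbol is the binary entropy H(p), and the useful information per symbol is the total uncertainty (1 bit) minus that noise, giving C = 1 - H(p).

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) in bits: the per-symbol uncertainty introduced by the noise."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Useful information per symbol: total uncertainty minus noise uncertainty."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))  # 1.0 (noiseless: every bit received is useful)
print(bsc_capacity(0.5))  # 0.0 (pure noise: nothing useful gets through)
print(bsc_capacity(0.11))
```

At p = 0.5 the received signal tells us nothing about the message, so the equivocation equals the full bit of uncertainty and the capacity vanishes.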

### The Interrelationship Of The Three Kinds Of Communication Problems#

Shannon's theory, though primarily addressing technical problems (Level A), is actually helpful and suggestive for the semantic and effectiveness problems.

The reason for this is its generality: it can be generalised to whatever one takes the symbols to be. The relationships it reveals apply indiscriminately to all.

The basic schematic of a communication system will carry over to the higher levels, with few additions. Thus, when one moves to levels B and C, it may prove essential to take account of the statistical characteristics of the destination.

It seems common to all communication levels that error and confusion arise and fidelity decreases when, no matter how good the coding, one tries to crowd too much over a channel (i.e., H > C).

## Entropy, Information And Meaning#

Information and meaning may prove to be like a pair of canonically conjugate variables (think of time and frequency in Fourier analysis), where you gain clarity in one by sacrificing the other.

Alongside beauty and melody, entropy is found when parts are considered in association. It is by viewing or hearing the parts in association that beauty and melody are understood. All three are features of arrangement.

It is a pregnant thought that one of these three (entropy) should be able to figure as a commonplace quantity of science. The main reason is that it speaks the language of mathematics.