The various functions of communication theory will be discussed briefly: measuring the amount of information transmitted, the amount of information generated, the different types of sources, the interdependence of these sources, and concepts such as noise, equivocation and the causal flow of information. The chapter provides a basic understanding of these concepts and a clear view of the theory's strengths and weaknesses in measuring the communication taking place.
Communication theory is a purely quantitative theory: it measures how much information is associated with a given state of affairs, and in turn how much of that information is available at various points. Despite what the name might suggest, the theory deals with amounts of information, not with the content of particular messages.
In studying the amount of information generated, the theory works by a kind of elimination technique: it removes all the possibilities that do not occur. Eliminating the choices that will not occur narrows down the range of alternatives and thereby yields information about the outcome. A binary decision, which eliminates half of the remaining possibilities at each step until the required outcome is reached, is the basic elimination technique used to generate information about selections (as in a coin-toss tournament, where each toss selects a winner and eliminates a loser). The number of binary decisions that must take place before reaching the required outcome is the number of bits. The general formula used to compute the amount of information generated is:
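The halving process described above can be sketched in a few lines of Python. The helper name `binary_decisions` is hypothetical, not from the text; it simply counts how many halvings are needed to isolate one outcome among n equally likely possibilities.

```python
def binary_decisions(n):
    """Count the binary decisions needed to narrow n equally
    likely possibilities down to a single outcome (hypothetical
    helper for illustration)."""
    count = 0
    while n > 1:
        n = n / 2  # each binary decision eliminates half the possibilities
        count += 1
    return count

# An 8-player coin-toss tournament needs 3 rounds of halving:
print(binary_decisions(8))  # 3
```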
I(s) = log₂ n
Here I(s) denotes the amount of information generated at the source, and n is the number of equally likely possibilities that may or may not occur. The formula raises a point that must be kept distinct: the amount of information (in bits) generated by some state of affairs is different from the number of binary digits used to represent that state of affairs. They are separate things. Using binary digits instead of the distinguishable alternatives themselves can result in less efficient codes and a more complex selection-and-elimination process. I(s) can also be read as the average amount of information generated by the source, which is called the entropy of the source s.
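A short sketch, assuming Python's standard `math` module, makes the bits-versus-binary-digits distinction concrete: the information generated is log₂ n bits, which need not be a whole number, while any fixed binary code must round up to whole binary digits.

```python
import math

def information_generated(n):
    """I(s) = log2(n): bits generated when one of n equally
    likely possibilities is realized."""
    return math.log2(n)

print(information_generated(8))   # 3.0 bits
# Ten equally likely outcomes generate about 3.32 bits, yet any
# fixed-length binary code for them needs 4 binary digits:
print(information_generated(10))
print(math.ceil(information_generated(10)))  # 4
```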
Similarly, there may be a second source r. This source likewise eliminates alternatives and can be interdependent with the source s; in that case I(s) and I(r) are interdependent as well. Sometimes, however, the possibilities are not equally probable. When the possibilities s1, s2, …, sn occur with probabilities p(si), the information generated by a particular event si is log₂(1/p(si)). This quantity is often called the surprisal of the particular event.
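The surprisal of an event can be computed directly; a minimal sketch, assuming the standard formula log₂(1/p) in bits, shows that a rarer event carries more information than a common one.

```python
import math

def surprisal(p):
    """Surprisal of an event with probability p, in bits: log2(1/p)."""
    return math.log2(1 / p)

# An even coin toss generates 1 bit; a one-in-eight event, 3 bits:
print(surprisal(0.5))    # 1.0
print(surprisal(0.125))  # 3.0
```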
These formulas and probabilities show that communication theory deals with sources rather than with particular messages or the amount of information associated with an individual event's occurrence. To calculate the average amount of information I(s) associated with a source capable of producing different individual results, we take the surprisal values of all the specific individual possibilities of that source, each weighted by its probability.
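The averaging step above can be sketched as a probability-weighted sum of surprisals; this is a minimal illustration, assuming a list of probabilities that sum to 1, and the hypothetical function name `entropy` is mine, not the text's.

```python
import math

def entropy(probs):
    """I(s): average information of a source, i.e. the surprisal of
    each possibility weighted by its probability."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# With equally likely possibilities this reduces to log2(n):
print(entropy([0.25] * 4))         # 2.0
# Unequal probabilities lower the average below log2(n):
print(entropy([0.5, 0.25, 0.25]))  # 1.5
```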