One of the main constituents of systems for the transmission of information considered in the theory of information transmission. It is used in the mathematical description of an actual communication channel, that is, the complex of technical apparatus enabling one to transmit information from a transmitter to a receiver over physical communication lines and media in which signals are propagated from the transmitter to the receiver.
A communication channel with one transmitter and one receiver used for the transmission of information in one direction (from the transmitter to the receiver) is defined by a pair of measurable spaces $(\mathcal Y, S_{\mathcal Y})$, $(\tilde{\mathcal Y}, S_{\tilde{\mathcal Y}})$ (spaces of signals transmitted by the transmitter and received by the receiver, respectively), a transition function $Q(y, A)$, $y \in \mathcal Y$, $A \in S_{\tilde{\mathcal Y}}$, which is measurable with respect to the $\sigma$-algebra $S_{\mathcal Y}$ for fixed $A$ and is a probability measure on $S_{\tilde{\mathcal Y}}$ for each fixed $y$ (the function $Q(y, A)$ gives the conditional probability distribution of the signal received by the receiver under the condition that the transmitter has transmitted the signal $y$), and a subset $V$ of the space of all probability measures on $S_{\mathcal Y}$ ($V$ defines a restriction on the distribution of the signal transmitted by the transmitter). Two random variables $\eta$ and $\tilde\eta$ defined on some probability space are connected by the channel $(Q, V)$ if they take values in $\mathcal Y$ and $\tilde{\mathcal Y}$, respectively, if for any $A \in S_{\tilde{\mathcal Y}}$ the conditional probability $\mathsf P\{\tilde\eta \in A \mid \eta\} = Q(\eta, A)$ with probability 1, and if the distribution of $\eta$ belongs to $V$.
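For finite signal alphabets the definition above becomes very concrete: the transition function $Q(y, A)$ collapses to a stochastic matrix, and the restriction $V$ to a set of admissible input distributions. The following minimal sketch illustrates this (the matrix, the interval bounds and all names are illustrative, not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)

# Transition "function" for a two-letter alphabet:
# Q[y, y_t] = P(receive y_t | transmit y).
Q = np.array([[0.9, 0.1],    # row y = 0: distribution of the received signal
              [0.2, 0.8]])   # row y = 1

# Each Q(y, .) must be a probability measure.
assert np.allclose(Q.sum(axis=1), 1.0)

def transmit(y):
    """Sample the received signal given the transmitted signal y."""
    return rng.choice(len(Q[y]), p=Q[y])

# An illustrative restriction V on input distributions: only inputs with
# P(y = 0) between 0.4 and 0.6 are admitted (a purely hypothetical choice).
def in_V(p):
    return 0.4 <= p[0] <= 0.6
```

A pair of random variables $\eta$, $\tilde\eta$ is then "connected by the channel" when $\tilde\eta$ is produced from $\eta$ by `transmit` and the law of $\eta$ satisfies `in_V`.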
In applications one often encounters communication channels for which $\mathcal Y$ and $\tilde{\mathcal Y}$ are spaces of measurable functions $y(t)$ and $\tilde y(t)$, defined on an interval $(a, b]$ and taking values, respectively, in measurable spaces $(Y, S_Y)$ and $(\tilde Y, S_{\tilde Y})$, with the $\sigma$-algebras $S_{\mathcal Y}$ and $S_{\tilde{\mathcal Y}}$ generated by the measurable cylindrical sets. In this case one writes $\mathcal Y = Y^{(a,b]}$, $\tilde{\mathcal Y} = \tilde Y^{(a,b]}$, and talks about a continuous-time communication channel on the interval $(a, b]$, for which $\eta = \{\eta(t) : a < t \le b\}$ and $\tilde\eta = \{\tilde\eta(t) : a < t \le b\}$ are random processes with values in the spaces $(Y, S_Y)$ and $(\tilde Y, S_{\tilde Y})$, respectively; the quantities $\eta(t)$ and $\tilde\eta(t)$ are then regarded as the transmitted and received signals at the time $t$. Similarly, in the discrete-time case, the input and output signals of the communication channel are given by random vectors $\eta^n = (\eta_1, \ldots, \eta_n)$ and $\tilde\eta^n = (\tilde\eta_1, \ldots, \tilde\eta_n)$, the components of which take values in the measurable spaces $(Y, S_Y)$ and $(\tilde Y, S_{\tilde Y})$, respectively; here $\eta_k$ and $\tilde\eta_k$ are to be regarded as the transmitted and received signals at time $k\Delta$, where $\Delta$ is the time interval between two successive transmissions of signals over the communication channel.
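The discrete-time picture can be sketched with a concrete (assumed, not from the article) example: a binary symmetric channel used $n$ times, where each component $\tilde\eta_k$ of the output vector is the corresponding input $\eta_k$ flipped independently with probability $p$:

```python
import numpy as np

rng = np.random.default_rng(1)
p = 0.1  # illustrative crossover probability

def transmit_vector(eta):
    """Map the transmitted vector (eta_1, ..., eta_n) to the received
    vector, flipping each bit independently with probability p."""
    eta = np.asarray(eta)
    flips = rng.random(eta.shape) < p
    return eta ^ flips  # XOR: a True in flips inverts the corresponding bit

received = transmit_vector([0, 1, 1, 0, 1])
```

Here the $k$-th call position plays the role of the time $k\Delta$; the channel acts on each coordinate independently because this example is memoryless.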
Often continuous- or discrete-time communication channels are also considered on a semi-infinite interval $(a, \infty)$ or an interval $(-\infty, \infty)$ which is infinite on both sides. For example, by a continuous-time communication channel on the semi-infinite interval $(a, \infty)$ one means an ordinary family of communication channels $(Q_{a,b}, V_{a,b})$ as described above, defined on all finite intervals $(a, b]$, $b > a$, and satisfying certain compatibility conditions. In this case each communication channel $(Q_{a,b}, V_{a,b})$ is called the segment of the communication channel $(Q, V)$ on the interval $(a, b]$. One of the possible variants of the compatibility conditions can be stated as follows. Let $A$ be an arbitrary set in $S_{\tilde Y^{(a,b]}}$, let $y'(t)$, $t \in (a, b']$, be an arbitrary function in $Y^{(a,b']}$ for some $b' > b$, and let $y(t)$, $t \in (a, b]$, be its restriction to $(a, b]$, that is, $y(t) = y'(t)$ for $a < t \le b$. Then the compatibility condition imposed on the transition functions is that for any $b, b'$ with $b' > b > a$, any $A \in S_{\tilde Y^{(a,b]}}$ and any functions $y(\cdot)$, $y'(\cdot)$ as above,

$$Q_{a,b'}(y'(\cdot), A') = Q_{a,b}(y(\cdot), A),$$

where $A'$ is the cylindrical set in $S_{\tilde Y^{(a,b']}}$ generated by the set $A$ in $S_{\tilde Y^{(a,b]}}$. The sets $V_{a,b}$, $b > a$, of admissible probability measures are also required to satisfy certain compatibility conditions.
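For a discrete-time memoryless channel the compatibility of segments can be verified by direct computation: the probability that the output over the first $n$ steps falls in a set $A$ equals the probability that a longer output falls in the cylinder generated by $A$, whatever input is transmitted after step $n$. A small numerical sketch (binary alphabet, illustrative transition matrix):

```python
import numpy as np

# Assumed memoryless binary channel: the segment transition function is
# Q_n(y^n, A) = sum over words t in A of prod_k Q[y_k, t_k].
Q = np.array([[0.9, 0.1],
              [0.2, 0.8]])

def Q_segment(y, A):
    """Probability that the output word lies in the set A of words,
    given the input word y (memoryless product form)."""
    return sum(np.prod([Q[yk, tk] for yk, tk in zip(y, t)]) for t in A)

# A set of output words of length 2, and the cylinder it generates
# among words of length 3.
A = [(0, 0), (1, 0)]
A_cyl = [t + (s,) for t in A for s in (0, 1)]

# Compatibility: extending the input beyond step 2 (by either symbol)
# does not change the probability of the cylinder generated by A.
p_short = Q_segment((0, 1), A)
assert np.isclose(p_short, Q_segment((0, 1, 0), A_cyl))
assert np.isclose(p_short, Q_segment((0, 1, 1), A_cyl))
```

The equality holds because each row of $Q$ sums to 1, so the extra coordinate integrates out — exactly the compatibility condition above, specialized to this product form.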
The notion of a continuous-time communication channel on the infinite interval $(-\infty, \infty)$ is defined similarly. The compatibility conditions necessary for the definition of such a channel are usually formulated specifically for each type of channel. In some concrete situations the notion of a communication channel on $(a, \infty)$ or on $(-\infty, \infty)$ is introduced directly (that is, without considering finite segments of these channels) by means of an explicit description of the transformations performed on the input signal of the communication channel to produce the output signal (see, for example, Gaussian channel; Additive noise). Everything said concerning continuous-time communication channels on the intervals $(a, \infty)$ or $(-\infty, \infty)$ holds also for the analogous discrete-time communication channels.
Communication channels are divided into various classes depending on the types of the conditional distribution $Q$ and of the restriction $V$ (see, for example, Memoryless channel; Channel with a finite memory; Channel with a finite number of states; Gaussian channel; Symmetric channel). A fundamental characteristic of a communication channel is the transmission rate of the channel (its capacity), which characterizes the maximum possible transmission rate of information (cf. Information, transmission rate of) over this channel.
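As a standard illustration of capacity (the formula is classical, not taken from this article): for a binary symmetric channel with crossover probability $p$, the capacity is $C = 1 - H(p)$ bits per channel use, where $H$ is the binary entropy function.

```python
from math import log2

def H(p):
    """Binary entropy function, in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - H(p)
```

A noiseless channel ($p = 0$) carries one bit per use; at $p = 1/2$ the output is independent of the input and the capacity drops to zero.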
Various generalizations of communication channels, as defined above, are possible; these correspond to more general and complex systems of information transmission (see, for example, Channel with feedback; Channel with multiple directions).
[1] C. Shannon, "Papers on information theory and cybernetics", Moscow (1963) pp. 243–332 (In Russian; translated from English)
[2] R.L. Dobrushin, "A general formulation of the fundamental theorem of Shannon in information theory" Uspekhi Mat. Nauk, 14 : 6 (1959) pp. 3–104 (In Russian)
[3] J. Wolfowitz, "Coding theorems of information theory", Springer (1964)
[4] R.G. Gallager, "Information theory and reliable communication", Wiley (1968)
[5] A.A. Feinstein, "Foundations of information theory", McGraw-Hill (1958)
[6] R.M. Fano, "Transmission of information. Statistical theory of communications", M.I.T. (1963)
[7] I. Csiszár, J. Körner, "Information theory. Coding theorems for discrete memoryless systems", Akad. Kiadó (1981)
Communication channel. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Communication_channel&oldid=19102