# Channel with feedback

A communication channel used for the transmission of information from a source of information (cf. Information, source of) to a receiver such that at any moment of time the input has some information about the signals received at the output prior to this moment; this information can be used to select the next signal to be transmitted over the channel. A description of one of the possible schemes for the transmission of information over a discrete-time channel in the presence of complete feedback is given below. Let the information at the channel's input be given by a random variable $\xi$ taking values in some measurable space $(\mathfrak X, S_{\mathfrak X})$. The information delivered to the addressee (the receiver) is a random variable $\widetilde\xi$ taking values in the measurable space $(\widetilde{\mathfrak X}, S_{\widetilde{\mathfrak X}})$. Suppose that for the transmission of the information $\xi$ a segment of length $n$ of some discrete-time channel is used. The input and output signals $(\eta_1, \dots, \eta_n)$ and $(\widetilde\eta_1, \dots, \widetilde\eta_n)$ of this channel segment are given by random vectors whose components take values in measurable spaces $(Y, S_Y)$ and $(\widetilde Y, S_{\widetilde Y})$, respectively. The method of transmitting information over such a channel using complete feedback is given by a set of $n$ encoding functions

$$f_1(x),\ f_2(x, \widetilde y_1), \dots, f_n(x, \widetilde y_1, \dots, \widetilde y_{n-1}),$$

$$x \in \mathfrak X, \quad \widetilde y_i \in \widetilde Y, \quad i = 1, \dots, n-1,$$

with values in $Y$, and a decoding function $g(\widetilde y_1, \dots, \widetilde y_n)$, $\widetilde y_i \in \widetilde Y$, $i = 1, \dots, n$, with values in $\widetilde{\mathfrak X}$, by means of the relations

$$\eta_k = f_k(\xi, \widetilde\eta_1, \dots, \widetilde\eta_{k-1}), \quad k = 1, \dots, n,$$

$$\widetilde\xi = g(\widetilde\eta_1, \dots, \widetilde\eta_n).$$

These relations show that the selection of the next signal $\eta_k$ for transmission over a channel with feedback depends both on the information $\xi$ being transmitted and on the signals $\widetilde\eta_1, \dots, \widetilde\eta_{k-1}$ received at the output of the channel prior to this moment. In practice this means that it is possible to deliver, instantly and noiselessly, information to the input of the channel about the signal received at its output. In this case it is often said that alongside the channel used for transmission of information in the forward direction (that is, from the source of the information to the addressee), there is a feedback channel (that is, a channel over which information is transmitted in the backward direction) with an infinite transmission rate (capacity, see Transmission rate of a channel). A more realistic assumption is that the feedback channel is not noiseless, that is, the information about the output signal of the channel received at its input may contain random errors. In this case one speaks of a channel with incomplete feedback.
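As a concrete illustration of the scheme above, the following sketch (a hypothetical Python model, not part of the original article) simulates complete feedback over a binary erasure channel. Because the encoder sees every past output $\widetilde\eta_1, \dots, \widetilde\eta_{k-1}$ through the noiseless feedback link, each encoding function $f_k$ can simply resend the first message bit that has not yet arrived unerased, and the decoding function $g$ reads off the unerased outputs in order. All function and variable names here are illustrative choices.

```python
import random

ERASED = None  # erasure symbol of the forward channel


def bec(y, p, rng):
    """Binary erasure channel: pass the input with prob. 1-p, erase with prob. p."""
    return ERASED if rng.random() < p else y


def encode_step(msg_bits, past_outputs):
    """f_k: via noiseless feedback the encoder knows all past outputs, so it
    resends the first message bit not yet delivered (padding with 0 when done)."""
    delivered = sum(1 for y in past_outputs if y is not ERASED)
    return msg_bits[delivered] if delivered < len(msg_bits) else 0


def decode(outputs, m):
    """g: the first m unerased outputs are the message bits; None if too few arrived."""
    received = [y for y in outputs if y is not ERASED]
    return received[:m] if len(received) >= m else None


def transmit(msg_bits, n, p, rng):
    """Use the channel n times with the feedback strategy above."""
    outputs = []
    for _ in range(n):
        y = encode_step(msg_bits, outputs)
        outputs.append(bec(y, p, rng))
    return decode(outputs, len(msg_bits))


if __name__ == "__main__":
    rng = random.Random(0)
    print(transmit([1, 0, 1, 1], n=12, p=0.3, rng=rng))
```

On a noiseless channel ($p = 0$) the message is recovered exactly; as $n$ grows relative to the message length, the failure probability for fixed $p < 1$ becomes arbitrarily small, in line with the qualitative picture of the article.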

As in the ordinary case, one introduces the notion of the capacity of a channel with feedback: it is the least upper bound of the transmission rates at which information can be sent, using the encoding and decoding methods described above, with arbitrarily small error probability. In general, the capacity of a channel with feedback is no smaller than the usual channel capacity, and for channels with memory it can be strictly larger. It has been established, however, that for memoryless channels these two capacities coincide. For channels with complete feedback, encoding and decoding algorithms have been proposed that are simple to describe and highly effective.
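For example, the memoryless binary symmetric channel with crossover probability $p$ has capacity $1 - h(p)$, where $h$ is the binary entropy function, and by the result just cited complete feedback does not increase this value. A minimal numerical sketch (illustrative, not from the article):

```python
from math import log2


def h(p):
    """Binary entropy function, in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)


def bsc_capacity(p):
    """Capacity of the memoryless binary symmetric channel with crossover
    probability p; by the classical result it is unchanged by feedback."""
    return 1 - h(p)


if __name__ == "__main__":
    for p in (0.0, 0.11, 0.5):
        print(p, bsc_capacity(p))
```

The capacity falls from $1$ bit per channel use at $p = 0$ to $0$ at $p = 1/2$ (at $p \approx 0.11$ it is close to $1/2$); feedback can improve error exponents and simplify coding for such channels, but not this supremum of achievable rates.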

How to Cite This Entry:
Channel with feedback. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Channel_with_feedback&oldid=46306
This article was adapted from an original article by R.L. Dobrushin, V.V. Prelov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article