Channel with feedback

A communication channel used for the transmission of information from a source of information (cf. Information, source of) to a receiver, such that at each moment of time one knows at the input some information about the signals received at the output prior to this moment; this information can be used to select the next signal to be transmitted over the channel. A description of one of the possible schemes for the transmission of information over a discrete-time channel in the presence of complete feedback is given below. Let the information at the channel's input be given by a random variable $ \xi $ taking values in some measurable space $ ( \mathfrak X , S_{\mathfrak X} ) $. The information delivered to the addressee (the receiver) is a random variable $ \widetilde\xi $ taking values in the measurable space $ ( \widetilde{\mathfrak X} , S_{\widetilde{\mathfrak X}} ) $. Suppose that for the transmission of the information $ \xi $ a segment of length $ n $ of some discrete-time channel is used. The input and output signals $ ( \eta_{1}, \dots, \eta_{n} ) $ and $ ( \widetilde\eta_{1}, \dots, \widetilde\eta_{n} ) $ of this channel segment are given by random vectors whose components take values in the measurable spaces $ ( Y , S_{Y} ) $ and $ ( \widetilde{Y} , S_{\widetilde{Y}} ) $, respectively. The method of transmitting information over such a channel using complete feedback is specified by a set of $ n $ encoding functions

$$ f_{1}(x),\ f_{2}(x, \widetilde{y}_{1}), \dots, f_{n}(x, \widetilde{y}_{1}, \dots, \widetilde{y}_{n-1}), $$

$$ x \in \mathfrak X, \quad \widetilde{y}_{i} \in \widetilde{Y}, \quad i = 1, \dots, n-1, $$

with values in $ Y $, and a decoding function $ g( \widetilde{y}_{1}, \dots, \widetilde{y}_{n} ) $, $ \widetilde{y}_{i} \in \widetilde{Y} $, $ i = 1, \dots, n $, with values in $ \widetilde{\mathfrak X} $, by means of the relations

$$ \eta_{k} = f_{k}( \xi , \widetilde\eta_{1}, \dots, \widetilde\eta_{k-1} ), \qquad k = 1, \dots, n, $$

$$ \widetilde\xi = g( \widetilde\eta_{1}, \dots, \widetilde\eta_{n} ). $$

These relations show that the selection of the next signal $ \eta_{k} $ for transmission over a channel with feedback depends both on the transmitted information $ \xi $ and on the signals $ \widetilde\eta_{1}, \dots, \widetilde\eta_{k-1} $ received at the output of the channel prior to this moment. In effect this means that it is possible to deliver, instantly and noiselessly, to the input of the channel information about the signal received at its output. In this case it is often said that alongside the channel used for transmission of information in the forward direction (that is, from the source of the information to the addressee), there is a feedback channel (that is, a channel over which information is transmitted in the backward direction) with an infinite transmission rate (capacity, see Transmission rate of a channel). A more realistic assumption is that the feedback channel is not noiseless, that is, the information about the output signal of the channel received at its input may contain random errors. In this case one speaks of a channel with incomplete feedback.
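
As a concrete illustration of this scheme (the specific channel is chosen here only as an example and is not part of the general definition), suppose the forward channel is a memoryless binary erasure channel with input alphabet $ Y = \{ 0, 1 \} $, output alphabet $ \widetilde{Y} = \{ 0, 1, * \} $, where $ * $ denotes an erasure occurring with probability $ q $, and suppose the feedback is complete and noiseless; let a single bit $ \xi \in \mathfrak X = \{ 0, 1 \} $ be transmitted, with $ \widetilde{\mathfrak X} = \{ 0, 1 \} $. One simple strategy of the above form is to repeat the bit until the feedback shows that it has arrived unerased:

$$ f_{k}( x , \widetilde{y}_{1}, \dots, \widetilde{y}_{k-1} ) = \begin{cases} x & \text{if } \widetilde{y}_{1} = \dots = \widetilde{y}_{k-1} = *, \\ 0 & \text{otherwise}, \end{cases} \qquad k = 1, \dots, n, $$

$$ g( \widetilde{y}_{1}, \dots, \widetilde{y}_{n} ) = \begin{cases} \widetilde{y}_{j} & \text{if } j \text{ is the smallest index with } \widetilde{y}_{j} \neq *, \\ 0 & \text{if } \widetilde{y}_{1} = \dots = \widetilde{y}_{n} = *. \end{cases} $$

Here the encoder genuinely uses the feedback: it stops repeating $ \xi $ as soon as the received symbols show that the bit has been delivered, the probability of incorrect decoding does not exceed $ q^{n} $, and the channel uses remaining after the first unerased symbol could, in a less trivial scheme, be devoted to further information.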

As in the ordinary case, one introduces the notion of the capacity of a channel with feedback: it is the least upper bound of the transmission rates at which information can be sent, using the (en)coding and decoding methods described above, with arbitrarily small error probability. In general the capacity of a channel with feedback may exceed the ordinary channel capacity. It has been established, however, that for memoryless channels the two capacities coincide. For channels with complete feedback, (en)coding and decoding algorithms have been proposed that are simple to describe and highly effective.
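
For a memoryless channel the coincidence of the two capacities can be written out explicitly in a simple special case (a binary symmetric channel with crossover probability $ p $ is taken here purely as an illustration, and the symbol $ C_{\mathrm{fb}} $ for the capacity with complete feedback is introduced only for this formula):

$$ C_{\mathrm{fb}} = C = 1 - h(p), \qquad h(p) = - p \log_{2} p - ( 1 - p ) \log_{2} ( 1 - p ) , $$

so that complete feedback does not raise the attainable transmission rate for such a channel, although, as noted above, it may considerably simplify the (en)coding and decoding procedures.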
