# Channel with multiple directions

multi-terminal channel, multi-user channel

A communication channel for which it is possible to transmit information in several directions simultaneously. The description which follows is for a memoryless discrete-time multi-user channel with finite input and output alphabets. Let $Y_1, \dots, Y_s$ be $s$ finite sets, where (the alphabet) $Y_i$ is the set of possible signals which the $i$-th transmitter can send, and let $\widetilde{Y}_1, \dots, \widetilde{Y}_r$ be $r$ finite sets, where (the alphabet) $\widetilde{Y}_j$ is the set of possible signals which the $j$-th receiver can receive. Suppose further that a stochastic matrix

$$q(y_1, \dots, y_s;\ \widetilde{y}_1, \dots, \widetilde{y}_r), \qquad y_i \in Y_i,\ \widetilde{y}_j \in \widetilde{Y}_j,\ i = 1, \dots, s;\ j = 1, \dots, r,$$

is given. Two sets of random vectors $(\eta^{(1)}, \dots, \eta^{(n)})$ and $(\widetilde\eta^{(1)}, \dots, \widetilde\eta^{(n)})$, where $\eta^{(k)} = (\eta_1^{(k)}, \dots, \eta_s^{(k)})$ and $\widetilde\eta^{(k)} = (\widetilde\eta_1^{(k)}, \dots, \widetilde\eta_r^{(k)})$, defined on some probability space $(\Omega, \mathfrak{A}, \mathsf{P})$, are said to be connected by a segment of length $n$ of a homogeneous multi-user channel with $s$ inputs and $r$ outputs if $\eta_i^{(k)}$ and $\widetilde\eta_j^{(k)}$ ($i = 1, \dots, s$; $j = 1, \dots, r$; $k = 1, \dots, n$) take values in the sets $Y_i$ and $\widetilde{Y}_j$, respectively, and the formula

$$\mathsf{P}\{(\widetilde\eta^{(1)}, \dots, \widetilde\eta^{(n)}) = (\widetilde{y}^{(1)}, \dots, \widetilde{y}^{(n)}) \mid (\eta^{(1)}, \dots, \eta^{(n)}) = (y^{(1)}, \dots, y^{(n)})\} = \prod_{k=1}^{n} q(y_1^{(k)}, \dots, y_s^{(k)};\ \widetilde{y}_1^{(k)}, \dots, \widetilde{y}_r^{(k)})$$

holds for any $y^{(k)} = (y_1^{(k)}, \dots, y_s^{(k)})$ and $\widetilde{y}^{(k)} = (\widetilde{y}_1^{(k)}, \dots, \widetilde{y}_r^{(k)})$, $k = 1, \dots, n$, with $y_i^{(k)} \in Y_i$, $\widetilde{y}_j^{(k)} \in \widetilde{Y}_j$, $i = 1, \dots, s$; $j = 1, \dots, r$.
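As a concrete illustration of the product formula, one can write down the stochastic matrix $q$ for a small channel and sample a length-$n$ segment letter by letter. The sketch below uses a two-input, one-output binary adder channel (each transmitter sends a bit, the receiver observes their noiseless sum); this particular channel is a standard textbook example chosen here for illustration, not taken from the article.

```python
import itertools
import random

# Two-input (s = 2), one-output (r = 1) binary adder channel:
# inputs y1, y2 in {0, 1}, output in {0, 1, 2}.
# q[(y1, y2)][yt] plays the role of the stochastic matrix q(y1, y2; yt).
q = {
    (y1, y2): {yt: 1.0 if yt == y1 + y2 else 0.0 for yt in (0, 1, 2)}
    for y1, y2 in itertools.product((0, 1), repeat=2)
}

# Stochasticity: each row sums to 1 over the output alphabet.
assert all(abs(sum(row.values()) - 1.0) < 1e-12 for row in q.values())

def transmit_segment(inputs, rng=random.Random(0)):
    """Pass a length-n segment of input pairs through the memoryless channel.

    By the product formula, each time step k is drawn independently
    according to q(y1^(k), y2^(k); .)."""
    outputs = []
    for y1, y2 in inputs:
        dist = q[(y1, y2)]
        outputs.append(rng.choices(list(dist), weights=list(dist.values()))[0])
    return outputs

print(transmit_segment([(0, 0), (0, 1), (1, 1)]))  # → [0, 1, 2]
```

Because this particular channel is deterministic, the sampled segment is just the letterwise sum; for a noisy kernel the same code draws each output letter independently, which is exactly the memorylessness expressed by the product formula.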

The intuitive interpretation is that each input and each output of a multi-user channel is situated in a different terminal (end), i.e. there are $s + r$ terminals in total. This means that a transmitter or receiver situated in some terminal cannot use information known to the transmitters or the receivers of other terminals. Multi-user channels possessing this property are often called pure channels, in contrast to mixed multi-user channels, for which there exist terminals containing simultaneously several inputs and outputs of the channel. The complexity of the analysis of mixed multi-user channels is due to the fact that the transmitters of some terminal can, in selecting the next signal to be transmitted, use the information obtained at that moment by all the receivers of the given terminal; the receivers of this terminal can in turn use all the information available in the terminal at the given moment.

The most general problem in information transmission along a pure memoryless multi-user channel, as intuitively described above, is the following. Suppose that there are $s(2^r - 1)$ discrete stationary sources of information $U(i, \Delta)$ (cf. Information, source of), $i = 1, \dots, s$, $\Delta \in D$, where $D$ is the set of all non-empty subsets of the index set $\{1, \dots, r\}$, generating the messages $\xi(i, \Delta) = \{\xi_k(i, \Delta) : k = \dots, -1, 0, 1, \dots\}$, where the individual components $\xi_k(i, \Delta)$ of the message take values in some set $X(i, \Delta)$ of volume $M(i, \Delta)$; $\xi(i, \Delta)$ can be regarded as the message intended for transmission from the $i$-th input of the channel to all outputs with indices $j \in \Delta$. The message at the $j$-th output, $j = 1, \dots, r$, is a set of random processes $\{\widetilde\xi(i, \Delta; j) : i = 1, \dots, s, \text{ and } \Delta \text{ such that } j \in \Delta\}$, where

$$\widetilde\xi(i, \Delta; j) = \{\widetilde\xi_k(i, \Delta; j) : k = \dots, -1, 0, 1, \dots\}$$

and the components $\widetilde\xi_k(i, \Delta; j)$ take values in $X(i, \Delta)$. Let segments of messages

$$\{\xi^L(i, \Delta) = (\xi_1(i, \Delta), \dots, \xi_L(i, \Delta)) : i = 1, \dots, s;\ \Delta \in D\}$$

of length $L$ be transmitted over a segment of length $N$ of a memoryless multi-user channel by the following block methods of (en)coding and decoding. The (en)coding is determined by a set of $s$ (en)coding mappings $f_i$ such that

$$f_i : \prod_{\Delta \in D} X(i, \Delta) \rightarrow Y_i^N, \qquad i = 1, \dots, s,$$

($Y_i^N$ being the direct product of $N$ copies of $Y_i$), and the decoding by a set of decoding mappings:

$$g_{j; i, \Delta} : \widetilde{Y}_j^N \rightarrow X(i, \Delta), \qquad j = 1, \dots, r;\ i = 1, \dots, s,$$

and $\Delta$ is such that $j \in \Delta$.

The set of (en)coding functions $\{f_i\}$ establishes a functional dependence between the segments of length $L$ of the messages of all possible sources and the segments of length $N$ of the signals at the inputs of the channel. The set of decoding functions $\{g_{j; i, \Delta}\}$ establishes a functional dependence between the segments of length $N$ of the signals at the outputs of the channel and the segments of length $L$ of the messages generated at each output. If the joint distribution of the segments of length $L$ of the messages, $\xi^L(i, \Delta)$, the (en)coding and decoding functions $\{f_i\}$ and $\{g_{j; i, \Delta}\}$, and the transition probabilities of the memoryless multi-user channel are known, then one can calculate the error probability $p_e$, defined by the formula $p_e = \mathsf{P}\{\xi^L(i, \Delta) \neq \widetilde\xi^L(i, \Delta; j) \text{ for some } i = 1, \dots, s;\ j = 1, \dots, r, \text{ and } \Delta \text{ such that } j \in \Delta\}$.
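In the simplest special case $s = r = 1$, the error probability $p_e$ of a concrete block code can be computed exactly by summing the channel's transition probabilities over all message/output pairs on which decoding fails. The sketch below assumes a binary symmetric channel and a length-3 repetition code with majority decoding; both are illustrative choices, not part of the article.

```python
import itertools

p = 0.1  # crossover probability of the assumed binary symmetric channel

def q(y, yt):
    """Single-letter transition probability q(y; yt) of the BSC."""
    return 1 - p if y == yt else p

# Block code with L = 1, N = 3: the encoder f repeats the message bit,
# the decoder g takes a majority vote.
f = lambda m: (m, m, m)
g = lambda yt: int(sum(yt) >= 2)

# p_e = P{ decoded message != sent message }, messages equiprobable:
# sum the product transition probability over all failing output words.
p_e = 0.0
for m in (0, 1):
    codeword = f(m)
    for yt in itertools.product((0, 1), repeat=3):
        prob = 1.0
        for a, b in zip(codeword, yt):
            prob *= q(a, b)
        if g(yt) != m:
            p_e += 0.5 * prob

print(p_e)  # 3 p^2 (1-p) + p^3 = 0.028
```

Decoding fails exactly when two or three channel letters are flipped, which recovers the closed form $3p^2(1-p) + p^3$; for genuinely multi-user channels the same exhaustive sum runs over all input and output words of all terminals.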

A set $\mathfrak{R}$ of sets (vectors) of rates $R = \{R(i, \Delta) : i = 1, \dots, s;\ \Delta \in D\}$ in $s(2^r - 1)$-dimensional Euclidean space is called the capacity region of the channel considered here if for any $\epsilon > 0$ there exist an integer $N$, (en)codings $\{f_i\}$ and decodings $\{g_{j; i, \Delta}\}$ such that

$$\frac{L \log M(i, \Delta)}{N} \geq R(i, \Delta) - \epsilon, \qquad p_e \leq \epsilon.$$

The problem of describing the region $\mathfrak{R}$ is one of the main problems in the theory of multi-user channels. In the general case this problem is unsolved. A complete solution has been obtained only in certain special cases, for example for multiple-access channels (that is, multi-user channels with $r = 1$) and for a certain class of broadcast channels (that is, channels with $s = 1$).
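For the multiple-access case $r = 1$ with two independent inputs, the solved capacity region is the (convex closure of the) set of pairs satisfying $R_1 \leq I(X_1; \widetilde{Y} \mid X_2)$, $R_2 \leq I(X_2; \widetilde{Y} \mid X_1)$, $R_1 + R_2 \leq I(X_1, X_2; \widetilde{Y})$. The sketch below evaluates these bounds for the binary adder channel with independent uniform inputs, a standard worked example assumed here for illustration.

```python
from collections import defaultdict
from math import log2

# Binary adder multiple-access channel: Y = X1 + X2,
# with X1, X2 independent and uniform on {0, 1}.
joint = defaultdict(float)  # P(x1, x2, y)
for x1 in (0, 1):
    for x2 in (0, 1):
        joint[(x1, x2, x1 + x2)] += 0.25

def H(dist):
    """Entropy in bits of a probability dict."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(keep):
    """Marginal distribution over the coordinates listed in `keep`."""
    d = defaultdict(float)
    for (x1, x2, y), p in joint.items():
        d[tuple((x1, x2, y)[i] for i in keep)] += p
    return dict(d)

# Since the channel is deterministic, I(X1,X2; Y) = H(Y), and given X2
# the output determines X1, so I(X1; Y | X2) = H(X1) (symmetrically for X2).
I_sum = H(marginal((2,)))  # H(Y) for Y ~ {1/4, 1/2, 1/4}: 1.5 bits
I_1 = H(marginal((0,)))    # I(X1; Y | X2) = H(X1) = 1 bit
I_2 = H(marginal((1,)))    # I(X2; Y | X1) = H(X2) = 1 bit

print(I_1, I_2, I_sum)  # → 1.0 1.0 1.5
```

The three bounds cut out a pentagon in the $(R_1, R_2)$-plane: each user alone can send 1 bit per channel use, but the sum rate is limited to 1.5 bits, so both users cannot simultaneously transmit at their individual maxima.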

#### References

[1] C. Shannon, "Papers on information theory and cybernetics", Moscow (1963) (In Russian; translated from English)
[2] E.C. van der Meulen, "Advances in multi-user communication channels", Proc. 1975 IEEE-USSR Joint Workshop Inform. Theory (Moscow, 15–19 Dec. 1975), IEEE (1976) pp. 227–247
[3] T.M. Cover, "Broadcast channels", IEEE Trans. Inform. Theory, 18 : 1 (1972) pp. 2–14
[4] D. Slepian, J.K. Wolf, "A coding theorem for multiple access channels with correlated sources", Bell System Techn. J., 52 : 7 (1973) pp. 1037–1076
[5] I. Csiszár, J. Körner, "Information theory. Coding theorems for discrete memoryless systems", Akad. Kiadó (1981)
This article was adapted from an original article by R.L. Dobrushin and V.V. Prelov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.