Coding for interactive communication
L.J. Schulman
Proceedings of 1995 IEEE International Symposium on Information Theory
Let the input to a computation problem be split between two processors connected by a communication link, and let an interactive protocol be known by which, on any input, the processors can solve the problem using no more than T transmissions of bits between them, provided the channel is noiseless in each direction. We study the following question: if in fact the channel is noisy, what is the effect upon the number of transmissions needed in order to solve the computation problem reliably?
Technologically this concern is motivated by the increasing importance of communication as a resource in computing, and by the tradeoff in communications equipment between bandwidth, reliability and expense. We treat a model with random channel noise. We describe a deterministic method for simulating noiseless-channel protocols on noisy channels, with only a constant slow-down. This is an analog, for general interactive protocols, of Shannon's coding theorem, which deals only with data transmission, i.e. one-way protocols. We cannot use Shannon's block coding method because the bits exchanged in the protocol are determined only one at a time, dynamically, in the course of the interaction. Instead we describe a simulation protocol using a new kind of code, explicit tree codes.

Let the input to a computation problem be split between two processors connected by a communication link, and let an interactive protocol be known by which, on any input, the processors can solve the problem using no more than T transmissions of bits between them, provided the channel is noiseless in each direction. We study the following question: if in fact the channel is noisy, what is the effect upon the number of transmissions needed in order to solve the computation problem reliably?

We focus on a model in which the channel suffers random noise. In this case an upper bound on the number of transmissions necessary is provided by the simple protocol which repeats each transmission of the noiseless-channel protocol many times (and decodes each transmission by a vote). For this simulation to succeed, each transmission must be repeated enough times that no simulated transmission goes wrong; hence the length of the simulation increases as a superlinear function of T, even if only a constant probability of error is desired. Said another way, the "rate" of the simulation (number of protocol steps simulated per transmission) goes to zero as T increases.

Shannon considered this matter in 1948 in his seminal study of communication [29]. Shannon studied the case of "one-way" communication problems, i.e. data transmission. His fundamental observation was that coding schemes which did not treat each bit separately, but jointly encoded large blocks of data into long codewords, could achieve very small error probability (exponentially small in T), while slowing down by only a constant factor relative to the T transmissions required by the noiseless-channel protocol (which can simply send the bits one by one). The constant (ratio of noiseless to noisy communication time) is a property of the channel, and is known as the Shannon capacity of the channel. The improvement in communication rate provided by Shannon's insight is dramatic: the naive protocol can only achieve the same error probability by repeating each bit a number of times proportional to the length of the entire original protocol (for a total of $\Theta(T^2)$ communications). A precise statement is captured in Shannon's coding theorem:

Theorem 1 (Shannon) Let a binary symmetric channel of capacity $C$ be given. For every $T$ and every $\varepsilon > 0$ there exists a code $\sigma : \{0,1\}^T \to \{0,1\}^{\frac{T}{C}(1+\varepsilon)}$ and a decoding map $\sigma' : \{0,1\}^{\frac{T}{C}(1+\varepsilon)} \to \{0,1\}^T$ such that every codeword transmitted across the channel is decoded correctly with probability $1 - e^{-\Omega(\varepsilon T)}$.

The original proof of such a theorem appears in [29] (formulated already for the more general class of Markov channels).
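To make the vanishing rate of the naive simulation concrete, the following is a minimal Python sketch (an illustration under assumed parameters, not code from the paper; the crossover probability p = 0.1, the repetition counts, and all function names are invented for the example). Each bit of the noiseless-channel protocol is sent r times over a binary symmetric channel and decoded by majority vote; a Chernoff bound puts the per-step error at e^{-Ω(r)}, so keeping the overall error constant over T steps already forces r = Ω(log T), and matching the e^{-Ω(T)} reliability of Theorem 1 forces r = Ω(T), i.e. Θ(T²) transmissions in all.

import random

def bsc(bit, p):
    # One use of a binary symmetric channel: flip the bit with probability p.
    return bit ^ (random.random() < p)

def send_repeated(bit, r, p):
    # Naive simulation of one noiseless-channel transmission:
    # send the bit r times (r odd) and decode by majority vote.
    return int(sum(bsc(bit, p) for _ in range(r)) > r // 2)

def naive_simulation_error(T, r, p, trials=200):
    # Empirical probability that at least one of the T simulated
    # transmissions is decoded wrongly, i.e. the whole protocol fails.
    bad = 0
    for _ in range(trials):
        msg = [random.randint(0, 1) for _ in range(T)]
        if any(send_repeated(b, r, p) != b for b in msg):
            bad += 1
    return bad / trials

if __name__ == "__main__":
    # As T grows, a fixed r no longer suffices: the repetition count,
    # and hence the slow-down 1/rate = r, must grow with T.
    for T in (10, 100, 1000):
        for r in (5, 9, 15):
            print(T, r, naive_simulation_error(T, r, p=0.1))

In the output, the failure probability at any fixed r climbs toward 1 as T grows, which is exactly the rate-goes-to-zero behavior described above; Shannon's block codes, and the tree codes of this paper in the interactive case, avoid paying the factor r per bit.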
The ramifications of this insight for data transmission have been explored in coding theory and information theory. Recently, in computer science, communication has come to be critical to distributed computing, parallel computing, and the performance of VLSI chips. In these contexts interaction is an essential part of the communication process. If the environment is noisy, it is necessary to be able to sustain interactive communication in the presence of that noise. Noise afflicts interactive communications just as it does the one-way communications considered by Shannon, and for much the same reasons: physical devices are by nature noisy, and there is often a significant cost associated with making them so reliable that the noise can be ignored (such as by providing very strong transmitters, or using circuits cooled to very low temperatures). In order to mitigate such costs we must design our systems to operate reliably even in the presence of some noise. The ability to transmit data in the presence of noise, the subject of Shannon's and subsequent work, is a necessary but far from sufficient condition for sustained interaction and computation. For this reason we will be concerned with the problem of achieving simultaneously high communication rate and high reliability, in arbitrary interactive protocols.
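As background for the construction named in the abstract (the following is the standard definition from the interactive-coding literature that grew out of this work, paraphrased here rather than quoted from this paper): a $d$-ary tree code over an alphabet $\Sigma$ with distance parameter $\alpha \in (0,1]$ is a labeling $W$ of the edges of the infinite rooted $d$-ary tree such that for any two nodes $v, w$ at the same depth whose least common ancestor lies $\ell$ levels above them, the words $W(v), W(w) \in \Sigma^{\ell}$ formed by the last $\ell$ edge labels along the paths to $v$ and $w$ satisfy
$$\Delta\bigl(W(v), W(w)\bigr) \ge \alpha \ell,$$
where $\Delta$ denotes Hamming distance. Each party encodes its next protocol bit as one more step down the tree, so any two transcripts that diverge remain far apart from the point of divergence onward; roughly, this is the property that makes a constant-factor simulation possible even though the bits to be sent are determined only dynamically.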
doi:10.1109/isit.1995.550439
fatcat:oqzju65ienhjhdwqzakvdw2jzy