Extremal Problems of Error Exponents and Capacity of Duplication Channels
One of the most striking results of information theory is the channel coding theorem, which characterizes the maximum rate of reliable communication over a noisy channel, known as the channel capacity. In this thesis, we consider two problems emerging from the classic channel coding theorem. First, we study extremal problems of the channel reliability function, the exponent with which the probability of a decoding error vanishes. To this end, we introduce a set of fundamental channels that exhibit significant monotonicity properties, and we invoke the theory of Chebyshev systems to exploit these properties. We show that the binary symmetric channel (BSC) and the binary erasure channel (BEC), both of which belong to the fundamental channels, are the two extremes of the channel reliability function. We also show that, given a rate and a probability of error as a performance target, the BSC (BEC) requires the longest (shortest) code length to achieve that performance.

While the first problem is purely theoretical, the second addresses a challenging practical scenario. The most fundamental assumption of the classic channel coding theorem is that we receive as many symbols as we send. In reality, however, this is not always true; for example, mis-sampling at a conventional receiver may duplicate a symbol. The extra symbol confuses the receiver, since it has no information about the position of the duplication. Such scenarios are collectively known as channels with synchronization errors. Unlike their classic counterparts, little is known about either the capacity or the coding techniques for channels with synchronization errors, even in their simplest forms. In this part, we study the duplication channel by introducing a series expansion for its capacity.
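To make the synchronization difficulty concrete, the following is a minimal sketch (our illustration, not part of the thesis) of an i.i.d. duplication channel in which each transmitted symbol is independently repeated once with probability p; the function name and parameters are our own:

```python
import random

def duplication_channel(bits, p, rng=None):
    """Pass a symbol sequence through an i.i.d. duplication channel:
    each symbol is independently repeated once with probability p.
    The receiver observes a longer sequence with no markers indicating
    which symbols were duplicated."""
    rng = rng or random.Random()
    out = []
    for b in bits:
        out.append(b)
        if rng.random() < p:
            out.append(b)  # duplicated symbol; its position is unknown to the receiver
    return out

# Example: within a run of identical symbols, the duplication
# position is inherently ambiguous to the receiver.
sent = [0, 1, 1, 0]
received = duplication_channel(sent, p=0.5)
```

Note that a received run such as [1, 1, 1] could arise from duplicating either of two transmitted 1s, which is precisely why the receiver "has no clue" about the duplication position.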