Noise is an inherent property of every communication system found in nature. It acts against the very purpose of communication, namely the delivery of data to its destination with the least possible distortion. This creates a problem that has been addressed by various disciplines over the past century. In particular, information theory studies the maximum rate achievable by an ideal system under certain assumptions about the noise generation and the structural design constraints. The study of such questions, initiated by Claude Shannon in 1948, has typically been carried out in the asymptotic limit of an infinite number of signaling degrees of freedom (blocklength).
At the same time, the increasing focus on latency and delay (as in audio and video streaming) and the advent of modern sparse-graph codes require characterizing the fundamental limits non-asymptotically, i.e., for blocklengths on the order of 1000. A systematic study of these practically motivated questions necessitates the development of new theoretical tools and techniques, which is the subject of this work. In particular, the precise non-asymptotic results obtained here demonstrate that in many engineering problems a significant back-off from the (Shannon) capacity is incurred at finite blocklengths.
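To give a sense of the magnitude of this back-off, the sketch below evaluates the so-called normal approximation, C − √(V/n)·Q⁻¹(ε), a standard finite-blocklength estimate of the maximum achievable rate, where C is the capacity, V the channel dispersion, and Q⁻¹ the inverse Gaussian tail function. The specific channel (a binary symmetric channel with crossover probability 0.11) and the parameters n = 1000, ε = 10⁻³ are illustrative choices, not taken from the text.

```python
# Illustrative sketch: normal approximation to the maximal coding rate
# over a binary symmetric channel BSC(p). All parameter values below are
# assumptions chosen for illustration.
from math import log2, sqrt
from statistics import NormalDist


def bsc_normal_approximation(p: float, n: int, eps: float) -> tuple[float, float]:
    """Return (capacity, approximate max rate) for BSC(p) at blocklength n
    and target block error probability eps."""
    h = -p * log2(p) - (1 - p) * log2(1 - p)       # binary entropy h(p)
    C = 1 - h                                       # Shannon capacity, bits/channel use
    V = p * (1 - p) * log2((1 - p) / p) ** 2        # channel dispersion
    Qinv = NormalDist().inv_cdf(1 - eps)            # inverse Q-function, Q^{-1}(eps)
    R = C - sqrt(V / n) * Qinv                      # normal approximation
    return C, R


C, R = bsc_normal_approximation(p=0.11, n=1000, eps=1e-3)
print(f"capacity: {C:.3f} bit/use, approx. rate at n=1000: {R:.3f} bit/use")
```

For these parameters the approximate rate falls well short of capacity (roughly a 15–20% gap), illustrating why asymptotic (capacity-based) predictions can be misleading at practical blocklengths.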
Knowledge of the behavior of the fundamental limits in the non-asymptotic regime enables the analysis of many related questions, such as the assessment of the suboptimality of modern codes, energy efficiency, the benefits of feedback, and the effects of dynamically varying channel state, fading, etc. As a result, it is shown that in several instances classical (asymptotics-based) conclusions do not hold under this more refined approach.