1 Communication Systems and Information Theory
1.1 Introduction
1.2 Source Models and Source Coding
1.3 Channel Models and Channel Coding
Historical Notes and References
2 A Measure of Information
2.1 Discrete Probability: Review and Notation
2.2 Definition of Mutual Information
2.3 Average Mutual Information and Entropy
2.4 Probability and Mutual Information for Continuous Ensembles
2.5 Mutual Information for Arbitrary Ensembles
Summary and Conclusions
Historical Notes and References
3 Coding for Discrete Sources
3.1 Fixed-Length Codes
3.2 Variable-Length Code Words
3.3 A Source Coding Theorem
3.4 An Optimum Variable-Length Encoding Procedure
3.5 Discrete Stationary Sources
3.6 Markov Sources
Summary and Conclusions
Historical Notes and References
4 Discrete Memoryless Channels and Capacity
4.1 Classification of Channels
4.2 Discrete Memoryless Channels
4.3 The Converse to the Coding Theorem
4.4 Convex Functions
4.5 Finding Channel Capacity for a Discrete Memoryless Channel
4.6 Discrete Channels with Memory
Indecomposable Channels
Summary and Conclusions
Historical Notes and References
Appendix 4A
5 The Noisy-Channel Coding Theorem
5.1 Block Codes
5.2 Decoding Block Codes
5.3 Error Probability for Two Code Words
5.4 The Generalized Chebyshev Inequality and the Chernoff Bound
5.5 Randomly Chosen Code Words
5.6 Many Code Words: The Coding Theorem
Properties of the Random Coding Exponent
5.7 Error Probability for an Expurgated Ensemble of Codes
5.8 Lower Bounds to Error Probability
Block Error Probability at Rates above Capacity
5.9 The Coding Theorem for Finite-State Channels
State Known at Receiver
Summary and Conclusions
Historical Notes and References
Appendix 5A
Appendix 5B
6 Techniques for Coding and Decoding
6.1 Parity-Check Codes
Generator Matrices
Parity-Check Matrices for Systematic Parity-Check Codes
Decoding Tables
Hamming Codes
6.2 The Coding Theorem for Parity-Check Codes
6.3 Group Theory
Subgroups
Cyclic Subgroups
6.4 Fields and Polynomials
Polynomials
6.5 Cyclic Codes
6.6 Galois Fields
Maximal Length Codes and Hamming Codes
Existence of Galois Fields
6.7 BCH Codes
Iterative Algorithm for Finding σ(D)
6.8 Convolutional Codes and Threshold Decoding
6.9 Sequential Decoding
Computation for Sequential Decoding
Error Probability for Sequential Decoding
6.10 Coding for Burst Noise Channels
Cyclic Codes
Convolutional Codes
Summary and Conclusions
Historical Notes and References
Appendix 6A
Appendix 6B
7 Memoryless Channels with Discrete Time
7.1 Introduction
7.2 Unconstrained Inputs
7.3 Constrained Inputs
7.4 Additive Noise and Additive Gaussian Noise
Additive Gaussian Noise with an Energy Constrained Input
7.5 Parallel Additive Gaussian Noise Channels
Summary and Conclusions
Historical Notes and References
8 Waveform Channels
8.1 Orthonormal Expansions of Signals and White Gaussian Noise
Gaussian Random Processes
Mutual Information for Continuous-Time Channels
8.2 White Gaussian Noise and Orthogonal Signals
Error Probability for Two Code Words
Error Probability for Orthogonal Code Words
8.3 Heuristic Treatment of Capacity for Channels with Additive Gaussian Noise and Bandwidth Constraints
8.4 Representation of Linear Filters and Nonwhite Noise
Filtered Noise and the Karhunen-Loève Expansion
Low-Pass Ideal Filters
8.5 Additive Gaussian Noise Channels with an Input Constrained in Power and Frequency
8.6 Fading Dispersive Channels
Summary and Conclusions
Historical Notes and References
9 Source Coding with a Fidelity Criterion
9.1 Introduction
9.2 Discrete Memoryless Sources and Single-Letter Distortion Measures
9.3 The