Information Theory and Reliable Communication (English Edition)


  • Word count: 588
  • Publisher: World Publishing Corporation (世界图书出版公司)
  • Author: Robert G. Gallager (USA); responsible editors: Chen Liang, Xia Dan
  • Barcode (ISBN): 9787519275945
  • Edition: 1
  • Format: 16mo (16开)
  • Pages: 588
  • Year of publication: 2020
  • Printing: 1
List price: ¥148
About the Book
Information Theory and Reliable Communication is a bible of information theory by Robert G. Gallager, a towering, Nobel-caliber figure in the information field; generation after generation of information theorists have grown up reading this world classic. At MIT the author studied under Claude E. Shannon, the founder of information theory, and under two of the earliest Shannon Award winners, Robert M. Fano and Peter Elias. He has taught at MIT ever since completing his doctorate and is widely regarded as the greatest information theorist after Shannon. The low-density parity-check (LDPC) codes he proposed in his 1960 doctoral thesis are the channel codes used in every 5G device today. Much of the material in Information Theory and Reliable Communication consists of original results first presented by the author, results that greatly advanced the development of information theory. The book studies in depth the mathematical models of sources and channels in communication systems and explores a framework for building detailed models of real-world sources and channels. The author then develops the principles of information theory by splitting the encoder and the decoder each into two parts, and examines the mechanisms that make up an effective communication system. The book is suitable as a textbook for information theory courses for advanced undergraduates and graduate students in electrical engineering, computer science, and mathematics, and as a reference for researchers and professionals. The "Shannon Classics in Information Science" series has also published two other famous works by Professor Gallager: Principles of Digital Communication and Data Networks (2nd Edition).
Table of Contents
1 Communication Systems and Information Theory
  1.1 Introduction
  1.2 Source Models and Source Coding
  1.3 Channel Models and Channel Coding
  Historical Notes and References
2 A Measure of Information
  2.1 Discrete Probability: Review and Notation
  2.2 Definition of Mutual Information
  2.3 Average Mutual Information and Entropy
  2.4 Probability and Mutual Information for Continuous Ensembles
  2.5 Mutual Information for Arbitrary Ensembles
  Summary and Conclusions
  Historical Notes and References
3 Coding for Discrete Sources
  3.1 Fixed-Length Codes
  3.2 Variable-Length Code Words
  3.3 A Source Coding Theorem
  3.4 An Optimum Variable-Length Encoding Procedure
  3.5 Discrete Stationary Sources
  3.6 Markov Sources
  Summary and Conclusions
  Historical Notes and References
4 Discrete Memoryless Channels and Capacity
  4.1 Classification of Channels
  4.2 Discrete Memoryless Channels
  4.3 The Converse to the Coding Theorem
  4.4 Convex Functions
  4.5 Finding Channel Capacity for a Discrete Memoryless Channel
  4.6 Discrete Channels with Memory; Indecomposable Channels
  Summary and Conclusions
  Historical Notes and References
  Appendix 4A
5 The Noisy-Channel Coding Theorem
  5.1 Block Codes
  5.2 Decoding Block Codes
  5.3 Error Probability for Two Code Words
  5.4 The Generalized Chebyshev Inequality and the Chernoff Bound
  5.5 Randomly Chosen Code Words
  5.6 Many Code Words - The Coding Theorem; Properties of the Random Coding Exponent
  5.7 Error Probability for an Expurgated Ensemble of Codes
  5.8 Lower Bounds to Error Probability; Block Error Probability at Rates above Capacity
  5.9 The Coding Theorem for Finite-State Channels; State Known at Receiver
  Summary and Conclusions
  Historical Notes and References
  Appendix 5A
  Appendix 5B
6 Techniques for Coding and Decoding
  6.1 Parity-Check Codes; Generator Matrices; Parity-Check Matrices for Systematic Parity-Check Codes; Decoding Tables; Hamming Codes
  6.2 The Coding Theorem for Parity-Check Codes
  6.3 Group Theory; Subgroups; Cyclic Subgroups
  6.4 Fields and Polynomials; Polynomials
  6.5 Cyclic Codes
  6.6 Galois Fields; Maximal Length Codes and Hamming Codes; Existence of Galois Fields
  6.7 BCH Codes; Iterative Algorithm for Finding σ(D)
  6.8 Convolutional Codes and Threshold Decoding
  6.9 Sequential Decoding; Computation for Sequential Decoding; Error Probability for Sequential Decoding
  6.10 Coding for Burst Noise Channels; Cyclic Codes; Convolutional Codes
  Summary and Conclusions
  Historical Notes and References
  Appendix 6A
  Appendix 6B
7 Memoryless Channels with Discrete Time
  7.1 Introduction
  7.2 Unconstrained Inputs
  7.3 Constrained Inputs
  7.4 Additive Noise and Additive Gaussian Noise; Additive Gaussian Noise with an Energy Constrained Input
  7.5 Parallel Additive Gaussian Noise Channels
  Summary and Conclusions
  Historical Notes and References
8 Waveform Channels
  8.1 Orthonormal Expansions of Signals and White Gaussian Noise; Gaussian Random Processes; Mutual Information for Continuous-Time Channels
  8.2 White Gaussian Noise and Orthogonal Signals; Error Probability for Two Code Words; Error Probability for Orthogonal Code Words
  8.3 Heuristic Treatment of Capacity for Channels with Additive Gaussian Noise and Bandwidth Constraints
  8.4 Representation of Linear Filters and Nonwhite Noise; Filtered Noise and the Karhunen-Loeve Expansion; Low-Pass Ideal Filters
  8.5 Additive Gaussian Noise Channels with an Input Constrained in Power and Frequency
  8.6 Fading Dispersive Channels
  Summary and Conclusions
  Historical Notes and References
9 Source Coding with a Fidelity Criterion
  9.1 Introduction
  9.2 Discrete Memoryless Sources and Single-Letter Distortion Measures
  9.3 The
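As a small taste of the material in Chapter 2 (mutual information and entropy) and Chapter 4 (channel capacity), here is a minimal illustrative sketch, not taken from the book itself: it computes the mutual information I(X;Y) = H(Y) - H(Y|X) for a binary symmetric channel, whose capacity 1 - H2(p) is a standard textbook result. All function names and parameter values are ours, chosen for illustration.

```python
import math

def h2(p):
    """Binary entropy function H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_information(p, q=0.5):
    """I(X;Y) in bits for a binary symmetric channel with crossover
    probability p and input distribution P(X=1) = q.

    Uses I(X;Y) = H(Y) - H(Y|X), where H(Y|X) = H2(p) because the
    channel flips each input bit independently with probability p."""
    py1 = q * (1 - p) + (1 - q) * p   # P(Y = 1)
    return h2(py1) - h2(p)
```

With the uniform input q = 0.5 the mutual information is maximized, giving the channel capacity C = 1 - H2(p): a noiseless channel (p = 0) carries 1 bit per use, while a completely random one (p = 0.5) carries none.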

