In this post, we will see the book *Probability and Information* by A. M. Yaglom and I. M. Yaglom.

# About the book

The present book, designed for a wide circle of readers (familiarity with mathematics up to high school level suffices for comprehension of all of its contents), makes, of course, no claim to serve even as an elementary introduction to information theory as a science. We can give here only a preliminary idea of the important practical applications of this theory. Similarly, it will not be possible to deal here with the deeper, purely mathematical problems connected with information theory. The main aim of the authors is much simpler: it consists of acquainting the reader with certain new mathematical ideas which, though not complex, are highly important, and of leading him through these ideas to an understanding of one of the possible ways of employing mathematical methods in modern engineering.

The book was translated from Russian by V. K. Jain and was published in 1983.

Credits to the original uploader.

You can get the book here.

Follow us on The Internet Archive: https://archive.org/details/@mirtitles

Follow us on Twitter: https://twitter.com/MirTitles

Write to us: mirtitles@gmail.com

Fork us at GitLab: https://gitlab.com/mirtitles/

Add new entries to the detailed book catalog here.

# Contents

## CHAPTER 1 Probability

1.1 Definition of Probability. Random Events and Random Variables 1

1.2 Properties of Probability. Addition and Multiplication of Events. Incompatible and Independent Events 7

1.3 Conditional Probability 20

1.4 The Variance of a Random Variable. Chebyshev’s Inequality and the Law of Large Numbers 26

1.5 Algebra of Events and General Definition of Probability 36

## CHAPTER 2 Entropy and Information

2.1 Entropy as a Measure of the Amount of Uncertainty 44

2.2 The Entropy of Compound Events. Conditional Entropy 59

2.3 The Concept of Information 73

2.4 Entropy (revisited). The Determination of Entropy from its Properties 93

## CHAPTER 3 The Solution of Certain Logical Problems by Calculating Information

3.1 Simple Examples 101

3.2 The Counterfeit Coin Problem 108

3.3 Discussion 121

## CHAPTER 4 Application of Information Theory to the Problem of Information Transmission Through Communication Channels

4.1 Basic Concepts. Efficiency of a Code 137

4.2 Shannon-Fano and Huffman Codes. Fundamental Coding Theorem 147

4.3 Entropy and Information of Various Messages Encountered in Practice 177

4.4 Transmission of Information over Noisy Channels 258

4.5 Error-Detecting and Error-Correcting Codes 304

Appendix 1. Properties of Convex Functions 347

Appendix 2. Some Algebraic Concepts 364

Appendix 3. Table of Values of -p log p 392

Appendix 4. Short Table of the Function h(p) = -p log p - (1 - p) log (1 - p) 395

References 397

Name Index 409

Subject Index 413
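
Appendix 4 of the book tabulates the binary entropy function h(p) = -p log p - (1 - p) log (1 - p), with logarithms taken to base 2. For readers who would rather compute it than look it up, here is a minimal Python sketch of that function (the name `h` follows the book's notation; the snippet itself is our illustration, not taken from the book):

```python
import math

def h(p: float) -> float:
    """Binary entropy h(p) = -p*log2(p) - (1-p)*log2(1-p), in bits.

    By the usual convention 0*log 0 = 0, so h(0) = h(1) = 0.
    """
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# A few values like those in the book's short table:
print(round(h(0.5), 4))   # a fair coin toss carries exactly one bit
print(round(h(0.25), 4))
```

The function peaks at p = 0.5, where the outcome is most uncertain, and falls to zero as p approaches 0 or 1 — the qualitative fact that Chapter 2 builds on.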