Mathematical Foundations of Information Theory

by A. Y. Khinchin

Paperback, 2003

Status

Available

Call number

511

Library's review

Indeholder "The Entropy Concept in Probability Theory", " 1. Entropy of Finite Schemes", " 2. The Uniqueness Theorem", " 3. Entropy of Markov chains", " 4. Fundamental Theorems", " 5. Application to Coding Theory", "On the Fundamental Theorems of Information Theory", " Introduction", " Chapter I.
Show More
Elementary Inequalities", " 1. Two generalizations of Shannon's inequality", " 2. Three inequalities of Feinstein", " Chapter II. Ergodic Sources", " 3. Concept of a source. Stationarity. Entropy", " 4. Ergodic Sources", " 5. The E property. McMillan's theorem", " 6. The martingale concept. Doob's theorem", " 7. Auxiliary proposisions", " 8. Proof of McMillan's theorem", " Chapter III. Channels and the sources driving them", " 9. Concept of channel. Noise. Stationarity. Anticipation and memory", " 10. Connection of the channel to the source", " 11. The ergodic case", " Chapter IV. Feinstein's Fundamental Lemma", " 12. Formulation of the problem", " 13. Proof of the lemma", " Chapter V. Shannon's Theorems", " 14. Coding", " 15. The first Shannon theorem", " 16. The second Shannon theorem.", "Conclusion", "References".

"The Entropy Concept in Probability Theory" handler om ???
" 1. Entropy of Finite Schemes" handler om ???
" 2. The Uniqueness Theorem" handler om ???
" 3. Entropy of Markov chains" handler om ???
" 4. Fundamental Theorems" handler om ???
" 5. Application to Coding Theory" handler om ???
"On the Fundamental Theorems of Information Theory" handler om ???
" Introduction" handler om ???
" Chapter I. Elementary Inequalities" handler om ???
" 1. Two generalizations of Shannon's inequality" handler om ???
" 2. Three inequalities of Feinstein" handler om ???
" Chapter II. Ergodic Sources" handler om ???
" 3. Concept of a source. Stationarity. Entropy" handler om ???
" 4. Ergodic Sources" handler om ???
" 5. The E property. McMillan's theorem" handler om ???
" 6. The martingale concept. Doob's theorem" handler om ???
" 7. Auxiliary proposisions" handler om ???
" 8. Proof of McMillan's theorem" handler om ???
" Chapter III. Channels and the sources driving them" handler om ???
" 9. Concept of channel. Noise. Stationarity. Anticipation and memory" handler om ???
" 10. Connection of the channel to the source" handler om ???
" 11. The ergodic case" handler om ???
" Chapter IV. Feinstein's Fundamental Lemma" handler om ???
" 12. Formulation of the problem" handler om ???
" 13. Proof of the lemma" handler om ???
" Chapter V. Shannon's Theorems" handler om ???
" 14. Coding" handler om ???
" 15. The first Shannon theorem" handler om ???
" 16. The second Shannon theorem." handler om ???
"Conclusion" handler om ???
"References" handler om ???

Information theory as a mathematical discipline. Doob, Feinstein, and Shannon.
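
As a small illustration (my own sketch, not taken from the book) of its central quantity: the entropy of a finite scheme, H = -sum p_i log2 p_i, which the first essay defines and the uniqueness theorem characterizes axiomatically, can be computed directly:

import math

def entropy(probs, base=2):
    # Shannon entropy of a finite scheme; terms with p == 0 are
    # dropped, following the convention 0 * log 0 = 0.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# The uniform scheme on four outcomes has maximal entropy (2 bits);
# any skewed scheme carries less.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75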

Publication

Dover Publications Inc. (2003), Paperback, 120 pages

Description

The first comprehensive introduction to information theory, this text explores the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin. Its rigorous treatment addresses the entropy concept in probability theory and fundamental theorems as well as ergodic sources, the martingale concept, anticipation and memory, and other subjects. 1957 edition.

Language

Original language

Russian

Original publication date

1953
1956

Physical description

120 p.; 19.8 cm

ISBN

0486604349 / 9780486604343

Local notes

Cover: Not specified
The cover shows some sine-like curves with coloured overlap
Scanned cover - N650U - 150 dpi
Александр Яковлевич Хинчин (19 July 1894 - 18 November 1959)
The cover says A. I. Khinchin, but I don't know where the I comes from.

Pages

120

Library's rating

Rating

3.7 (9 ratings)

DDC/MDS

511