

Mathematical Foundations of Information Theory - Alexander I. Khinchin


The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin on a rigorous mathematical basis. For the first time, mathematicians, statisticians, physicists, cyberneticists, and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field.
In his first paper, Dr. Khinchin develops the concept of entropy in probability theory as a measure of uncertainty of a finite "scheme," and discusses a simple application to coding theory. The second paper investigates the restrictions previously placed on the study of sources, channels, and codes and attempts "to give a complete, detailed proof of both ... Shannon theorems, assuming any ergodic source and any stationary channel with a finite memory."
Partial Contents: I. The Entropy Concept in Probability Theory -- Entropy of Finite Schemes. The Uniqueness Theorem. Entropy of Markov chains. Application to Coding Theory. II. On the Fundamental Theorems of Information Theory -- Two generalizations of Shannon's inequality. Three inequalities of Feinstein. Concept of a source. Stationarity. Entropy. Ergodic sources. The E property. The martingale concept. Noise. Anticipation and memory. Connection of the channel to the source. Feinstein's Fundamental Lemma. Coding. The first Shannon theorem. The second Shannon theorem.
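The "entropy of a finite scheme" named in the contents is Shannon's measure of the uncertainty of a finite probability distribution. As a minimal sketch (our own illustration, not code from the book), it can be computed in a few lines of Python:

```python
import math

def entropy(probs):
    """Shannon entropy (in bits) of a finite scheme, i.e. a finite
    probability distribution p_1, ..., p_n summing to 1.
    Terms with p = 0 contribute nothing, by convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is the maximally uncertain two-outcome scheme:
print(entropy([0.5, 0.5]))   # 1.0 bit
# A biased coin carries less uncertainty:
print(entropy([0.9, 0.1]))   # about 0.469 bits
```

The uniqueness theorem discussed in the first paper shows that, up to a choice of logarithm base, this formula is the only function satisfying a small set of natural axioms for an uncertainty measure.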

-20%

RRP: 67.89 Lei

This is the manufacturer's recommended price. The product's sale price is shown below.

54.31 Lei

You receive 54 points

You earn loyalty points with every order! 100 loyalty points equal 1 leu. Use them on future purchases!

Delivery in 2-4 weeks

