Information Theory in Mathematica

This work is being written by George E. Hrabovsky, James E. Firmiss, and Dianna M. Hrabovsky.

These files require either Mathematica 8 or later, or the free Mathematica CDF Viewer (you can find that here), though the viewer cannot run the programs.

This is a description of how to do information theory in general, and of how to implement it in Mathematica in particular.

In these files, Eq. (x) means equation x of the chapter you are viewing; Eq. (x.y) means equation y of chapter x.

Chapter One What is Information?

Chapter Two Calculating the Entropy of a String.

Chapter Three Communications Systems and Markov Chains.

Chapter Four What is a Markov Chain?

Chapter Five Three Important Theorems About Markov Chains, and How to Model Them in Mathematica.

Chapter Six Another Important Theorem, This Time on Stationary Finite Markov Chains.

Chapter Seven Transient States.

Chapter Eight Mathematica's Entropy Function, and Representing Markov Chains in Mathematica 9.

Chapter Nine Review of Some Probability Theory as a Prelude to Channels.

Chapter Ten Autocorrelation and Autocovariance.

Chapter Eleven Power Spectra and Fourier Transforms.

Chapter Twelve Probability Distributions and the Central Limit Theorem.

Chapter Thirteen Noise in Physical Systems.

Chapter Fourteen Some Necessary Thermodynamics.

Chapter Fifteen Background for Statistical Mechanics.
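As a small taste of the material, here is a minimal Mathematica sketch of the kind of calculation Chapter Two develops: the Shannon entropy (in bits) of a string, computed from its symbol frequencies. The function name entropy is illustrative only and is not taken from the chapters themselves.

```mathematica
(* Shannon entropy (base 2) of a string: -Sum of p Log2[p] over symbol frequencies. *)
(* An illustrative sketch, not code from the chapters. *)
entropy[s_String] := Module[{p},
  p = N[Tally[Characters[s]][[All, 2]]]/StringLength[s];
  -Total[p Log[2, p]]
]

entropy["abab"]   (* two equally likely symbols give 1 bit per symbol *)
```

Chapter Eight shows that Mathematica also ships a built-in Entropy function that handles this kind of computation directly.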

Click here to return to the home page.