The Nature of Information II

George Hrabovsky

MAST

Introduction

The previous article was concerned primarily with a single sequence of mutually exclusive events, a finite scheme. What happens when more than one such sequence, or scheme, is combined?

Combining Schemes—Mutually Independent Sets

Let’s say that we have two finite schemes,

$$A = \begin{pmatrix} A_1 & A_2 & \cdots & A_n \\ p_1 & p_2 & \cdots & p_n \end{pmatrix}$$

and

$$B = \begin{pmatrix} B_1 & B_2 & \cdots & B_m \\ q_1 & q_2 & \cdots & q_m \end{pmatrix}.$$

Say that the probability of the joint occurrence of the events $A_i$ and $B_j$ is

$$p_{ij} = p_i\, q_j.$$

When this holds for every pair of events, the schemes A and B are said to be mutually independent. In fact, the set of events

$$A_i B_j \qquad (1 \le i \le n,\; 1 \le j \le m),$$ taken with the probabilities $p_i\, q_j$,

forms another finite scheme, which we call AB. The entropy of this combined scheme is

$$H(AB) = H(A) + H(B).$$
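As a quick numerical check of this additivity, here is a short Python sketch; the probability values are invented for illustration and are not part of the original article.

from math import log2

def entropy(probs):
    """Shannon entropy of a finite scheme, in bits (base-2 logarithm)."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Two illustrative finite schemes (probabilities chosen arbitrarily).
A = [0.5, 0.3, 0.2]
B = [0.6, 0.4]

# Joint probabilities of the combined scheme AB when A and B are independent: p_i * q_j.
AB = [p * q for p in A for q in B]

print(entropy(AB))              # ~ 2.4564 bits
print(entropy(A) + entropy(B))  # ~ 2.4564 bits -- the entropies add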

Combining Schemes—Mutually Dependent Sets

So what if A and B are not mutually independent? In that case we write the probability that event $B_j$ of the scheme B occurs, given that event $A_i$ of the scheme A has occurred, as

$$q_{ij} = \Pr(B_j \mid A_i).$$

This gives us the scheme

$$AB = \begin{pmatrix} A_i B_j \\ p_i\, q_{ij} \end{pmatrix}, \qquad 1 \le i \le n,\; 1 \le j \le m,$$

with entropy

$$H(AB) = H(A) + \sum_{i=1}^{n} p_i\, H_i(B), \qquad \text{where } H_i(B) = -\sum_{j=1}^{m} q_{ij} \log q_{ij}.$$

It turns out that $H_i(B)$, the entropy of B computed with the conditional probabilities $q_{ij}$, is a random variable on the scheme A, since its value depends on which event $A_i$ is realized. Its expectation $\sum_{i} p_i H_i(B)$ is called the conditional entropy of B in the scheme A and is also written $H_A(B)$, so that $H(AB) = H(A) + H_A(B)$.
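As an informal sketch of this identity, the following Python snippet builds a small dependent pair of schemes from invented conditional probabilities and verifies that H(AB) = H(A) + H_A(B); all numbers are made up for the example.

from math import log2

def entropy(probs):
    """Shannon entropy of a finite scheme, in bits (base-2 logarithm)."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Scheme A, and for each A_i the conditional probabilities q_ij = P(B_j | A_i).
p = [0.5, 0.5]                  # probabilities of A_1, A_2
q = [[0.9, 0.1],                # P(B_j | A_1)
     [0.2, 0.8]]                # P(B_j | A_2)

# The combined scheme AB has probabilities p_i * q_ij.
joint = [p[i] * q[i][j] for i in range(len(p)) for j in range(len(q[i]))]

# Conditional entropy H_A(B) = sum_i p_i * H_i(B).
H_A_of_B = sum(p[i] * entropy(q[i]) for i in range(len(p)))

print(entropy(joint))           # ~ 1.5955 bits
print(entropy(p) + H_A_of_B)    # ~ 1.5955 bits -- H(AB) = H(A) + H_A(B)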

What Does It All Mean?

The amount of information delivered by the realization of a scheme can only decrease when another scheme is realized beforehand: $H_A(B) \le H(B)$, with equality exactly when the schemes A and B are mutually independent.
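Continuing the same invented numbers as above, here is a short numerical check of this inequality (a sketch, not a derivation).

from math import log2

def entropy(probs):
    """Shannon entropy of a finite scheme, in bits (base-2 logarithm)."""
    return -sum(p * log2(p) for p in probs if p > 0)

p = [0.5, 0.5]                          # scheme A
q = [[0.9, 0.1], [0.2, 0.8]]            # conditional probabilities P(B_j | A_i)

# Unconditional probabilities of the B_j: q_j = sum_i p_i * q_ij.
marginal_B = [sum(p[i] * q[i][j] for i in range(len(p))) for j in range(2)]

H_B = entropy(marginal_B)                                    # ~ 0.9928 bits
H_A_of_B = sum(p[i] * entropy(q[i]) for i in range(len(p)))  # ~ 0.5955 bits

print(H_A_of_B <= H_B)  # True: realizing A first can only lower the remaining uncertainty in B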

