**What is the concept of entropy?**

A communication system transmits a long sequence of symbols from an information source; that is, a communication system does not deal with a single message but with all possible messages.

Thus we are more interested in the average information a source produces than in the information content of a single message.

**Hence entropy is the average information per symbol.**

Or

**Average information per individual message.**

It is a measure of uncertainty in a random variable.

**Assumptions:**

1 ) The source is stationary, so that the symbol probabilities remain constant with time.

2 ) The successive symbols are statistically independent and come from the source at a constant average rate, measured in symbols per second.

**Mathematical expression :**

**H(X) = E[I(x)] = −∑ P(x) log₂P(x) bits/symbol**

Where

**H(X) is known as the entropy of the source X.**
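The formula above can be computed directly. Below is a minimal sketch (the four-symbol probability distribution is an arbitrary example, not from the original text); terms with zero probability contribute nothing, since p log p → 0 as p → 0:

```python
import math

def entropy(probs):
    """Entropy H(X) = -sum P(x) * log2(P(x)), in bits per symbol.
    Symbols with zero probability are skipped (their contribution is 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Example: a four-symbol source with unequal probabilities.
source = [0.5, 0.25, 0.125, 0.125]
print(entropy(source))  # 1.75 bits/symbol
```

Note that a uniform source of N symbols gives the maximum entropy log₂N, e.g., `entropy([0.25] * 4)` gives 2.0 bits/symbol, while a deterministic source (`entropy([1.0])`) gives 0 — no uncertainty, no information.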

**Entropy for a binary source:**

For a binary source emitting symbols 0 and 1 with probabilities p and 1 − p, the entropy reduces to

**H(p) = −p log₂p − (1 − p) log₂(1 − p) bits/symbol**

which is maximum (1 bit/symbol) when p = 1/2, i.e., when both symbols are equally likely.
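The binary entropy function can be sketched as follows (the function name and sample values of p are illustrative, not from the original text):

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits per symbol.
    The degenerate cases p = 0 and p = 1 carry no uncertainty."""
    if p == 0.0 or p == 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0 -- maximum uncertainty
print(binary_entropy(0.1))  # about 0.469 bits/symbol
```

By symmetry H(p) = H(1 − p), and the curve peaks at exactly 1 bit/symbol for p = 1/2.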
