
Saturday, December 24, 2011

What is a Discrete Memoryless Channel?

Definition: Discrete Memoryless Channel
A communication channel may be defined as the path or medium through which the symbols flow to the receiver end. A Discrete Memoryless Channel (DMC) is a statistical model with an input X and an output Y.

During each unit of time (the signaling interval), the channel accepts an input symbol from X and, in response, emits an output symbol from Y. The channel is said to be "discrete" when the alphabets of X and Y are both finite.
It is said to be "memoryless" when the current output symbol depends only on the current input symbol and not on any of the previous inputs.
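
As a small sketch (the binary alphabets and the transition probabilities below are assumed purely for illustration), a DMC can be described by a transition probability matrix P(y|x), and each output symbol is drawn from the current input alone, independently of all previous inputs:

import random

# Assumed example: binary input/output alphabets and a transition matrix.
# Each row gives P(y | x) for one input symbol; every row must sum to 1.
outputs = ['0', '1']
transition = {
    '0': {'0': 0.9, '1': 0.1},   # P(y | x = '0')
    '1': {'0': 0.2, '1': 0.8},   # P(y | x = '1')
}

def send(x):
    # Memoryless: the output depends only on the current input symbol x.
    weights = [transition[x][y] for y in outputs]
    return random.choices(outputs, weights=weights)[0]

# One symbol is accepted and one is emitted in each signaling interval.
message  = ['0', '1', '1', '0']
received = [send(x) for x in message]
print(received)   # e.g. ['0', '1', '0', '0']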




Thursday, December 22, 2011

What is Entropy in Information Theory?

What is the concept of entropy?



A communication system transmits a long sequence of symbols from an information source; that is, a communication system does not deal with a single message but with all possible messages.
Thus we are more interested in the average information that a source produces than in the information content of a single message.
Hence entropy is the average information per symbol, or the average information per individual message.

It is a measure of uncertainty in a random variable.

Assumptions:
1 ) The source is stationary, so that the symbol probabilities remain constant with time.
2 ) The successive symbols are statistically independent and come from the source at a fixed average rate (measured in symbols per second).

Mathematical expression:

H(X) = E[I(x)] = -∑ P(x) log₂ P(x)   bits/symbol
where H(X) is known as the entropy of the source X.
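
As a small illustration (the four-symbol source and its probabilities below are assumed, not taken from the post), the entropy can be computed directly from this formula:

from math import log2

def entropy(probs):
    # H(X) = -sum of P(x) * log2 P(x), in bits per symbol.
    return -sum(p * log2(p) for p in probs if p > 0)

# Assumed source with four symbols; the probabilities must sum to 1.
source = [0.5, 0.25, 0.125, 0.125]
print(entropy(source))   # 1.75 bits/symbol

The equally-likely case gives the largest entropy: for four symbols with probability 0.25 each, H(X) = log₂ 4 = 2 bits/symbol.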



Entropy for a binary source:
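
For a binary source that emits two symbols with probabilities p and (1 - p), the general formula reduces to

H(X) = -p log₂ p - (1 - p) log₂(1 - p)   bits/symbol

This is maximum (1 bit/symbol) when the two symbols are equally likely, i.e. p = 0.5, and approaches 0 as one of the symbols becomes almost certain; for example, at p = 0.1 the entropy is about 0.469 bits/symbol.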





Information as a measure of uncertainty | Information in communication systems

In my previous post, I gave you an introduction to Information Theory. In this post I am going to talk about
information, its unit, its uncertainty, and its mathematical relation.
First, consider the amount of information contained in a message. Have you ever wondered about this?
Let us take an example: suppose you are planning a tour to a city located in an area where rainfall is very rare.
To know about the weather forecast, you call the weather bureau and may receive one of the following messages:
1) It would be hot and sunny.
2) There would be rain.
3) There would be a cyclone.
It may be observed that the amount of information received is clearly different for the three messages. The first message, for instance, contains very little information because the weather in a desert city in summer is expected to be hot and sunny most of the time.
The second message, i.e. rain, contains more information because it is not an event that occurs often.
The third message, i.e. a cyclonic storm, carries the maximum amount of information compared to the previous two, because this event rarely happens in the city.
Thus,
There is an inverse relationship between the probability of an event and the information associated with it.

 I(x) = f[1 / P(x)]
where I(x) represents the information, f means "a function of", and 1/P(x) represents the inverse of the probability.

Properties of the information content I(x):
1)   I(x) = 0 for P(x) = 1
2)   I(x) ≥ 0
3)   I(x_i) > I(x_j) if P(x_i) < P(x_j)
4)   I(x_i x_j) = I(x_i) + I(x_j) if x_i and x_j are statistically independent
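
As a rough numerical sketch of this idea (the probabilities assigned to the three weather forecasts below are assumed for illustration, and the logarithmic measure I(x) = log₂(1/P(x)) is the one consistent with the entropy formula H(X) = E[I(x)] discussed in the entropy post):

from math import log2

def information(p):
    # I(x) = log2(1 / P(x)) bits: rarer events carry more information.
    return log2(1 / p)

# Assumed probabilities for the three forecasts in a desert city.
forecasts = {'hot and sunny': 0.90, 'rain': 0.09, 'cyclone': 0.01}
for message, p in forecasts.items():
    print(message, round(information(p), 2), 'bits')
# hot and sunny 0.15 bits, rain 3.47 bits, cyclone 6.64 bits

The least likely message (the cyclone) carries the most information, matching the inverse relationship above.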
NEXT POST : Entropy and its properties.

What is Information Theory?

Information Theory is a branch of probability theory which can be applied to the study of communication systems.
It is a broad mathematical discipline, and it has made fundamental contributions not only to communications but also to computer science, statistical physics, and probability and statistics.

The concept of Information Theory was developed by communication scientists while they were studying the statistical structure of electronic communication equipment.
A basic Communication system consists of three components:

1 ) Transmitter
2 ) Medium or channel
3 ) Receiver








When the communique or message is measurable, like a current or a voltage, the study of the communication system is very easy; but when the communique is information, the study becomes quite difficult.
So, Information Theory studies all these kinds of questions about information: what its unit is and how it is transferred.

A communication system is non-deterministic, i.e. it has a statistical nature. Thus a communication system has uncertainty, or in other words, it possesses unpredictability.
The amount of information (which is a parameter) associated with an event cannot be determined in advance.

NEXT POST : Information as a measure of uncertainty.

About The author

Himanshu Dureja is an engineering student and part-time blogger.