Channel Capacity | Information Theory | Definitions and Formulas

Information theory and the theory of channel capacity play a vital role in communication engineering.

The performance of a communication system is limited by:
1. Available signal power
2. Background noise
3. Bandwidth limits

What is the role of a communication system?

The role of a communication system is to convey, from transmitter to receiver, a sequence of messages selected from a finite number of possible messages. During one specified time interval one of these messages is transmitted; during the next, another; and so on.
1. The possible messages are predetermined and known to the receiver.
2. The message selected for transmission during a particular interval is not known by the receiver in advance.
3. The receiver knows the probability with which the transmitter selects each message.

AMOUNT OF INFORMATION

Suppose the allowable messages (or symbols) are

m1, m2, m3, …

with probabilities of occurrence

P1, P2, P3, …

During transmission, the transmitter selects message mk with probability Pk.

If the receiver correctly identifies the message, then an amount of information Ik, given by

Ik = log2(1/Pk)

is conveyed. Ik is dimensionless but is measured in bits.


The definition of information satisfies a number of useful criteria:
1. Intuitive: the occurrence of a highly probable event carries little information; Ik = 0 for Pk = 1.
2. Positive: information does not decrease upon receiving a message; Ik ≥ 0 for Pk ≤ 1.
3. We gain more information when a less probable message is received: Ik > Il for Pk < Pl.
4. Information is additive if the messages are independent (a numerical sketch follows the equations below):

Ik,l = log2(1/(Pk Pl))

     = log2(1/Pk) + log2(1/Pl)

     = Ik + Il
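These properties can be checked numerically. Below is a minimal Python sketch, assuming base-2 logarithms as in the formulas above; the function name `information` and the example probabilities are illustrative choices, not part of the original text.

```python
import math

def information(p: float) -> float:
    """Information content Ik = log2(1/p), in bits, of a message with probability p."""
    return math.log2(1.0 / p)

# Intuitive: a certain message (p = 1) carries no information.
print(information(1.0))    # 0.0

# Less probable messages carry more information.
print(information(0.5))    # 1.0 bit
print(information(0.125))  # 3.0 bits

# Additive for independent messages: I(k, l) = I(k) + I(l).
pk, pl = 0.5, 0.25
print(information(pk * pl))               # 3.0
print(information(pk) + information(pl))  # 3.0
```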

ENTROPY OR AVERAGE INFORMATION:

Average information is referred to as entropy.

Suppose we have M different independent messages and a long sequence of L messages is generated. In this L-message sequence we expect P1L occurrences of m1, P2L occurrences of m2, and so on.

The total information in the sequence is
Itotal = P1L log2(1/P1) + P2L log2(1/P2) + P3L log2(1/P3) + …

So the average information per message interval will be

H = Itotal/L

  = [P1L log2(1/P1) + P2L log2(1/P2) + P3L log2(1/P3) + …]/L

  = Σ(k=1 to M) Pk log2(1/Pk)

If all the probabilities are equal (Pk = 1/M), then the entropy is at its maximum:

Hmax = Σ(k=1 to M) (1/M) log2(M)

     = log2(M)
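A short Python sketch of the entropy formula, assuming the message probabilities sum to 1; the distributions below are illustrative examples only.

```python
import math

def entropy(probs):
    """Average information H = sum of p * log2(1/p), in bits per message."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# M = 4 equally likely messages: entropy reaches its maximum, log2(4) = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0

# A skewed distribution carries less average information per message.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```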

Information Rate:

If the source generates messages at the rate of r messages per second, the average number of bits per second is

R = rH

For a sampled source, r equals s, the number of samples per second. If each sample carries n bits of information (H = n), then R = n·s, where
  n = number of bits per sample
  s = number of samples per second

Channel Efficiency:

η = (R/Rmax) × 100 %
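A small Python sketch tying these formulas together; the sampling rate, number of levels, and Rmax are assumed example values, not from the original text.

```python
import math

# Hypothetical sampled source: s samples per second, each sample drawn
# from M equally likely levels, so H = log2(M) bits per sample.
s = 8000          # samples per second (assumed)
M = 256           # quantization levels (assumed)
H = math.log2(M)  # bits per sample (n in the notation above)

R = s * H         # information rate R = rH in bits per second
print(R)          # 64000.0

# Channel efficiency against an assumed maximum rate Rmax.
R_max = 128_000.0
eta = (R / R_max) * 100
print(f"{eta:.1f} %")  # 50.0 %
```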



Channel capacity:

A source sends r messages per second, and the entropy of a message is H bits per message, so the information rate is R = rH bits per second. One can intuitively reason that, for a given communication system, as the information rate increases, the number of errors per second will also increase.

Shannon’s Theorem:

1. A given communication system has a maximum rate of information C, known as the channel capacity.

2. If the information rate R is less than C, then one can approach arbitrarily small error probabilities by using intelligent coding techniques.

3. To get lower error probabilities, the encoder has to work on longer blocks of signal data. This entails longer delays and higher computational requirements.

Thus, if R ≤ C, then transmission may be accomplished without error in the presence of noise.

Unfortunately, Shannon's theorem is not a constructive proof; it merely states that such a coding method exists.

If R > C, then errors cannot be avoided regardless of the coding technique used.

 SHANNON-HARTLEY THEOREM:

The Shannon-Hartley theorem states that the channel capacity is given by

C = B log2(1 + S/N)

Where C is the capacity in bits per second
      B is the bandwidth of the channel in Hz
      S/N is the signal-to-noise ratio (as a linear power ratio, not in dB)
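A worked example in Python, assuming the classic telephone-channel figures of B = 3100 Hz and a 30 dB signal-to-noise ratio (illustrative values, not from the original text); note the conversion from dB to a linear ratio before applying the formula.

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Channel capacity C = B * log2(1 + S/N), with S/N supplied in dB."""
    snr_linear = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# B = 3100 Hz, S/N = 30 dB (i.e., a power ratio of 1000)
print(round(shannon_capacity(3100, 30)))  # about 30898 bits per second
```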

NYQUIST THEOREM:

C = 2B log2(M)

where M is the number of discrete signal levels and B is the bandwidth in Hz; this is the maximum rate of a noiseless channel.
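A companion Python sketch of the Nyquist formula, using the same assumed 3100 Hz bandwidth as above with four signal levels (illustrative values only).

```python
import math

def nyquist_capacity(bandwidth_hz: float, levels: int) -> float:
    """Noiseless channel capacity C = 2 * B * log2(M) for M signal levels."""
    return 2 * bandwidth_hz * math.log2(levels)

# B = 3100 Hz, M = 4 levels (2 bits per symbol)
print(nyquist_capacity(3100, 4))  # 12400.0 bits per second
```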
