# Shannon Limit for Information Capacity

INFORMATION CAPACITY

The maximum rate at which data can be transferred through a channel without error is known as the information capacity. It is measured in bits per second.

The maximum rate at which reliable communication can take place over the channel is an important figure of merit when evaluating the performance of a digital communication system.

SHANNON’S LIMIT FOR INFORMATION CAPACITY

Shannon-Hartley capacity theorem:

The fundamental limit on the rate of error-free transmission for a power-limited, band-limited Gaussian channel is defined by Shannon's channel capacity theorem. The information capacity C of a channel affected by additive white Gaussian noise (AWGN) is determined by the average received signal power S, the average noise power N, and the bandwidth B. The Shannon-Hartley theorem gives the information capacity as follows:

C = B log2(1 + S/N), bits/s ------------------------ (1)

The noise power may be rewritten as N = N0B, where N0 denotes the noise power spectral density (noise power per unit bandwidth). As a result, the theorem may be expressed as

C = B log2(1 + S/(N0B)), bits/s ------------------------- (2)
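As a concrete illustration of equations (1) and (2), the sketch below evaluates the capacity formula in Python. The helper names, the telephone-channel bandwidth of 3.1 kHz, and the 30 dB SNR figure are illustrative assumptions, not part of the original text.

```python
import math

def shannon_capacity(bandwidth_hz, signal_power, noise_power):
    """Equation (1): C = B * log2(1 + S/N), in bits/s."""
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

def shannon_capacity_psd(bandwidth_hz, signal_power, noise_psd):
    """Equation (2): same formula with N = N0 * B."""
    return shannon_capacity(bandwidth_hz, signal_power, noise_psd * bandwidth_hz)

# Assumed example: a 3.1 kHz voice channel with a 30 dB SNR (S/N = 1000)
B = 3100.0
snr_linear = 10 ** (30 / 10)
C = shannon_capacity(B, snr_linear, 1.0)  # S/N passed as a ratio
print(f"C = {C:.0f} bits/s")
```

Note that capacity grows only logarithmically with SNR but linearly with bandwidth, which is why the two forms (1) and (2) are both useful in practice.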

The following are the significance of channel capacity:

(i) If the source's information rate R is less than or equal to the channel capacity C (R ≤ C), then proper coding can be used to achieve reliable (error-free) transmission via the channel.

(ii) It is impossible to create a code that can provide reliable (error-free) transmission via the channel if the information rate R from the source is larger than the channel capacity C (R > C).
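Conditions (i) and (ii) above amount to a simple feasibility test, R ≤ C. A minimal sketch of that test follows; the function name and the example rates (28 kbit/s and 56 kbit/s on a 3.1 kHz, 30 dB channel) are illustrative assumptions.

```python
import math

def transmission_is_reliable(rate_bps, bandwidth_hz, snr_linear):
    """True when R <= C = B * log2(1 + SNR), i.e. error-free coding exists."""
    capacity = bandwidth_hz * math.log2(1 + snr_linear)
    return rate_bps <= capacity

# Capacity of the assumed channel is about 30.9 kbit/s:
print(transmission_is_reliable(28_000, 3100, 1000))  # R < C -> True
print(transmission_is_reliable(56_000, 3100, 1000))  # R > C -> False
```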

In this way, Shannon established fundamental limits on the communication of information, leading to the emergence of a new field known as Information Theory.

Sreejith Hrishikesan

Sreejith Hrishikesan is an ME postgraduate and has worked as an Assistant Professor in the Electronics Department at KMP College of Engineering, Ernakulam.