Saturday, October 20, 2007

Turbo Codes

Until I read an article in Science News I had no idea these even existed. Turbo codes are a type of data encoding that maximizes data transfer over limited bandwidth despite a poor signal-to-noise ratio.

Turbo codes were invented by Alain Glavieux, Claude Berrou and Ramesh Pyndiah of France Telecom Research and Development. Nobody expected these then-unknown French engineers to succeed where so many others had failed. The problem was sending big messages over a limited-bandwidth channel sitting on a high noise bed. The known mathematical limit for such a transfer was "Shannon's limit," and the gap between the best existing codes and that limit was wide indeed.

It was way back in 1948 that Claude Shannon, at Bell Telephone Laboratories, published his landmark paper, and it set the goal for data encoding. Shannon proved that at any given noise level there is a maximum rate at which information can be transmitted with arbitrarily few errors: below that rate, enough well-chosen redundancy guarantees data integrity, and above it no amount of redundancy helps. He is the Shannon in Shannon's limit.
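For the common case of a channel with additive white Gaussian noise, that maximum rate has a compact closed form, the Shannon–Hartley theorem. The numbers below are an illustrative example of mine, not figures from the article:

```latex
C = B \log_2\!\left(1 + \frac{S}{N}\right)
% C: capacity in bits/s, B: bandwidth in Hz, S/N: signal-to-noise power ratio.
% Illustrative example: B = 1 MHz and S/N = 15 (about 11.8 dB) give
% C = 10^6 * log2(16) = 4 Mbit/s.
```

No real code reaches C exactly; the game is getting as close as possible, and before turbo codes the best practical schemes stopped well short of it.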

The answer was turbo codes. This type of data encoding sends two different redundant sub-blocks of parity data along with every message. The transfer runs just a notch below the theoretical Shannon limit. It was the elegant answer, and it was embraced quickly by mobile phones, satellite data, mobile television, and many other forms of data transfer. Eventually it will probably be part of most wireless data transfer.
The way it works is that the data is encoded into three sub-blocks of bits. The first sub-block is the m-bit block of payload data. The second and third sub-blocks each carry n/2 parity bits for the payload data, but computed on two different permutations of it. Each parity stream is computed using a recursive systematic convolutional code (RSC code), and the three sub-blocks are interleaved (arranged non-contiguously) for transmission.
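Here is a minimal sketch of that encoding structure in Python, assuming a small textbook RSC code with octal generators (1, 5/7) and a random permutation as the interleaver; both choices are mine for illustration, and real systems use carefully designed interleavers and terminate the trellis:

```python
import random

def rsc_parity(bits):
    # Rate-1/2 RSC encoder, octal generators (1, 5/7):
    # feedback polynomial 1 + D + D^2, feedforward 1 + D^2.
    s1 = s2 = 0                  # shift register: last two feedback bits
    parity = []
    for u in bits:
        a = u ^ s1 ^ s2          # recursive feedback bit
        parity.append(a ^ s2)    # feedforward taps: 1 and D^2
        s1, s2 = a, s1           # advance the register
    return parity

def turbo_encode(payload, interleaver):
    # Three sub-blocks: the systematic payload, parity on the payload,
    # and parity on a permuted (interleaved) copy of the payload.
    permuted = [payload[i] for i in interleaver]
    return payload, rsc_parity(payload), rsc_parity(permuted)

message = [1, 0, 1, 1, 0, 0, 1, 0]
interleaver = random.sample(range(len(message)), len(message))
systematic, parity1, parity2 = turbo_encode(message, interleaver)
```

The permutation is the whole trick: a noise burst that wrecks neighboring bits in one parity stream leaves those same bits scattered, and mostly intact, in the other.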
The message is sent, and it is decoded with a set of "soft decisions." The decoder produces a number for each bit in the data block. This number is not the bit value itself but a measure of the probability that the bit is a binary 0 or a binary 1. The same process occurs for each redundant block, and the results are processed again, with each decoder using the other's likelihood data to reconcile differences between the blocks over several iterations. The inventors do give credit to the earlier work of Andrew Viterbi, who published his Viterbi decoding algorithm for trellis codes in 1967.
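In practice that per-bit number is a log-likelihood ratio (LLR): its sign is the tentative bit decision and its magnitude is the confidence. A minimal sketch of how the channel produces these soft inputs, assuming BPSK modulation (bit 1 sent as +1.0, bit 0 as -1.0) over a Gaussian-noise channel; the sample values are made up for illustration:

```python
def channel_llrs(received, noise_var):
    # L(u) = ln[ P(u=1 | y) / P(u=0 | y) ] = 2*y / noise_var for BPSK
    # over additive white Gaussian noise. Positive leans toward 1,
    # negative toward 0; the magnitude is how confident the
    # "soft decision" is.
    return [2.0 * y / noise_var for y in received]

# Bits 1, 0, 1 received as noisy samples at noise variance 0.5:
print(channel_llrs([0.9, -1.1, 0.2], 0.5))   # [3.6, -4.4, 0.8]
```

The two component decoders trade refinements of these numbers back and forth, each pass sharpening the other's estimates, until the block settles.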
