Tuesday, February 7, 2012

Forward error correction

In telecommunication, information theory, and coding theory, forward error correction (FEC) or channel coding is a technique used for controlling errors in data transmission over unreliable or noisy communication channels. The central idea is that the sender encodes the message in a redundant way by using an error-correcting code (ECC). The American mathematician Richard Hamming pioneered this field in the 1940s and invented the first error-correcting code in 1950: the Hamming (7,4) code.

The redundancy allows the receiver to detect a limited number of errors that may occur anywhere in the message, and often to correct these errors without retransmission. FEC gives the receiver the ability to correct errors without needing a reverse channel to request retransmission of data, but at the cost of a fixed, higher forward channel bandwidth. FEC is therefore applied in situations where retransmissions are costly or impossible, such as when broadcasting to multiple receivers in multicast. FEC information is usually added to mass storage devices to enable recovery of corrupted data.

FEC processing in a receiver may be applied to a digital bit stream or in the demodulation of a digitally modulated carrier. For the latter, FEC is an integral part of the initial analog-to-digital conversion in the receiver. The Viterbi decoder implements a soft-decision algorithm to demodulate digital data from an analog signal corrupted by noise. Many FEC coders can also generate a bit-error rate (BER) signal which can be used as feedback to fine-tune the analog receiving electronics.

The maximum fraction of errors or of missing bits that can be corrected is determined by the design of the FEC code, so different forward error correction codes are suitable for different conditions.

How it works

FEC is accomplished by adding redundancy to the transmitted information using a predetermined algorithm. A redundant bit may be a complex function of many original information bits. The original information may or may not appear literally in the encoded output; codes that include the unmodified input in the output are systematic, while those that do not are non-systematic.

A simplistic example of FEC is to transmit each data bit 3 times, which is known as a (3,1) repetition code. Through a noisy channel, a receiver might see 8 versions of the output; see table below.

Triplet received    Interpreted as

000                 0 (error free)
001                 0
010                 0
100                 0
111                 1 (error free)
110                 1
101                 1
011                 1

This allows an error in any one of the three samples to be corrected by "majority vote" or "democratic voting". The correcting ability of this FEC is:

Up to 1 bit of triplet in error, or

up to 2 bits of triplet omitted (cases not shown in table).
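The (3,1) repetition scheme above is small enough to sketch directly. The following is a minimal Python illustration (the function names are ours, not from any standard library):

```python
def encode_repetition(bits):
    """(3,1) repetition code: transmit each data bit three times."""
    return [b for b in bits for _ in range(3)]

def decode_repetition(received):
    """Majority vote over each received triplet; corrects any single
    flipped bit per triplet, matching the table above."""
    decoded = []
    for i in range(0, len(received), 3):
        triplet = received[i:i + 3]
        decoded.append(1 if sum(triplet) >= 2 else 0)
    return decoded

codeword = encode_repetition([1, 0])           # [1, 1, 1, 0, 0, 0]
corrupted = [1, 0, 1, 0, 1, 0]                 # one flipped bit per triplet
assert decode_repetition(corrupted) == [1, 0]  # both errors corrected
```

Two errors inside the same triplet, however, outvote the correct bit, which is exactly why this code can only guarantee correction of one error per triplet.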

Though simple to implement and widely used, this triple modular redundancy is a relatively inefficient FEC. Better FEC codes typically examine the last several dozen, or even the last several hundred, previously received bits to determine how to decode the current small handful of bits (typically in groups of 2 to 8 bits).

Averaging noise to reduce errors

FEC could be said to work by "averaging noise"; since each data bit affects many transmitted symbols, the corruption of some symbols by noise usually allows the original user data to be extracted from the other, uncorrupted received symbols that also depend on the same user data.

Because of this "risk-pooling" effect, digital communication systems that use FEC tend to work well above a certain minimum signal-to-noise ratio and not at all below it.

This all-or-nothing tendency (the cliff effect) becomes more pronounced as stronger codes are used that more closely approach the theoretical Shannon limit.

Interleaving FEC coded data can reduce the all-or-nothing properties of transmitted FEC codes when the channel errors tend to occur in bursts. However, this method has limits; it is best applied to narrowband data.
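A simple way to see why interleaving helps with burst errors is the block interleaver: symbols are written into a grid row by row and read out column by column, so a burst of consecutive channel errors lands in widely separated positions after deinterleaving. This is a minimal sketch (function names and grid size are our own choices):

```python
def interleave(bits, rows, cols):
    """Block interleaver: write the stream row by row into a rows x cols
    grid, then read it out column by column."""
    assert len(bits) == rows * cols
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(bits, rows, cols):
    """Inverse permutation: restores the original row-by-row order."""
    assert len(bits) == rows * cols
    return [bits[c * rows + r] for r in range(rows) for c in range(cols)]

# A burst hitting the first three interleaved positions actually came
# from original positions 0, 4 and 8, i.e. three different codewords.
data = list(range(12))
sent = interleave(data, rows=3, cols=4)
assert sent[:3] == [data[0], data[4], data[8]]
assert deinterleave(sent, rows=3, cols=4) == data
```

After deinterleaving, each burst error appears to the FEC decoder as isolated single errors, which is the case it handles best.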

Most telecommunication systems use a fixed channel code designed to tolerate the expected worst-case bit error rate, and then fail to work at all if the bit error rate is ever worse. However, some systems adapt to the given channel error conditions: hybrid automatic repeat-request uses a fixed FEC method as long as the FEC can handle the error rate, then switches to ARQ when the error rate gets too high; adaptive modulation and coding uses a variety of FEC rates, adding more error-correction bits per packet when there are higher error rates in the channel, or taking them out when they are not needed.

Types of FEC

The two main categories of FEC codes are block codes and convolutional codes.

Block codes work on fixed-size blocks (packets) of bits or symbols of predetermined size. Practical block codes can generally be decoded in polynomial time to their block length.

Convolutional codes work on bit or symbol streams of arbitrary length. They are most often decoded with the Viterbi algorithm, though other algorithms are sometimes used. Viterbi decoding allows asymptotically optimal decoding efficiency with increasing constraint length of the convolutional code, but at the expense of exponentially increasing complexity. A convolutional code can be turned into a block code, if desired, by "tail-biting".
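The encoding side of a convolutional code is very simple even though decoding is not. This sketch implements the classic rate-1/2, constraint-length-3 encoder with generator polynomials 7 and 5 (octal), a common textbook example; the function name and interface are our own:

```python
def conv_encode(bits, g1=0b111, g2=0b101, k=3):
    """Rate-1/2 convolutional encoder, constraint length 3. Each input
    bit produces two output bits, each a parity over the current bit
    and the previous k-1 bits, selected by the generator polynomials."""
    state = 0
    out = []
    for b in bits + [0] * (k - 1):  # flush with zeros to end in state 0
        state = ((state << 1) | b) & ((1 << k) - 1)
        out.append(bin(state & g1).count("1") % 2)
        out.append(bin(state & g2).count("1") % 2)
    return out

# Every input bit (plus k-1 flushing zeros) yields two output bits.
assert len(conv_encode([1, 0, 1])) == 2 * (3 + 2)
assert conv_encode([0, 0]) == [0] * 8
```

Because each output bit depends on a sliding window of inputs, errors in the stream leave correlated traces that a Viterbi decoder can exploit.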

There are many types of block codes, but among the classical ones the most notable is Reed-Solomon coding because of its widespread use on the Compact disc, the DVD, and in hard disk drives. Other examples of classical block codes include Golay, BCH, Multidimensional parity, and Hamming codes.

Hamming ECC is commonly used to correct NAND flash memory errors. This provides single-bit error correction and 2-bit error detection. Hamming codes are only suitable for more reliable single level cell (SLC) NAND. Denser multi level cell (MLC) NAND requires stronger multi-bit correcting ECC such as BCH or Reed–Solomon.
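The Hamming (7,4) code mentioned in the introduction shows how single-bit correction works: three parity bits are placed at positions 1, 2 and 4 of the codeword, and on receipt the recomputed parity checks spell out the position of any single flipped bit. A minimal sketch (function names are ours):

```python
def hamming74_encode(d):
    """Hamming (7,4): 4 data bits -> 7-bit codeword with parity bits at
    positions 1, 2 and 4 (1-indexed)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Recompute the three parity checks; read together, they form the
    syndrome, which is the 1-indexed position of a single-bit error
    (0 means no error). Returns the corrected data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]  # checks positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]  # checks positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]  # checks positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1        # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]

codeword = hamming74_encode([1, 0, 1, 1])
corrupted = list(codeword)
corrupted[4] ^= 1                   # flip any single bit
assert hamming74_correct(corrupted) == [1, 0, 1, 1]
```

Any one of the seven bits can be flipped and still be located and repaired, which is exactly the single-bit correction capability described above.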

Classical block codes are usually implemented using hard-decision algorithms, which means that for every input and output signal a hard decision is made whether it corresponds to a one or a zero bit. In contrast, soft-decision algorithms like the Viterbi decoder process (discretized) analog signals, which allows for much higher error-correction performance than hard-decision decoding.

Nearly all classical block codes apply the algebraic properties of finite fields.

On higher layers, the FEC solutions for mobile broadcast standards are Raptor codes and RaptorQ.

Concatenated FEC codes for improved performance

Classical (algebraic) block codes and convolutional codes are frequently combined in concatenated coding schemes in which a short constraint-length Viterbi-decoded convolutional code does most of the work and a block code (usually Reed-Solomon) with larger symbol size and block length "mops up" any errors made by the convolutional decoder. Single pass decoding with this family of error correction codes can yield very low error rates, but for long range transmission conditions (like deep space) iterated decoding is recommended.

Concatenated codes have been standard practice in satellite and deep space communications since Voyager 2 first used the technique in its 1986 encounter with Uranus. The Galileo craft used iterative concatenated codes to compensate for the very high error rate conditions caused by having a failed antenna.

Low-density parity-check (LDPC)

Low-density parity-check (LDPC) codes are a class of recently re-discovered highly efficient linear block codes. They can provide performance very close to the channel capacity (the theoretical maximum) using an iterated soft-decision decoding approach, at linear time complexity in terms of their block length. Practical implementations can draw heavily from the use of parallelism.
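The defining object of an LDPC code is its sparse parity-check matrix H: a vector is a valid codeword exactly when every parity check (row of H) sums to zero modulo 2. The toy matrix below is far too small to be a real low-density code; it only illustrates the membership test that iterative LDPC decoders repeatedly evaluate:

```python
# Toy parity-check matrix (a real LDPC matrix is large and sparse).
H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
]

def is_codeword(c, H=H):
    """A vector is a codeword iff every parity-check row sums to 0 mod 2."""
    return all(sum(h * b for h, b in zip(row, c)) % 2 == 0 for row in H)

assert is_codeword([1, 1, 0, 0, 1, 1])       # all three checks satisfied
assert not is_codeword([1, 1, 0, 0, 1, 0])   # last check fails
```

Belief-propagation decoding passes probabilistic messages along the edges of this check structure until all rows are satisfied, which is what makes sparsity (few ones per row) essential in practice.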

LDPC codes were first introduced by Robert G. Gallager in his PhD thesis in 1960, but due to the computational effort in implementing encoder and decoder and the introduction of Reed–Solomon codes, they were mostly ignored until recently.

LDPC codes are now used in many recent high-speed communication standards, such as DVB-S2 (Digital video broadcasting), WiMAX (IEEE 802.16e standard for microwave communications), High-Speed Wireless LAN (IEEE 802.11n), 10GBase-T Ethernet (802.3an) and G.hn/G.9960 (ITU-T standard for networking over power lines, phone lines and coaxial cable). Other LDPC codes are standardized for wireless communication standards within 3GPP MBMS (see fountain codes).

Turbo codes

Turbo coding is an iterated soft-decoding scheme that combines two or more relatively simple convolutional codes and an interleaver to produce a block code that can perform to within a fraction of a decibel of the Shannon limit. Predating LDPC codes in terms of practical application, they now provide similar performance.

One of the earliest commercial applications of turbo coding was the CDMA2000 1x (TIA IS-2000) digital cellular technology developed by Qualcomm and sold by Verizon Wireless, Sprint, and other carriers. It is also used for the evolution of CDMA2000 1x specifically for Internet access, 1xEV-DO (TIA IS-856). Like 1x, EV-DO was developed by Qualcomm, and is sold by Verizon Wireless, Sprint, and other carriers (Verizon's marketing name for 1xEV-DO is Broadband Access; Sprint's consumer and business marketing names for 1xEV-DO are Power Vision and Mobile Broadband, respectively).

Local decoding and testing of codes

Sometimes it is only necessary to decode single bits of the message, or to check whether a given signal is a codeword, and do so without looking at the entire signal. This can make sense in a streaming setting, where codewords are too large to be classically decoded fast enough and where only a few bits of the message are of interest for now. Also such codes have become an important tool in computational complexity theory, e.g., for the design of probabilistically checkable proofs.

Locally decodable codes are error-correcting codes for which single bits of the message can be probabilistically recovered by only looking at a small (say constant) number of positions of a codeword, even after the codeword has been corrupted at some constant fraction of positions. Locally testable codes are error-correcting codes for which it can be checked probabilistically whether a signal is close to a codeword by only looking at a small number of positions of the signal.
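The Walsh–Hadamard code (listed below) is the standard first example of a locally decodable code: every codeword position holds the inner product of the message with one index vector, and message bit i can be recovered from just two codeword positions, y and y XOR e_i. This is a sketch under that textbook construction (function names are ours); with a constant fraction of corrupted positions, repeating the two-query probe and taking a majority vote recovers the bit with high probability:

```python
import random

def hadamard_encode(m):
    """Walsh-Hadamard code: one codeword bit <m, y> mod 2 for every
    y in {0,1}^k, so the codeword has length 2**k."""
    k = len(m)
    return [sum(m[j] * ((y >> j) & 1) for j in range(k)) % 2
            for y in range(1 << k)]

def local_decode_bit(code, k, i, trials=25):
    """Two-query local decoding of message bit i: for random y,
    code[y] XOR code[y ^ (1 << i)] equals m[i] whenever neither
    queried position is corrupted; a majority vote smooths out a
    small fraction of corruptions."""
    votes = sum(code[y] ^ code[y ^ (1 << i)]
                for y in (random.randrange(1 << k) for _ in range(trials)))
    return 1 if votes * 2 > trials else 0

msg = [1, 0, 1]
code = hadamard_encode(msg)
assert all(local_decode_bit(code, 3, i) == msg[i] for i in range(3))
```

Note the trade-off typical of locally decodable codes: only two queries per bit, but the codeword is exponentially longer than the message.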

List of error-correcting codes

AN codes

BCH code

Constant-weight code

Convolutional code

Group codes

Golay codes, of which the Binary Golay code is of practical interest

Goppa code, used in the McEliece cryptosystem

Hadamard code

Hagelbarger code

Hamming code

Latin square based code for non-white noise (prevalent for example in broadband over powerlines)

Lexicographic code

Long code

Low-density parity-check code, also known as Gallager code, as the archetype for sparse graph codes

LT code, which is a near-optimal rateless erasure correcting code (Fountain code)

m of n codes

Online code, a near-optimal rateless erasure correcting code

Raptor code, a near-optimal rateless erasure correcting code

Reed–Solomon error correction

Reed–Muller code

Repeat-accumulate code

Repetition codes, such as Triple modular redundancy

Tornado code, a near-optimal erasure correcting code, and the precursor to Fountain codes

Turbo code

Walsh–Hadamard code