
12.1 An ugly example

Before systematically introducing linear codes, we first give a very explicit and awkward example to illustrate several points. One point is the senselessness of exact computation rather than robust approximation. Another is the computational awkwardness of non-linear codes.

Let the codewords be 0001, 0110, and 1100, emitted with equal probabilities. The Hamming distance between two binary words (of the same length) is defined to be the number of positions at which they differ. Here, the first word is Hamming distance 3 from the other two, which are Hamming distance 2 from each other. Suppose that a binary symmetric channel has bit error probability p = 1/10.
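As a quick check of the three distances just quoted, here is a minimal Python sketch (the helper name `hamming` is ours, not from the text):

```python
def hamming(a, b):
    # Hamming distance: the number of positions at which two
    # equal-length binary words differ
    return sum(x != y for x, y in zip(a, b))

print(hamming("0001", "0110"))  # 3
print(hamming("0001", "1100"))  # 3
print(hamming("0110", "1100"))  # 2
```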

Using this code over that channel (or really its fourth extension, so that we send 4 bits at a time), what is the probability of an uncorrectible error? We are using minimum-distance decoding, so the question means: what is the probability that a codeword will get mangled into a 4-bit word that is closer (in Hamming distance) to some other codeword than to the original codeword? We'll first compute this in the most obvious but labor-intensive approach. The naive aspect is that we'll try to get an exact answer, but this exactness will not really be relevant to anything, so is a bit silly. And the more trouble it takes to preserve this needless exactness, the sillier it becomes.

So we'll do a second computation in which we only get an estimate rather than striving for an expensive and pointless precision. Let's make a table of all possible 4-bit words and their Hamming distances from the 3 codewords. Each 4-bit word would be decoded/corrected as the closest codeword to it. The minimum distance in each row is marked with an asterisk.

word    0001   0110   1100
0000     1*     2      2
0001     0*     3      3
0010     2      1*     3
0011     1*     2      4
0100     2      1*     1*    ambiguous decoding
0101     1*     2      2
0110     3      0*     2
0111     2      1*     3
1000     2      3      1*
1001     1*     4      2
1010     3      2*     2*    ambiguous decoding
1011     2*     3      3
1100     3      2      0*
1101     2      3      1*
1110     4      1*     1*    ambiguous decoding
1111     3      2*     2*    ambiguous decoding

There are exactly 4 cases where there would be ambiguous decoding, that is, where the minimum distance of the received word to a codeword is achieved for two different codewords. These received words cannot be corrected (with certainty) in any case.
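The four ambiguously decoded words can be found mechanically; a short Python sketch (the names are ours):

```python
from itertools import product

codewords = ["0001", "0110", "1100"]

def hamming(a, b):
    # number of positions at which two equal-length words differ
    return sum(x != y for x, y in zip(a, b))

ambiguous = []
for bits in product("01", repeat=4):
    w = "".join(bits)
    dists = [hamming(c, w) for c in codewords]
    # ambiguous decoding: the minimum distance is achieved
    # by two (or more) different codewords
    if dists.count(min(dists)) > 1:
        ambiguous.append(w)

print(ambiguous)  # ['0100', '1010', '1110', '1111']
```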

A possibly multi-bit error in a 4-bit word is not correctible if either the received word is one of those whose smallest distance to a codeword occurs for two different codewords, or the received word is closer (or equal) to another codeword than to the original codeword. The probability that a codeword gets mangled into a given 4-bit word is completely computable just from knowledge of the number of bit errors that would turn the codeword into the received word, that is, from the Hamming distance between the codeword and the received word. With error probability p, the probability of a specific 0-bit error in a 4-bit word is (1-p)^4, the probability of a specific 1-bit error is (1-p)^3 p, the probability of a specific 2-bit error is (1-p)^2 p^2, of a specific 3-bit error is (1-p) p^3, and of a specific 4-bit error is p^4.

With p = 1/10, these numbers are approximately:

P(no error)             = 0.6561
P(specific 1-bit error) = 0.0729
P(specific 2-bit error) = 0.0081
P(specific 3-bit error) = 0.0009
P(specific 4-bit error) = 0.0001
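These five values are just (1-p)^(4-k) p^k for k = 0, ..., 4 bit errors; as a check:

```python
p = 0.1  # bit error probability of the binary symmetric channel
for k in range(5):
    # probability of one *specific* pattern of k bit errors in 4 bits
    print(k, round((1 - p) ** (4 - k) * p ** k, 4))
```

Note that these are probabilities of one specific error pattern, which is why, as remarked below, no binomial coefficients appear.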

And note that there are no binomial coefficients appearing here, since after all it's not just any error that turns a given codeword into a given received word. For example, to turn codeword 0001 into 0111, there must be bit errors at the two middle bit positions, and no other errors. Now rewrite the table above, writing the probabilities that the 4-bit words will arise as mangled versions of codewords other than the codewords closest to them.

We also include the cases where the received word is closest to two or more codewords. That is, we are tabulating the probabilities of various mistakes in decoding:

word     0001    0110    1100
0000             .0081   .0081
0001             .0009   .0009
0010    .0081            .0009
0011             .0081   .0001
0100    .0081   .0729   .0729   ambiguous decoding
0101             .0081   .0081
0110    .0009            .0081
0111    .0081            .0009
1000    .0081   .0009
1001             .0001   .0081
1010    .0009   .0081   .0081   ambiguous decoding
1011             .0009   .0009
1100    .0009   .0081
1101    .0081   .0009
1110    .0001   .0729   .0729   ambiguous decoding
1111    .0009   .0081   .0081   ambiguous decoding

Thus, under each codeword, the probabilities listed are that the codeword will get mangled into the 4-bit word on the left.

The omitted cases are where the codeword gets slightly mangled, but only into a word that is still closer to the original codeword than to any other codeword. Since the codewords are sent with equal probabilities, the probability of an uncorrectible (or falsely correctible) received word is

(1/3)(sum of first column) + (1/3)(sum of second column) + (1/3)(sum of third column)
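As a sanity check on the column sums, the whole exact computation fits in a few lines of Python. In this sketch (names are ours), sending codeword c counts as a decoding mistake when some other codeword is at least as close to the received word:

```python
from itertools import product

codewords = ["0001", "0110", "1100"]
p = 0.1

def hamming(a, b):
    # number of positions at which two equal-length words differ
    return sum(x != y for x, y in zip(a, b))

def column_sum(c):
    # total probability that codeword c is mangled into some
    # received word at least as close to a different codeword
    s = 0.0
    for bits in product("01", repeat=4):
        w = "".join(bits)
        d = hamming(c, w)
        if any(hamming(c2, w) <= d for c2 in codewords if c2 != c):
            s += (1 - p) ** (4 - d) * p ** d
    return s

sums = [column_sum(c) for c in codewords]
print([round(s, 4) for s in sums])  # [0.0442, 0.1981, 0.1981]
# average over the three equally probable codewords
print(round(sum(sums) / 3, 4))      # 0.1468
```

So the exact probability of an uncorrectible (or falsely correctible) received word comes out to about 0.1468.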