Lecture 4: Hamming Distance

If we are trying to make sure that the correct message is eventually received when there could be errors in its transmission, it would be nice if we could work from some advance estimate of the number of errors that will occur, rather like being able to estimate just how mischievous your little brother is. Since we are transmitting only 0s and 1s, the only possible errors are changing a 0 into a 1, changing a 1 into a 0, or an erasure, which turns either a 0 or a 1 into a smudge. Now let's return to our example of simply repeating the message of either 0 or 1 a bunch of times, and focus on errors. How many errors changing a 1 to a 0 (or vice versa, for the opposite message) would be necessary to make the majority vote come out differently, so that the message would be received wrong?
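To experiment with this question, here is a small sketch (not part of the lecture) of majority-vote decoding for the "repeat the bit n times" code. The function names `encode`, `flip`, and `majority_decode` are made up for illustration.

```python
def encode(bit, n):
    """Repeat a single bit n times."""
    return [bit] * n

def flip(word, positions):
    """Introduce errors by flipping the bits at the given positions."""
    word = list(word)
    for i in positions:
        word[i] ^= 1
    return word

def majority_decode(word):
    """Decode by majority vote: return the bit that appears most often."""
    return 1 if sum(word) > len(word) / 2 else 0

# Send a 0 repeated 5 times; two flips are still outvoted...
sent = encode(0, 5)
print(majority_decode(flip(sent, [0, 3])))      # prints 0
# ...but three flips fool the vote.
print(majority_decode(flip(sent, [0, 2, 4])))   # prints 1
```

Trying different lengths and numbers of flips here is a good way to form a guess before answering the question above.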

Let's ask a slightly different question. What is the smallest number of errors that must be made to turn the codeword sent into a different codeword, in other words, into another string of bits that actually could be sent using this code? This number of errors is called the Hamming distance of the code. What is the Hamming distance of the code that repeats the message of 0 or 1 three times? What about the code that repeats it n times? What about the parity check code on four bits? The parity check code on n bits?
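One way to check your answers is by brute force: list every codeword and compute the smallest distance between any pair. This sketch (assuming the parity check code appends a bit that makes the total number of 1s even) does exactly that.

```python
from itertools import combinations, product

def hamming_distance(u, v):
    """Number of positions in which the two words differ."""
    return sum(a != b for a, b in zip(u, v))

def minimum_distance(codewords):
    """Smallest Hamming distance between any two distinct codewords."""
    return min(hamming_distance(u, v) for u, v in combinations(codewords, 2))

# Repetition code: repeat a 0 or 1 three times.
rep3 = [(0, 0, 0), (1, 1, 1)]
print(minimum_distance(rep3))     # prints 3

# Parity check code on four bits: append a bit making the number of 1s even.
parity4 = [msg + (sum(msg) % 2,) for msg in product([0, 1], repeat=4)]
print(minimum_distance(parity4))  # prints 2
```

The same two functions work for the n-bit versions of both codes, though the brute-force search grows quickly with n.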

So how is this question, of what the Hamming distance is, different from the first one, of how many bits need to be changed to get a different result in the majority-vote interpretation of the repeating code? If you receive nonsense, is that different from somehow receiving a different sensible codeword? How?

Hints and Solutions:

This work was made possible through a grant from the National Science Foundation.