Most people in the U.S. have known about high definition (HD) for a while now. At the very least, they know that TV broadcasts are switching from an analog to a digital signal. The funny thing is, I’ve run into a lot of people who don’t know what that means. I don’t want to get too far off topic, but analog is what I call the “actual” signal; it’s the way our ears naturally hear sound. Remember back in high school physics or science class when the teacher was talking about waves, or maybe in math class when you learned to graph waves on your graphing calculator? That is analog, and it applies to both audio and video signals: you receive the signal as a continuous wave.
A digital signal, on the other hand, is made up of 1s and 0s, which means there is a computer involved: the computer sends the picture as a stream of 1s and 0s. The upside to a digital signal is that it is precise, consistent, versatile, and does not degrade over time. The downside is that you can never get better quality than the quality at which the digital signal was originally recorded or created. That’s because a digital signal can approximate a wave, but that wave is made up of discrete points, or samples. It’s like when you zoom into a digital photo or try to blow it up, and it gets all pixelated. This built-in ceiling is why a true analog signal will always beat a digital one in terms of pure quality.
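To make that points-versus-wave idea concrete, here’s a rough Python sketch of my own (purely an illustration, not how any real broadcast or disc format works) that “records” a smooth analog-style sine wave by measuring it at a fixed sample rate and rounding each measurement to a limited number of levels. Once the wave is stored as those discrete numbers, nothing finer than the chosen sample rate and bit depth can ever be recovered, which is exactly the quality ceiling described above.

```python
import math

def sample_and_quantize(freq_hz=440.0, sample_rate=8000, bits=8, duration_s=0.01):
    """Simulate 'digitizing' a smooth analog sine wave.

    The continuous wave sin(2*pi*f*t) is measured only at discrete
    instants (sample_rate times per second), and each measurement is
    rounded to one of 2**bits levels -- the 1s and 0s a computer stores.
    """
    levels = 2 ** bits
    num_samples = int(sample_rate * duration_s)
    samples = []
    for n in range(num_samples):
        t = n / sample_rate                                  # discrete instant in time
        analog_value = math.sin(2 * math.pi * freq_hz * t)   # the "real" wave, between -1 and 1
        # Quantization: snap the exact value to the nearest representable level.
        level = round((analog_value + 1) / 2 * (levels - 1))
        digital_value = level / (levels - 1) * 2 - 1
        samples.append(digital_value)
    return samples

# Everything between two samples, and between two quantization levels,
# is thrown away at recording time -- zooming in later only reveals the
# "dots," the same way a blown-up digital photo turns pixelated.
digital = sample_and_quantize()
print(f"{len(digital)} samples, first five: {[round(v, 3) for v in digital[:5]]}")
```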
HD standards were out well before any form of physical HD media existed. A couple of years ago, there was a huge format war for high definition media. Who would win? Would it be HD-DVD or the new and unusual Blu-ray? Consumers were wise and let the big corporations fight it out, and when the smoke cleared, Blu-ray was on top.