I am trained in information theory, a science that emphasizes squeezing every last bit of efficiency out of communication channels to get a voice or video signal across in its best form. A noble endeavor, no doubt, and one that has spawned a whole analog, and later digital, entertainment industry. But how much does it matter to the end user?
My basic thesis is this: up to a certain quality, people do care about how much error-free information gets across. Beyond that, the human brain's smoothing kicks in - the apparatus in us that skillfully ignores small blemishes in the audio track or on the screen. While it is true that video technology already takes advantage of this "help" from the human brain (that's why finite frame rates and digitized pictures work), I have a feeling that technology sometimes needlessly pushes bits that are not useful, leading to "diminishing returns" on entertainment-technology investment. After all, how many of us can tell the difference between a 192 kbps MP3 file and its uncompressed counterpart, which is roughly seven times bigger?
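That size ratio is easy to check on the back of an envelope. A quick Python sketch, assuming the uncompressed source is standard CD-quality audio (44.1 kHz, 16 bits per sample, stereo):

```python
# Back-of-the-envelope comparison: uncompressed CD audio vs. a 192 kbps MP3.
# Assumes standard CD parameters: 44.1 kHz sample rate, 16-bit samples, 2 channels.

def uncompressed_bitrate_kbps(sample_rate_hz=44_100, bits_per_sample=16, channels=2):
    """Bitrate of raw PCM audio, in kilobits per second."""
    return sample_rate_hz * bits_per_sample * channels / 1000

cd_kbps = uncompressed_bitrate_kbps()   # 1411.2 kbps for CD audio
mp3_kbps = 192
ratio = cd_kbps / mp3_kbps              # ~7.4

print(f"Uncompressed CD audio: {cd_kbps:.1f} kbps")
print(f"A 192 kbps MP3 is about {ratio:.1f}x smaller")
```

So the MP3 throws away roughly six out of every seven bits, and most ears never notice.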
For another example, let's look at HDTV. Unless you are watching from a close distance, the low-pass filter in your eyes will substantially smooth out the sharp images on the HDTV screen. Undoubtedly HDTV looks better than SD TV (but how much better?), yet is the delta enough to drive consumer pull in the mass market?
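That "low-pass filter" can be put in numbers. A rough sketch, assuming the common rule of thumb that 20/20 vision resolves about one arcminute of detail, of the distance beyond which a screen's individual pixels blur together:

```python
import math

# Viewing distance beyond which one pixel subtends less than ~1 arcminute,
# the rough resolution limit of 20/20 vision (a rule-of-thumb assumption).
ONE_ARCMIN_RAD = math.radians(1 / 60)

def max_useful_distance_ft(diagonal_in, vertical_lines, aspect=(16, 9)):
    """Distance (feet) past which extra vertical resolution is invisible."""
    w, h = aspect
    height_in = diagonal_in * h / math.hypot(w, h)   # screen height from diagonal
    pixel_in = height_in / vertical_lines            # height of one pixel
    # Small-angle approximation: distance = size / angle; divide by 12 for feet.
    return pixel_in / ONE_ARCMIN_RAD / 12

hd = max_useful_distance_ft(50, 1080)   # 50-inch 1080p screen, ~6.5 ft
sd = max_useful_distance_ft(50, 480)    # same size at SD resolution, ~14.6 ft

print(f"50-inch 1080p: pixels blur beyond ~{hd:.1f} ft")
print(f"50-inch 480-line: pixels blur beyond ~{sd:.1f} ft")
```

Under these assumptions, sitting much farther than about six and a half feet from a 50-inch set means the extra HD detail falls below your eye's resolution limit - the low-pass filter at work.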
The rapid sales of HDTVs and Blu-ray discs seem to suggest so. But I'd like to know what fraction of the content being watched on these HDTVs is really HD. And when the world does shift to HD, will the lowly SD TV be forgotten, going the way of B&W TV? Probably not, because a vast library of content is stored in SD format. My kids will probably watch my old Friends and M*A*S*H DVDs, or my father's music video collection (stored on VHS!). So their eyes and senses will probably accept fuzzy ol' SD TV as well. Entertainment is about content quality first and technology quality second.