OK, I’m still being somewhat hyperbolic. I just think that “everything digital” is just as stupid as “everything analogue.” All of these technologies have their place. I am unconvinced that digital is always better than analogue, nor do I feel that digital technology is as tangible a concept to intuit as analogue technology. I posit, though I have not yet really tried to prove, that since the human brain approaches the world in mostly analogue terms, an analogue means of interaction makes more sense. This holds regardless of the fact that, at some low level, reality is quantized. Even so, we still have a wave function that makes sense, and interpreting it falls back on generally accepted macroscopic principles that are easily grasped without the use of math. Digital, beyond a few bits, becomes exceedingly complex to explain in plain language.
Bah, this argument is still really ill-formed and not well thought out. Maybe I shouldn’t post when I’m sleepy.
Reading the comments in your past entry, I tend to agree with tober (rare, but okay)…
Digital bitstreams give us far more flexibility and ease of use in transmitting metadata (always digital) along with the content itself. This is both positive and negative, as we’ve seen: in-band information about the content (good) and the misuse of copy protection to defeat fair use (bad).
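To make “in-band metadata” concrete, here’s a minimal Python sketch. The frame layout, field names, and the copy_allowed flag are all made up for illustration, not any real broadcast format; the point is just that a digital frame can carry descriptive data (good) and restriction flags (bad) right alongside the content.

import struct

def pack_frame(payload: bytes, title: str, copy_allowed: bool) -> bytes:
    """Pack content plus in-band metadata into one frame.

    Hypothetical layout:
      2 bytes  metadata length
      N bytes  metadata ("key=value" pairs, UTF-8)
      4 bytes  payload length
      M bytes  payload
    """
    meta = f"title={title};copy_allowed={int(copy_allowed)}".encode("utf-8")
    return (struct.pack(">H", len(meta)) + meta +
            struct.pack(">I", len(payload)) + payload)

def unpack_frame(frame: bytes):
    """Split a frame back into (metadata dict, payload bytes)."""
    meta_len = struct.unpack(">H", frame[:2])[0]
    meta = frame[2:2 + meta_len].decode("utf-8")
    offset = 2 + meta_len
    payload_len = struct.unpack(">I", frame[offset:offset + 4])[0]
    payload = frame[offset + 4:offset + 4 + payload_len]
    fields = dict(pair.split("=", 1) for pair in meta.split(";"))
    return fields, payload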
Digital encoding schemes also allow varied bit rates, which can save bandwidth in transmission. This, too, is both good and bad. It’s good in that more content can fit in the same space, but it’s bad in that service providers tend to try to squeeze every last bit out of their infrastructure. Witness DirecTV overloading satellite transponders with 2 HD channels sometime… annoying, to say the least.
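A back-of-the-envelope sketch of that trade-off, in Python; the transponder capacity and per-channel rates below are assumptions picked for illustration, not DirecTV’s actual figures:

# Rough estimate: how many channels fit on one transponder?
TRANSPONDER_MBPS = 30.0          # assumed usable capacity of one transponder

def channels_that_fit(per_channel_mbps: float) -> int:
    """How many streams at a given bit rate share the assumed capacity."""
    return int(TRANSPONDER_MBPS // per_channel_mbps)

for rate in (15.0, 10.0, 6.0):   # generous HD, squeezed HD, heavily compressed
    print(f"{rate:>5.1f} Mbps per channel -> {channels_that_fit(rate)} channels")

The lower the per-channel rate, the more channels fit, which is exactly the incentive that tempts providers to over-compress.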
All in all, I tend to prefer digital transmission schemes as long as the fidelity is there, simply because of how easily the content can be supplemented with digital data.
PS – the font in this text box sucks rocks.
OK, but: analog signalling can also trade fidelity for bandwidth, and supplementing analog signals with digital metadata in sidebands is relatively straightforward. Besides, once you get into the GHz range, everything has to be treated as an analog signal anyway, so you still run into the same problems in circuit design and physical implementation. The abstraction from a user perspective can be digital if that’s what you prefer.
I say, let each technology be used for what it’s best at doing.