A thought that crossed my mind after a couple of recent concert visits - it might err on the tech-philosophical side, but maybe it resonates with someone, and I’d love to hear your take:
I like music as much as the next guy and have come to dabble a bit in slightly above-average headphone hardware. Nothing exceptional, but a pair of cans that lets me distinguish music I’ve learned to call “well mastered” from music that seems less so. A wonderfully disillusioning experience.
I also like going to concerts - some classical, some jazz, some vocal, some singer/songwriter, some hip hop, “all of the above”.
Comparing both experiences (concert and recording), I have come to question the idea of “high fidelity”. The degree to which I can separate one set of instruments from another on good hardware is mind-blowing. Would that ever happen at a real-life concert? Extremely rarely, if at all. The acoustics of most concert halls and other venues are such a compromise between a large number of opposing goals that the actual listening experience is mediocre at best. Sure, you experience other facets of the performance, but purely for listening it’s far from optimal.
So far so… predictable. However, tech keeps pushing toward ever more… advanced ways of listening, and what’s sold as ever higher “fidelity” increasingly strikes me as merely “highly impressive” - more impressive than real life. I’ve come to wonder whether modern audio hardware is simply going the way of high-fructose corn syrup for the ears: it goes far beyond being true to reality. This isn’t only about over-accentuated bass (Bose), but also about the ever more “analytical” professional gear: there’s simply much more detail there than one would be likely to experience in real life. And therein lies a certain artificial touch that everything takes on. It reminds me of the immortal Futurama line: “But this is HDTV. It’s got better resolution than the real world.” In a sense, “high definition” really is a better term than “high fidelity”.
Btw - kudos to everyone who is now thinking “congratulations, you have discovered the difference between a concert experience and studio work - well done”. Maybe you are right.
Since these are the years in which our tech gradually exceeds the bandwidth our senses can take in (think 8K televisions), I wonder whether the old idea of “high fidelity” might not also mean capturing slight imperfections. Like the age-old debate between film and digital in photography and cinematography. Are we bound for ever more artificial content? Come to think of it: certainly we are (CGI, audio enhancements, Photoshop, etc.). Is that a good thing? Does asking this question even matter? In essence: