For decades the Hi-Fi industry has taught the general public that distortion is bad. Following a simple line of thought, the industry is right: distortion is undesirable by definition!
But how much distortion is acceptable in a Hi-Fi amplifier? The question might seem silly, and the industry will tell you that no distortion is acceptable. Indeed, if you read the specifications of many modern amplifiers, the declared distortion is ridiculously low, and the string of zeros after the decimal separator keeps getting longer and longer.
Nowadays, good numbers sell products. It's like sports cars: a higher top speed and a lower 0-60 mph time make you believe the car is better, but those numbers won't tell you what lap time it can set at the Nürburgring. It's the same with amplifiers: the published figures won't really tell you whether an amplifier will perform or not. Too often they are bait to lure the poor customer into parting with his or her hard-earned money and buying an expensive new piece of kit, in the hope of reaching audio Valhalla.
Don't get me wrong: a serious set of lab tests is the starting point for judging whether a device is properly engineered or not. However, the industry almost never supplies the customer with a comprehensive test report; too often a handful of meaningless numbers is given to keep the potential customer content. What does it mean that the THD (Total Harmonic Distortion) at full power @ 1 kHz is 0.000001%? What about the other frequencies and power levels? Can I have the linear distortion graphs (stability and phase) @ -30 dB, please? Silence.
The industry mostly builds products that look good on paper rather than products that perform. Fancy solutions are deployed to keep the headline figures low, not to make the amplifier sound good; too often distortion values are pushed down with very aggressive negative feedback. I'm not against negative feedback, but sometimes it is used only to keep certain figures low, to the detriment of sonic performance.
But I'm not here just to vent. My key questions are: is distortion significant? If yes, how much is too much? And should we care about the ultra-low-distortion frenzy?
In my professional life I have listened to amplifiers with a declared THD above 1% that sounded as good as, or even better than, the latest 0.0000001% THD and IMD wonder amp. Why? Because, in a real-life listening environment, 1% of distortion is not that much, and the difference between 1% THD and 0.000001% is inaudible. Beyond that, a broad array of factors makes an amplifier sound better or worse, but let's concentrate on THD.
The evidence I want to bring you is based on THD measurements made in a listening environment. The measurement, in this particular case, was taken at the listening position in a semi-treated room: the room had no special modifications to optimise it for listening, only a few tweaks put in place to improve the listening experience.
This graph shows the distortion, THD and individual harmonics, as a percentage of the fundamental, from 20 Hz to 20 kHz in a real-life listening environment. The greyed-out lines are the distortion components that fell below the noise floor and are therefore inaudible. The noise floor, in this case, was very low: the measurement was taken in a detached house in a quiet Scottish suburb with almost no road traffic. The system under analysis was composed of a signal generator, The Vinyl Source LightStream Mk II LDR passive attenuator, a custom-designed (The Vinyl Source) 300B SET power amp, and custom-made (The Vinyl Source) clones of the Tannoy Westminster speakers.
The distortion values include all the elements that contribute to the listening experience, namely the audio system and the room. The graph is referenced to the fundamental.
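For readers who want to connect the plotted percentages to the underlying harmonics: THD is simply the RMS sum of the harmonic amplitudes expressed as a fraction of the fundamental. Here is a minimal sketch in Python; the harmonic levels are invented illustration values, not the measured data from the graph.

```python
import math

def thd_percent(fundamental, harmonics):
    """THD as a percentage of the fundamental:
    RMS sum of the harmonic amplitudes divided by the fundamental amplitude."""
    return 100.0 * math.sqrt(sum(h * h for h in harmonics)) / fundamental

# Illustrative amplitudes (arbitrary linear units), NOT the measured data:
# a 1 kHz fundamental plus its 2nd to 5th harmonics.
fundamental = 1.0
harmonics = [0.008, 0.004, 0.002, 0.001]

print(f"THD = {thd_percent(fundamental, harmonics):.3f} %")  # about 0.92 %
```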
Here is the same measurement expressed in dB relative to the fundamental instead of as a percentage of the signal. Small tweaks to the room can change the distortion values:
Of course, this is only one of the many aspects we take into consideration when fine-tuning a listening environment; it is shown here to prove a point. Distortion must be considered as a whole, not as a single value at a single frequency and a single output level of a single component. Total distortion values between 0.5 and 2% are to be considered excellent, and the elements that distort the most are the speakers and the listening room.
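To put the percentage figures and the dB scale of the previous graph on the same footing: a level given as a percentage of the fundamental converts to dB (and back) with a simple logarithm, assuming amplitude ratios. A quick sketch, again purely for illustration:

```python
import math

def percent_to_db(percent):
    """Convert a distortion level given as % of the fundamental to dB below it."""
    return 20.0 * math.log10(percent / 100.0)

def db_to_percent(db):
    """Convert a level in dB relative to the fundamental back to a percentage."""
    return 100.0 * 10.0 ** (db / 20.0)

print(percent_to_db(1.0))     # -40 dB
print(percent_to_db(0.001))   # -100 dB
print(db_to_percent(-60.0))   # 0.1 %
```

So the 0.5-2% range quoted above sits roughly between -46 dB and -34 dB relative to the fundamental.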
For comparison, we add the measured THD of a typical, uncorrected listening environment: a modern solid-state amplifier (declared THD of 0.007% @ 1 kHz, full power) driving B&W M1 home theatre speakers.
The graph is self-explanatory: THD exceeds 3-4% at several points, with peaks of 40%.
These measurements demonstrate what everyone who fine-tunes listening environments knows: from the point of view of the total harmonic distortion of the whole audio system, an amplifier THD below 1% has almost no influence. What makes an amp sound better is NOT a ludicrously low THD but other parameters that the manufacturer almost always omits to show or measure, such as phase stability, linear stability, oscillation, transient response and many more.
A suggestion for all Hi-Fi aficionados: before thinking about dumping the next few thousand pounds on yet another new amplifier, cartridge, esoteric cable and so on, start thinking about optimising your system, and above all your listening room. It will cost a fraction of the price and give you superior results.