ULTRA AUDIO -- Archived Article
 

August 1, 2003

The Benefits of Break-In

Break-in time, or run-in time, is the playing time required before a new component reaches its full sonic potential -- or so they tell us. There are cynics out there who say that the seemingly ever-increasing number of hours of break-in time suggested by many manufacturers represents a customer-acceptance ploy not unlike the annoying and increasingly common use of nearly unopenable packaging. If you have to destroy the packaging to gain access to a product, so the reasoning goes, you are less likely to try to take it back. Similarly, if you listen to anything for 100 hours, you’ll be so used to it by the end that it sounds right to you.

Though I certainly would take issue with a manufacturer or retailer whose recommended break-in time exceeds the allowable return period, I have to say that my own experience is that break-in is quite real. The naysayers who deny the existence of the phenomenon outright must be descendants of the same people, now contentedly deceased, who denied that speaker and interconnect cables make a difference to the sound.

With a little experience, few will dispute that speakers, for example, tend to sound harsh and flat right out of the box. With time, they quite literally loosen up mechanically: the elastic properties of the driver suspensions settle to their intended specifications, and the speaker sounds fuller, richer, more coherent, and more open. Generally, the bass fills out as well, all of which adds up to a more satisfying sound. Interestingly, a similar effect occurs with amplifiers, CD players, and even cables, including power cords. As with speakers, the effects of break-in can be quite substantial, though the origins of break-in improvement in non-mechanical components are less obvious.

To start with, electrolytic capacitors need to be "formed." They consist of a metal film (usually aluminum) in an electrolyte solution. When a voltage of the correct polarity is applied, an extremely thin film of insulating material (aluminum oxide) is deposited on the metal by electrolysis. Since the film is the dielectric that separates the plates of the capacitor, the capacitor’s full potential is not reached until a uniform film of the optimal thickness is deposited. Not only that, but the way the forming process occurs is important. Apply too high a voltage at first, and the density of the film is not optimal, which may adversely affect the sound. Apply too little, and the film may never reach sufficient thickness.
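
For the curious, the standard parallel-plate approximation -- textbook physics, not anything the capacitor makers publish about their forming schedules -- shows why the film’s thickness matters so much:

C = εA / d

where C is the capacitance, ε the permittivity of the oxide, A the plate area, and d the thickness of the film. Capacitance varies inversely with d: a film left too thin yields a capacitance above the rated value (and a dielectric more prone to breakdown), while a film grown too thick yields one below it.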

Electrical currents passing through insulated wires create magnetic fields that interact with the material in the insulation, creating an effect that some have described as "charging the dielectric." With use, the dielectric becomes fully charged and the cable sounds its best, but with disuse the charge dissipates and you have to start over again. I confess that I don’t see how that explanation makes much sense, but there you are: cables do sound better after a break-in period. In line with this logic, some manufacturers are using powered dielectrics that are actively charged by a separate power supply.

Even more interesting is the possibility that the conductive properties of the metal in the wire itself may be affected by break-in. The arrangement of electrons in the crystal matrix of a metal is theoretically fluid to some extent and may be altered, particularly in the regions around discontinuities or "grains" in the metal. As a result of these effects, all cables, even ones with a symmetrical grounding arrangement, are said to have at least some degree of directionality. It’s true. Turn a cable around the wrong way and the sound is subtly, but noticeably, poorer.

What really gets my juices flowing, however, is the likelihood that, as when forming a capacitor, there may be a right way and a wrong way to break in each component. I’ll even go so far as to say that most of us have not properly broken in certain of our components and are never likely to with normal use. Fortunately, for cables at least, there is a solution.

They call them cable burners, or cable cookers. The one I used is the CBID1 from Nordost, a small beige plastic box that plugs into the wall and offers a variety of input and output jacks designed to accommodate both interconnects and speaker cables. You attach the cable to be cooked, turn it on, and several hours to several days later, your cable is officially broken in. A number of other manufacturers, notably Audio Dharma, sell similar devices. At $1199, the Nordost cooker I used is not cheap. It is owned by my local audio dealer and used for the express purpose of accelerating the burn-in of new cables for preferred customers.

What I have discovered is that the Nordost cable cooker is extremely beneficial even when used to cook cables that already have months or years of playing time. When I first heard the difference between a pair of bottom-of-the-line Ecosse interconnects cooked versus uncooked (raw?), I was blown away. The cooked cable elevated the performance to an entirely new league; in comparison, the uncooked version sounded flat and lifeless.

Intrigued, I decided to cook my well-used, and definitely not bottom-of-the-line, MIT 350s. The difference was stunning. Again, a new level of openness, transparency, vividness, and tonal accuracy was achieved. To make sure I wasn’t hearing things, I compared the cooked cables to my friend’s equally well-used but uncooked 350s. No comparison: the uncooked version again sounded lifeless and flat. My only quibble would be that the cooked version had a bit of a "cupped" resonance in the upper midrange, which disappeared after a few days. A similar, though not quite so dramatic, benefit was to be heard after burning in my speaker cables. So far, after several months, the effect doesn’t seem to have worn off.

So impressed was I with the cost-to-benefit ratio of this procedure that I took the trouble of examining the output of the Nordost box on an oscilloscope. Both the interconnect signal and the speaker-cable signal were low voltage and moderate current. As far as I could interpret, the interconnect burn-in signal was a sawtooth wave of oscillating frequency superimposed on a square wave, while the speaker-cable signal was a square wave of oscillating frequency.
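
For readers who want a concrete picture of what such signals might look like, here is a minimal sketch in Python (using NumPy and SciPy) that synthesizes waveforms of the same general shape. Every frequency, sweep rate, and amplitude in it is an assumption of mine for illustration; Nordost publishes no such specification.

    # A rough approximation of the waveforms described above. All parameters
    # are invented for illustration -- they are not Nordost's actual values.
    import numpy as np
    from scipy import signal

    fs = 48000                     # sample rate, Hz
    t = np.arange(0, 2.0, 1 / fs)  # two seconds of signal

    # An instantaneous frequency that itself oscillates: 1 kHz +/- 500 Hz,
    # wobbling twice per second (all assumed values). The phase is the
    # integral of that instantaneous frequency.
    f_c, f_dev, f_mod = 1000.0, 500.0, 2.0
    phase = 2 * np.pi * (f_c * t
                         - (f_dev / (2 * np.pi * f_mod)) * np.cos(2 * np.pi * f_mod * t))

    # Interconnect-style signal: a frequency-swept sawtooth riding on a
    # fixed-frequency square wave (60Hz assumed).
    interconnect = 0.5 * signal.sawtooth(phase) + 0.5 * signal.square(2 * np.pi * 60.0 * t)

    # Speaker-cable-style signal: a square wave whose frequency oscillates.
    speaker = signal.square(phase)

Presumably, the point of oscillating the frequency is to exercise the cable across a band rather than at a single tone, though that is my speculation.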

How Nordost decided on these particular signals for their cable cooker, I do not know. What I do know is that it works well enough to make this procedure an absolute must for the serious audiophile (half-assed audiophiles are, of course, free to do what they like). The take-home message is "burn before you upgrade." I have no idea what kind of damage these signals might cause if fed into a preamplifier or amplifier, but if they’re safe, they should certainly be tried there, too.

Only one question remains: If a wholesale improvement in cable performance can be had with a few hours on a cable cooker, wouldn’t doing this at the factory make even more sense? A cable manufacturer could upgrade its entire lineup in one fell swoop and then charge a lot more. OK, maybe that isn’t such a great idea.

...Ross Mantle
rossm@ultraaudio.com

PART OF THE SOUNDSTAGE NETWORK -- www.soundstagenetwork.com
All contents copyright Schneider Publishing Inc., all rights reserved.
Any reproduction, without permission, is prohibited.
