Max dB with Doug Blackburn
Back Issue Article
August 1999

Inside Phase

The recordings we play to enjoy music at home all contain a musical signal. To hear what the music sounded like originally, you want to reproduce that signal faithfully. What you want arriving at the drivers of your speakers is exactly the same waveform that was recorded, right? How can you ever hope to hear the performance as it was recorded if your equipment does things to the waveform that make it different from the original? You could argue that recordings are not always perfect -- and you would be right. But neither we nor the person who made the recording may know precisely what those errors were -- and if we don't know what the errors were, we can only guess at ways to undo them.

Because of this, it is better, at least to my way of thinking, to build an audio system that is as faithful to the original audio signal as possible and to keep demanding recordings which are as technically accurate as they are musically evocative. How do we know if our audio systems are faithful to the recording? It isn't possible to cover everything in just one article, so in this installment we'll look at what phase is and how it can be changed by your audio system whether you want it changed or not. As you begin the next section, some of you more experienced readers may find the information very basic. Please stick with me, because the second half of this article has stirred up quite a few experts already, and I expect that even if you are an audio expert yourself, you'll find it quite interesting.

The basic definition of phase

If you aren't an audio expert, you might not really understand phase, so let me explain it simply. In the Sine Wave Illustration (right), you see a sine wave in blue. The vertical dimension represents voltage; the red-orange horizontal line represents zero volts. Half of the time the voltage is higher than zero (positive), and half of the time it is lower than zero (negative). Time runs along the horizontal axis. The waveform starts at zero volts at time zero (the red 0) and rises to its maximum in a short period of time; this rise covers a quarter of a wavelength. One full wavelength is one complete positive-and-negative cycle -- one peak and one valley. The illustration shows one peak and one valley plus a little extra, just over one cycle. One full wavelength is said to have 360 degrees (like a circle), half a wavelength is 180 degrees, and a quarter of a wavelength is 90 degrees -- so our first rise, a quarter of a full wavelength, is 90 degrees. The voltage falls back to zero in another 90 degrees (90+90=180), continues dropping to the largest negative voltage in another 90 degrees (180+90=270), and rises back to zero in the final 90 degrees (270+90=360).

Look at the next figure, Phase Shifted Sine Wave. This is the same waveform with a 90-degree phase shift applied. Because it is a symmetrical wave containing only one frequency, all you see is that the waveform has been moved over a bit: a quarter of a wavelength to the left. See how it starts at the highest voltage instead of at zero like the original above. Phase and time are interlinked because the peaks and valleys shift forward or back in time depending on whether the phase shift is positive or negative. This 90-degree phase shift to the left would be identical to a 270-degree phase shift to the right, at least when you are working with a continuous sine wave. On a music signal, things are dynamic and much more complex.

Phase shifts are not limited to 0, 90, 180, 270 and 360 degrees. You could have a 10-degree phase shift or one of 23 degrees or 410 degrees. It all depends on how much things are moved in time.
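
If you like to see the degrees-and-time relationship as numbers, here is a minimal Python/NumPy sketch of a sine wave and a 90-degree-shifted copy. The 440Hz frequency, the 90-degree shift and the 10ms window are illustrative choices of mine, not values taken from the figures.

```python
import numpy as np

freq_hz = 440.0                      # example frequency
phase_deg = 90.0                     # example phase shift in degrees
sample_rate = 44100                  # CD-style sample rate
t = np.arange(441) / sample_rate     # 10ms of time points

original = np.sin(2 * np.pi * freq_hz * t)
shifted = np.sin(2 * np.pi * freq_hz * t + np.radians(phase_deg))

# Phase and time are two views of the same shift: 90 degrees is a quarter
# of a period, i.e. (90/360) * (1/440Hz) seconds.
time_shift_s = (phase_deg / 360.0) / freq_hz
print(f"90 degrees at 440Hz is a time shift of {time_shift_s * 1e6:.0f} microseconds")
```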

Now that the basics are covered, let's move on to the interesting stuff. It wasn't until I started messing with computer programs that can generate waveforms and allow you to mess with phase and amplitude and harmonics and modulation that I realized what a wringer time/phase changes put a complex audio signal through.

Phase haseP asePh sePha ePhas Phase (the word "Phase," shifted in 72-degree steps)

For example, let's say that the Original Music Signal (right) is a representation of a sound as it was recorded and as it exists on an LP or CD (once turned back into an analog audio signal, of course). When the Original Music Signal passes through all your audio components and through the crossover of the loudspeaker (where it is broken up into two, three or four constituent bands and fed to as many drivers or driver groups), you want it to still look like it does above. Any alterations applied to the Original Music Signal by the components or the loudspeakers change it from what it was into something else.

What would you say if the Original Music Signal started out like the figure above but looked like the figure at the (left) when it arrived at the drivers in your loudspeakers?

Do you think Figure 2, with an 80-degree phase shift, would sound the same as the Original Music Signal?

Below are some more altered versions of the Original Music Signal. Each one has a different amount of phase shift applied so that you can see what phase shift does to complex music signals. Phase shifting a sine wave looks pretty innocuous, but add many frequencies together, phase shift that complex signal, and the differences become far more dramatic, as you can see here.

Or this? [Illustration: the same signal with a 270-degree phase shift]

What you see above are exact representations of what happens to an audio signal in the electrical domain when you shift its phase. The signal here was generated by mixing equal amounts of 440Hz with its harmonics: 880Hz, 1320Hz, 1760Hz and 2200Hz. Resolution is limited by computer monitors and the size of the images, but you can very clearly see that the complex electrical waveform is radically changed when it is phase shifted.
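
For readers who want to build that kind of signal themselves, here is a minimal Python/NumPy sketch along the same lines. The 90-degree shift and the 10ms window are my own illustrative choices, not the exact settings behind the figures.

```python
import numpy as np

sample_rate = 44100
t = np.arange(441) / sample_rate                      # a 10ms window
harmonics_hz = [440.0, 880.0, 1320.0, 1760.0, 2200.0]

def mix(phase_deg):
    """Sum equal amounts of every component, each shifted by phase_deg degrees."""
    return sum(np.sin(2 * np.pi * f * t + np.radians(phase_deg))
               for f in harmonics_hz)

original = mix(0.0)
shifted = mix(90.0)        # 90 degrees chosen only as an example

# Sample by sample, the two waveforms have very different shapes.
print("largest sample-to-sample difference:", np.max(np.abs(original - shifted)))
```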

But can you really hear phase shift? Does phase shift affect what you hear? How can it not? When a driver in a loudspeaker sees a positive voltage, it moves outward, toward the listener (if the driver is connected with correct polarity; many loudspeakers have one or more drivers connected with reversed electrical polarity, though). When it sees a negative voltage, it moves inward, away from the listener. These motions create compressions and rarefactions in the air, and those pressure differences are what our ears hear. If the compressions and rarefactions come at different times and in different sequences, there must be something different about the sound. However, the effect is not what you might expect. Your ears still get the same amounts of the fundamental frequency and all the harmonics whether the phase is perfect or shifted enough to make the electrical waveform radically different-looking.
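
That last claim is easy to check numerically. Here is a minimal Python/NumPy sketch using the same kind of harmonic mix as above with a uniform 80-degree shift (my own choice of shift and window): the amount of each frequency component survives the phase shift, even though the waveform itself does not.

```python
import numpy as np

sample_rate = 44100
t = np.arange(4410) / sample_rate                     # a 100ms window
harmonics_hz = [440.0, 880.0, 1320.0, 1760.0, 2200.0]

def mix(phase_deg):
    return sum(np.sin(2 * np.pi * f * t + np.radians(phase_deg))
               for f in harmonics_hz)

original = mix(0.0)
shifted = mix(80.0)                                   # uniform 80-degree shift

# The magnitude spectra are essentially identical even though the
# waveforms themselves look nothing alike.
mag_difference = np.max(np.abs(np.abs(np.fft.rfft(original)) -
                               np.abs(np.fft.rfft(shifted))))
wave_difference = np.max(np.abs(original - shifted))
print("spectrum magnitudes differ by at most:", mag_difference)   # ~0 (round-off)
print("waveform samples differ by up to:", wave_difference)       # large
```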

The audibility of phase-shifted waveforms is very controversial. A minority of manufacturers, mostly of loudspeakers, is saying in one ear, "You really ought to keep time and phase performance correct or you will be missing something." The vast majority of loudspeaker manufacturers are in your other ear telling you that the time and phase performance of loudspeakers is not what is important, that as long as the loudspeakers are "close" to "accepted" limits you will never know that the time/phase performance isn't perfect. Audiophiles are caught in the middle of this debate, often not knowing which side to believe. The vocal majority have "proof" that below certain thresholds, time and phase changes to the Original Music Signal are not audible. Of course, if you try to test the audibility of time/phase using loudspeakers which are not themselves time/phase correct, you won't learn anything.

Audiophiles worry about the distortions caused by jitter in digital audio electronics, timing errors on the order of 1000 picoseconds. The "accepted" limits for the audibility of time/phase errors (phase shifts are, essentially, shifts in time) are in the range of 1ms, or .001 second. That equates to 1,000,000,000 picoseconds. It may be unfair to compare 1ms audio-signal time errors to 1000-picosecond digital-bit jitter, but it is useful for putting things in perspective. It certainly leaves the door open for some human discrimination ability that is not tested when human subjects are used to determine just what the "acceptable limits" for time-domain performance really are.
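
Spelled out, the arithmetic behind that comparison (using only the 1ms and 1000-picosecond figures above) is:

```python
phase_limit_s = 1e-3             # ~1ms "accepted" time/phase audibility limit
jitter_s = 1000e-12              # ~1000 picoseconds of digital jitter
print(f"{phase_limit_s / 1e-12:,.0f} picoseconds in one millisecond")
print(f"the two figures are a factor of {phase_limit_s / jitter_s:,.0f} apart")
```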

In my 25 years of messing around in high-end audio, I do not recall anyone in what I'll call "the high-end press," and possibly not in the "general consumer audio press" either, ever illustrating the changes in the electrical signal when it is subjected to phase shifts. And in all those years, I never really gave that aspect of time/phase performance any real thought either.

Imagine recording the signal on the left and playing back the signal on the right. I'm pretty sure I want a system that does not alter the original signal that much, even if there is nothing obvious in the sound that says "wrong!" Electrically, the signal is very different, so there must be some result.

It could be argued that recordings are rarely perfect with respect to phase. OK, but why move the system closer to total chaos by inserting more phase shift into the Original Music Signal? It makes more sense to me to minimize changes to the Original Music Signal rather than allow damage to accumulate from numerous sources. What motivation do recording engineers have to make polarity- and phase-correct recordings when they know that home equipment is just going to undo the hard work anyway? We have to start getting closer to "perfect playback" somewhere in the chain first, or the chicken and the egg will be chasing each other forever.

Where do phase shifts come from?

The most common potential sources of phase shift are loudspeaker crossovers other than first-order designs, audio output transformers in tube amplifiers, any inductance or inductors in the audio signal path, steep filters like those in the analog output stages of 16-bit/44.1kHz CD players and DACs, interstage transformers in tube or solid-state components, and transformers anywhere else in the audio signal path. Fortunately for us, transformers can be tuned to minimize phase shifts in the main part of the audio frequency spectrum. It is quite difficult to eliminate phase shifts from output transformers in the low bass, but wavelengths there are long and some controlled phase shift is not that big a deal at low frequencies. There's not much you can do about phase shifts from digital audio components, though the new higher sampling rates permit much gentler filter slopes, which lead to virtually no phase shifting. You really want to support the new higher-sampling-rate digital audio formats (24/96, DVD-Audio, and DSD/SACD) if for no other reason than they permit doing away with the nasty brick-wall filters most of us are living with in our current 16-bit/44.1kHz digital front-ends. Speaker cables and interconnects which contain fat "network boxes" are potential sources of phase shift as well. However, the designers of these devices are, we must hope, savvy enough to tune any phase shifts well above or below the audio frequency spectrum.
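
To see why crossover and filter steepness matter, here is a minimal Python/NumPy sketch of my own comparing the phase response of a first-order low-pass with a fourth-order (Linkwitz-Riley-style) low-pass, a common steep crossover type. The 2kHz corner frequency is an arbitrary choice for illustration.

```python
import numpy as np

f = np.linspace(20.0, 20000.0, 5000)        # the audio band, Hz
fc = 2000.0                                 # corner frequency, Hz (illustrative)
s = 1j * (f / fc)                           # normalized complex frequency j*f/fc

h1 = 1.0 / (1.0 + s)                        # first-order low-pass
h2 = 1.0 / (1.0 + np.sqrt(2.0) * s + s**2)  # second-order Butterworth low-pass
h4 = h2 * h2                                # fourth-order Linkwitz-Riley low-pass

phase1 = np.degrees(np.unwrap(np.angle(h1)))   # unwrapped phase in degrees
phase4 = np.degrees(np.unwrap(np.angle(h4)))

i = np.argmin(np.abs(f - fc))                  # look at the corner frequency
print(f"at {f[i]:.0f}Hz: 1st order = {phase1[i]:.0f} degrees, "
      f"4th order = {phase4[i]:.0f} degrees")
# Roughly -45 degrees for the first-order filter versus about -180 degrees
# for the fourth-order filter at its corner frequency.
```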

That leaves loudspeakers, where phase shifts run rampant in the marketplace. Look at test results in magazines that measure the phase response of loudspeakers. It is not unusual to see 90 to 180 degrees of phase shift in loudspeakers -- you will even see phase shifts greater than 360 degrees (one full cycle) if you watch for them. The next figure shows a complex musical signal, in this case a chord made up of five separate frequencies: 220Hz, 275Hz, 327.80Hz, 440Hz and 554.40Hz, each mixed in a different amount (220Hz being the primary). The dark-blue/purple trace is the original signal with no phase shift. The bright-green trace is the same chord with a uniform 80-degree phase shift applied to the entire chord. Since loudspeakers exist with anywhere from 0 to over 360 degrees of phase shift, an 80-degree phase shift is well within what occurs in many commercial or DIY loudspeaker designs. The small number of red areas in this figure indicates the few points in time (horizontal axis) and voltage (vertical axis) where the phase-shifted signal and the original signal "match."
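
Here is a minimal Python/NumPy sketch of that comparison. The exact mix levels are not given here, so the relative amplitudes below are my own guesses with 220Hz as the primary; the uniform 80-degree shift matches the description above.

```python
import numpy as np

sample_rate = 44100
t = np.arange(2205) / sample_rate                     # a 50ms window
chord_hz = [220.0, 275.0, 327.80, 440.0, 554.40]
levels = [1.0, 0.6, 0.6, 0.5, 0.4]                    # assumed mix, 220Hz primary

def chord(phase_deg):
    """Sum the five notes, each shifted by the same number of degrees."""
    return sum(a * np.sin(2 * np.pi * f * t + np.radians(phase_deg))
               for a, f in zip(levels, chord_hz))

original = chord(0.0)
shifted = chord(80.0)                                 # uniform 80-degree shift

peak_to_peak = original.max() - original.min()
worst = np.max(np.abs(original - shifted))
print(f"largest instantaneous difference: {100 * worst / peak_to_peak:.0f}% "
      "of the original signal's peak-to-peak swing")
```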

As audiophiles, we obsess over accuracy in our systems and spend $1000 to well over $10,000 on components and devices to improve that accuracy. In the figure above, there are instantaneous voltage differences between the two traces on the order of 25% to 40% of the peak-to-peak maximum. Audiophiles obsess over the precise position of loudspeakers. We measure to within +/- one inch or less to get our speakers positioned perfectly. Yet look at the figure above. The first negative spike on the left spans the amount of time it takes sound to travel 2.5 feet (using 1100 feet per second as the speed of sound). You can see time shifts between the two traces which represent approximately one foot. Why obsess over moving your speakers to exactly the correct position, within one inch or even less, if the speaker itself is introducing delays that amount to six inches to a foot of relative motion?
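
The time-to-distance arithmetic behind those numbers, using the same 1100-feet-per-second figure, is simple enough to spell out in a small Python sketch:

```python
speed_of_sound_ft_s = 1100.0     # speed of sound used above

def time_to_feet(seconds):
    return seconds * speed_of_sound_ft_s

def feet_to_time(feet):
    return feet / speed_of_sound_ft_s

print(f"sound covers 2.5 feet in {1000 * feet_to_time(2.5):.2f} ms")                   # ~2.27 ms
print(f"one inch of speaker movement is {1e6 * feet_to_time(1 / 12):.0f} microseconds")  # ~76 us
print(f"a 1 ms time error is equivalent to {time_to_feet(0.001) * 12:.1f} inches")      # ~13 inches
```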

Two things go on when you reposition a loudspeaker. First, you are getting it to interact with the room in the most ideal way possible, and this would not be affected by these time delays I'm referring to. Second, you are getting the sound reproduced by each speaker to integrate properly at the listening position by setting the distance from each speaker to the listener accurately. I'm not saying that moving your non-time-aligned loudspeakers in increments of one inch or less isn't valuable. I'm saying that if those one-inch-or-less movements are valuable, how can we, as audiophiles, overlook other problems which are doing six inches or more shifting of apparent sound sources?

In my opinion, this ought to bother audiophiles a lot more than it has up to now. We are talking about high-end audio here. If we were talking about mid-fi, I wouldn't even mention this subject; it isn't that important in the mid-fi context. However, high-end implies something more, something greater -- striving for excellence. Building loudspeakers with first-order crossovers is not the only way to avoid phase shifts and time errors. Loudspeakers with digital-domain crossovers can also be constructed with perfect time and phase performance. Powered loudspeakers with active crossovers may also be designed without phase/time errors.

Either we are high-end or we are something less than high-end; it can't be both ways. If there are ways of doing something that eliminate errors, then by definition those ways ought to be the standard for high-end enthusiasts and products. Maybe someday they will be. We high-enders tend to cling to our traditions and comfort zones in spite of knowing that they may not be "the ultimate." It's almost the new millennium. Could this be a good time to clean out old traditions and start the search for something better?

...Doug Blackburn
db@soundstage.com

 

All Contents Copyright © 1999 SoundStage!
All Rights Reserved