Understanding The Soundstage: Part 1

To build a powerful and evocative sound-world for your music, one through which to lead the listener on a well-defined journey, it is essential to master an understanding of the 'soundstage'.

Once you have an awareness of how to create and control it, you can build incredible emotional experiences for the listener as the sonic landscape changes and erupts around them.

The need for EQ, compression, spatial and all other processing decisions also becomes more apparent as you shape and construct this acoustic environment into whatever you can envision.



The term is fundamentally used to describe the visualisation of an imagined auditory 'space' in which the music lives. It is something we already intuitively do in any real-world environment. Go with this for a second - picture standing on a busy street, with your eyes closed. Your attention is immediately captured by the loudest, or most unique source of sound, perhaps car after car passing by. You are aware of the much closer sound of people walking past, their regular footsteps and the rise and fall of conversations. In the distance a train pulls over a bridge while a plane engine drones overhead. 

We can picture this three-dimensional space in our mind's eye, with sound sources moving on a horizontal plane from one side to another, reverberant sources in the distance, drier sounds up close, all in one dynamic environment. This is your 'soundstage', and exactly the same concept applies when interpreting a piece of music and its balances.




Categorising and separating this bombardment of auditory information is essential for the brain to make any sense of where you are, what's heading towards you, and to extrapolate any meaning from the environment. It does this by differentiating patterns in pitch-, volume- and timing-based spatial cues, identifying the nature and position of individual sources. It constantly monitors changes in these sources' sonic characteristics to understand their unique movements and progressions, and to construct a narrative as to what they could 'mean'.

Our brains automatically take these multiple auditory cues and process them into a single intelligible spatial environment, separating and grouping as much as possible in order to absorb the maximum amount of information. The individual car engines may become identified as one single stream of sound as you focus on the footsteps, the footsteps one group as you focus on the train, and so on. Some sounds may be obscured by another source's dominant sonic profile, especially when there is an overlap in frequency ranges.

We naturally employ a foreground/background prioritisation as we move our focus. The foreground we can pull apart in great detail, down to individual notes and frequencies, while the background is deprioritised into singular steady streams of information, subconsciously monitored for changes. Sudden changes in patterns will immediately draw focus - imagine one of those car engines in our description of the soundstage abruptly rising in pitch and increasing in volume. This refreshing of attention is consistently employed in music to valuable effect through fills and end-of-phrase variations, continually reinvigorating your attention to the individual parts.

So too is the principle of prioritisation. Consider a vocal-driven pop song. The typical listener will be mainly focused on the melody line, and behind it will be the 'rest' of the instruments, occasionally pulling focus at key emphatic moments. Maybe a guitar solo will take the fore, and the vocal will be deprioritised into backing harmonic 'oo's' and 'aah's', or even left out altogether.

Some regular and uniform sounds can even be deprioritised to the point of being almost completely filtered from your awareness. Their familiar and unchanging nature can fast kick them to the bottom of the psychoacoustic hierarchy. You are filtering the sounds of your environment right now, perhaps without even noticing. You might have completely tuned out a clock ticking in the corner, or the whir of your computer fan.

Our experience of music is functionally similar. However, instead of deciphering a hierarchy of unrelated noises, mainly interpreted to form an objective understanding of a single environment and series of events, these sounds are organised to correlate through harmony and rhythm, establishing an understanding of a central subjective meaning or 'emotion'. While the sound of conversation, plates, and food preparation may tell you 'I am in a restaurant', a voice singing about a loved one, a suspended chord, and a solitary unresolving harmony may tell you 'I miss you'. These understandings are derived from structural tensions - patterns and their travel between stability and instability - and learned expressions of the medium.

The use of any harmonic or rhythmic structure in a piece of music helps unify its individual sound sources, and actually allows us to focus on more detail at once. When multiple expressions are correlated to a pattern, it is far easier to hear any simultaneous unique deviations across the voices and understand the group function of their relationships. 

One of the joys of a well mixed, well written piece of music is the resultant balance of interest and detail, in singular parts and together as a whole. When you can move your focus between independent sequences, tones and channels, accessing the detail of their contribution to a symphonic 'whole' with ease, and even be guided through this process as individual sequences flourish and develop, you likely have yourself a beautiful piece of music. Contrast is a powerful tool in this endeavour, from arrangement to mixing. This is the experientially broadening power of subjectivity, discussed further in this post.




We can build an intricately detailed auditory space to interpret music, and you may well find you do it intuitively. If you can, check out this mix as an example: 'Reckoner' by Radiohead.

Immediately, to our right, are reverbed drums laying down a steady groove with occasional dotted shuffles. To the left is a crunchy tambourine, its high frequencies up on a steady 1/16 rhythm, before a guitar joins below it with a lower regular dotted rhythm. In the distance we hear the instruments' reverb painting out the rest of our space (The interplay between the two sides of our soundstage from the outset is hugely euphonic - they are almost completely separated in space, are unique texturally and exist within their own individual rhythms - but occasionally land together synergistically. This correlation through rhythm is the essence of how 'grooves' are built, which we'll be investigating further in a future post.)

Quite significantly, this mixing strategy has left a gap in the middle of the soundstage, spatially and harmonically. Enter our foreground element, the lead vocal and, later with the piano, central support from the bass below. A wide, stereo piano finds its own place on both sides of the soundstage as the first significant 'stereo' instrument, the harmonies filling out the midrange and tying everything together while the bass fills out the bottom frequencies of the mix. Notice how every element has entered in its own place, performing its own synergistic function. Now that the soundstage is 'full', there's only one place to go.

The moment the relentless percussion drops away (2:25) creates huge expectation, opening the soundstage up to the forthcoming build of harmonies, and revealing the creeping intimacy of the stereo backing vocals. The soundstage begins to fill again as many independent melodies weave together into a complex tapestry of harmony. Notice the individual harmonic groups - the backing vocals and strings. Much like the footsteps in our earlier example, when focusing elsewhere they may perceptually become one homogenous 'part', but equally upon focus you will notice individual voices and melodies traveling in their own directions.

You might find different tones conjure different textures, colours or gradations of warmth, like 'glassy' pianos and 'soft' vocals. The more descriptive detail you can breathe into this perception of the soundstage when interpreting music, the better you can experience and manage the balances. 

With our hands at the controls, we can decide which sounds live together, which stand on their own, which live in a state of juxtaposed tension, and craft hugely imaginative balances of sonic architecture. Modulating these balances over time through arrangement and automation of a mix's dimensions establishes a journey and narrative, creating an experience filled with meaning.
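Automation of this kind can be thought of very simply: a parameter value drawn out over time and applied to the audio. As a minimal sketch (not any DAW's actual API - `automate_gain` is a hypothetical helper), a linear volume ramp across a block of samples moves a sound towards the front (louder) or the back (quieter) of the soundstage:

```python
def automate_gain(samples, start_gain, end_gain):
    """Apply a linear gain ramp across a block of samples --
    the simplest form of volume automation, moving a sound
    toward the front (louder) or back (quieter) over time."""
    n = len(samples)
    if n == 0:
        return []
    out = []
    for i, s in enumerate(samples):
        t = i / (n - 1) if n > 1 else 0.0  # position 0.0 .. 1.0 across the block
        gain = start_gain + (end_gain - start_gain) * t
        out.append(s * gain)
    return out
```

Real automation curves are rarely straight lines, but every fade, swell and drop reduces to some version of this: a value travelling between two points in time.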


This is a general framework for perceiving the dimensional soundstage. As we'll see in the later creative parts of this series, these dimensions are in fact all related, and changing one can easily alter your perception of another.

1. Space. Horizontal
- Think of sounds panning left or right, reverberations spreading outwards from the source, mono material firmly in the 'centre'.
2. Frequency. Vertical
- We already talk of 'high' and 'low' frequencies, classically used to comedic effect in old cartoons. Follow basslines walking 'up' and 'down' the scale, hi-hats picking out rhythms at the top.
3. Volume. Front to Back
- Think of a sound growing in volume, getting louder as it gets closer, or diminishing and disappearing into the distance. Overcompressed mixes can sound 'flat' dynamically, rather than deep and contoured.
4. Time. The dimension in which the previous three develop.
- This one's the hardest to describe but it should be intuitive. If you can picture a single moment in the soundstage, and connect its movement to another moment, here is your passage of time. (The nature of time continues to puzzle physicists and philosophers alike, with lifetimes spent wondering what that duration of research even was.)
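The four dimensions can be sketched in code. In this toy renderer (an illustration only, not how any audio engine is actually structured), a single sound is described by one value per dimension - frequency (vertical), amplitude (front-to-back), pan (horizontal) - unfolding across a duration (time). The panning uses the standard constant-power (sin/cos) pan law so perceived loudness stays roughly even across the stereo field:

```python
import math

SAMPLE_RATE = 44100  # samples per second

def render_voice(freq_hz, amplitude, pan, duration_s):
    """Render a sine tone as (left, right) sample pairs.
    One parameter per soundstage dimension: frequency (vertical),
    amplitude (front-to-back), pan (horizontal, -1.0 hard left to
    +1.0 hard right), and duration (time)."""
    # Constant-power pan law: equal loudness across the stereo field.
    angle = (pan + 1.0) * math.pi / 4.0   # 0 .. pi/2
    left_gain = amplitude * math.cos(angle)
    right_gain = amplitude * math.sin(angle)
    n = int(duration_s * SAMPLE_RATE)
    samples = []
    for i in range(n):
        s = math.sin(2.0 * math.pi * freq_hz * i / SAMPLE_RATE)
        samples.append((s * left_gain, s * right_gain))
    return samples
```

Set `pan` to 0.0 and both channels carry the tone equally (dead centre); push it to +1.0 and the left channel falls silent. A real mix is just many of these voices, each claiming its own coordinates in the four dimensions.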

Every sound can be described through properties of all four, and cannot exist without a property of each one. You might have the loudest, bassiest note imaginable, but with no duration you've got nothing to hear. Or perhaps infinite duration of all possible frequencies and spatial positions, but again - with no volume, you'll never know.



THE POINT

And here's why visualising the soundstage is so important: it's an intuitive and incredibly powerful way of processing all of these details at once - and, by the same merit, of fully experiencing the musical journey in its totality.

When you have a firm grasp of these dimensions, and can think of ways to make a sound travel through them using the musical/audio processing tools available, you can begin creating entirely new and dynamic sonic experiences. You can see the 'space' for a new harmonic layer, or where a channel overlaps too much with another, and have complete artistic control over the separation and power of the music you're creating.
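That idea of 'seeing the space' can be made concrete in a crude way. As a rough sketch (the function and its inputs are hypothetical, not a feature of any real tool), you could describe each channel by the frequency range where most of its energy sits and flag collisions - places where EQ carving or re-voicing might open up room in the mix:

```python
def ranges_overlap(a, b):
    """Check whether two channels' dominant frequency ranges,
    given as (lo_hz, hi_hz) pairs, collide -- a rough flag for
    where masking may occur and space could be carved out."""
    return a[0] < b[1] and b[0] < a[1]
```

A bass guitar around (80, 250) and a kick drum around (40, 120) would flag an overlap; the bass against a hi-hat around (6000, 12000) would not. Real masking depends on level, timing and timbre too, but thinking in ranges like this is a useful first pass.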

Now that we know what it is, in Part 2 we will delve further into the fun stuff: describing the creative compositional and processing tools available to build and redesign the acoustic landscape, as well as some of their unique psychoacoustic effects.

In Part 3, armed with perspective and knowledge of the tools, we will take a detailed look at some professional mixes and explain how they work in this dimensional frame for emotive sonic impact.
