Reverb isn’t exactly echo, and it’s not delay, though it’s in the same family of effects. Reverb is a psychoacoustic phenomenon. It turns out that, just like our ability to tell one human face from another, our brains are remarkably good at identifying the space in which a sound is made. It’s this amazing ability that helps keep you safe when you’re crossing the street. How does reverb come into it? Well, think about all the sounds being produced by the dozens of automobiles and people around you. As you cross the street, your brain can tell, without you even looking, that the big truck it hears off to the left is still safely half a block away. So how does your brain do it? In a word – reverb. Sounds reflect off some surfaces and get absorbed by others, and because you have two ears, your brain can calculate not only the distance and type of space the sound is in, but also its location and movement. A pretty important evolutionary trait for humans who hunt (and cross streets).
Three Types of Reverb
There are three ways to add reverb to a recording. First, you can mic an instrument so as to capture not only the direct sound, but also the sound of it interacting with its environment – for example, setting up microphones 20 feet away from a piano in a concert hall. But what if your piano lives with you in a small apartment and you want that concert-hall sound? Up until the 1980s, if you wanted to add artificial reverb to an instrument, you needed a plate or spring reverb unit. These analog devices have a transducer and a pickup separated by either a metal plate or a set of springs. The transducer sends the input signal through the plate or springs as vibrations, and by the time those vibrations reach the pickup, they have been smeared into something that simulates sound bouncing off walls. Pretty much all the reverb you hear on popular recordings from the 1950s–1970s was made this way. Then in the 1980s, with the arrival of affordable microprocessors, companies like Lexicon developed algorithms to simulate what happens to sound in different spaces – and voilà! Digital reverb was born. Now we could dial in any type of space (rooms, halls, cathedrals) and start to play with the parameters of reverb. Today we can even sample the response of a real space (convolution reverb) and use that measurement to re-create any space you can imagine!
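Those early digital algorithms were built out of simple delay-based filters. As a loose illustration (a minimal sketch, not Lexicon’s actual algorithm), here is a feedback comb filter in Python – a delayed copy of the signal fed back on itself, which turns a single impulse into a decaying train of repeats, the skeleton of a reverb tail:

```python
# Minimal sketch of a basic building block of algorithmic reverb:
# a feedback comb filter. Real digital reverbs chain many of these
# (with different delay lengths) plus all-pass filters.

def comb_filter(signal, delay_samples, feedback):
    """Delay the signal and feed it back on itself, producing a
    decaying train of repeats."""
    out = []
    buf = [0.0] * delay_samples  # circular delay line
    idx = 0
    for x in signal:
        delayed = buf[idx]          # read the oldest sample
        y = x + feedback * delayed  # mix input with the fed-back repeat
        buf[idx] = y                # write back into the delay line
        idx = (idx + 1) % delay_samples
        out.append(y)
    return out

# A single impulse (a "clap") comes back as repeats that fade
# by 50% on each pass through the loop:
impulse = [1.0] + [0.0] * 9
tail = comb_filter(impulse, delay_samples=3, feedback=0.5)
# tail == [1.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.25, 0.0, 0.0, 0.125]
```

Longer delay lines and higher feedback values correspond to bigger, livelier-sounding spaces.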
Reverb Parameters
When we add more reverb to a signal, we say it’s getting “wet,” and as we reduce the reverb, we say it’s getting “dry.” That wet/dry ratio used to be pretty much all we could control. Today there are many parameters we can use to shape reverb. The most important is RT60 – the time it takes for the reverb level to decay by 60dB, which your ear reads as roughly “how big (and how live) the space is”; large halls have RT60 times of a second or more. A second, related cue is how long the first reflections take to arrive. Sound travels at about 1,126 feet per second. You might not know the speed of sound, but don’t worry – your brain does. So if the nearest wall of your imaginary hall is 60 feet from the piano, the sound takes about 53 ms to reach it (sound travels at roughly 1.13 feet per millisecond, so divide 60 feet by 1.13). When your brain hears those first bounces (the “early reflections”) arriving that late, it intuitively knows the piano is in a big space.
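The arithmetic above can be sketched in a couple of lines of Python (the function name and the 60-foot wall distance are just illustrative):

```python
SPEED_OF_SOUND_FT_PER_MS = 1.126  # ~1,126 ft/s at room temperature

def first_reflection_ms(distance_ft):
    """Time (in milliseconds) for sound to travel a given distance,
    e.g. from an instrument to the nearest reflecting wall."""
    return distance_ft / SPEED_OF_SOUND_FT_PER_MS

# A wall 60 feet away means the sound reaches it after about 53 ms:
print(round(first_reflection_ms(60), 1))  # 53.3
```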
Re-Creating Real-World Reverb
The problem with early spring and plate reverb devices was that they started producing reverb the instant a signal was applied. But in the real world, when you hit that piano key in the concert hall, the sound takes those 53 milliseconds before it hits anything and begins to travel back. Nowadays we can simulate this with “Pre-Delay”: to create the feeling of more space around your sound source, you add more Pre-Delay. Today’s digital reverbs usually include High- and Low-Frequency Damping parameters, which let you shorten the reverb decay at the frequency extremes. Low-Frequency Damping can help create the feeling of a larger, cavernous room, while High-Frequency Damping can create a much warmer, more intimate space. Some reverb plug-ins, like the Lexicon Pantheon, also include “Spread” and “Diffusion” controls, which can be used to enhance the spaciousness of your reverb, and some include an “Echo” section, which lets you add discrete echoes to your reverb – if you’re feeling so bold!
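Under the hood, pre-delay is nothing more exotic than a stretch of silence inserted before the wet (reverberated) signal. A minimal sketch, assuming a 44.1 kHz sample rate and an illustrative function name:

```python
SAMPLE_RATE = 44100  # CD-quality; adjust to match your session

def apply_pre_delay(wet_signal, pre_delay_ms):
    """Insert silence before the reverb tail so the first reflections
    arrive late, as they would in a large room."""
    gap = int(SAMPLE_RATE * pre_delay_ms / 1000)
    return [0.0] * gap + list(wet_signal)

# 53 ms of pre-delay at 44.1 kHz is 2,337 samples of silence
# in front of the (toy, three-sample) reverb tail:
delayed = apply_pre_delay([0.8, 0.4, 0.2], pre_delay_ms=53)
print(len(delayed))  # 2340
```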
[editor’s note – join us next month as we wrap up our discussion on reverb in Part 2.]
Zac Cataldo is a musician and owner/producer at Night Train Studios, a recording studio in Westford, MA. He is also co-owner of Black Cloud Productions, a music publishing company. Reach him at [email protected].
Brent Godin is a bassist/guitarist and engineer/producer at Night Train Studios. He is also a talent scout at Black Cloud Productions. Reach him at [email protected].