
charliewild
8th January 2023

The science behind music production

Science and music might seem unrelated, but physics is the key to all of our favourite songs and bands
Photo: Namroud Gorguis @ Unsplash

Science has always played a role in the development of music. For example, the science of standing waves was vital to the invention of stringed instruments, like violins and guitars.

In the age of home recording studios, digital instruments, and sound effects, science is just as important, if not more so, in the production of music.

Digital instruments

One of the most influential advancements in music production is the creation of digital synthesizers: digital instruments that can be programmed to sound like any instrument. They can be controlled by external keyboards, so that when someone plays the keys, the sound produced is as if a different instrument, like a saxophone, were playing the same notes. Synthesizers give cheap access to many different sounds and tones without needing expensive instruments or the skills to play them.

How can we manipulate sound to get the same tone (also called timbre) as a real instrument? The answer lies in understanding why different instruments sound different when playing the same note. When the same note is played on a piano or a violin, for example, the sound is very different despite the note being the same. The reason can be explained using wave physics.

Picture a sound wave smoothly sweeping up and down from peak to trough over a fixed time period, repeating itself. The length of this repeating period determines the frequency, and therefore the pitch, of the sound (the shorter the period, the higher the pitch), and the height of the peaks and troughs (the amplitude) determines its volume.
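To make this concrete, here is a minimal sketch of how such a pure tone can be built digitally; the numbers are illustrative examples, not values from the article. The frequency sets the pitch and the amplitude sets the volume.

```python
import numpy as np

sample_rate = 44100      # samples per second (CD-quality audio)
duration = 1.0           # length of the tone in seconds
frequency = 261.63       # Hz, roughly a middle C (example value)
amplitude = 0.5          # height of the peaks, i.e. how loud the tone is

t = np.arange(int(sample_rate * duration)) / sample_rate   # time axis
pure_tone = amplitude * np.sin(2 * np.pi * frequency * t)  # smooth, repeating sine wave
```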

In a real instrument producing a note of a specific pitch, the sound wave resembles this smooth sine wave. However, due to the structure of the instrument, waves at other frequencies (overtones, typically whole-number multiples of the main frequency) are also produced and are present in the sound. These are much quieter, having lower amplitudes. These low-amplitude additional waves slightly modulate the shape of the sound wave while leaving the overall frequency, and therefore the pitch, unchanged.

Different instruments produce different additional frequencies, modulating the overall wave in different ways. This gives rise to the unique tones of instruments. In short, a C played on the piano will sound different from a C played on the violin: the piano produces a smoother wave, while the violin produces a much more modulated wave and so sounds raspier, with a richer tone.
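As a rough illustration of that idea, the sketch below uses invented harmonic ‘recipes’: adding quieter waves at whole-number multiples of the fundamental changes the shape of the wave, and so its tone, while the pitch stays the same.

```python
import numpy as np

sample_rate = 44100
t = np.arange(sample_rate) / sample_rate    # one second of time
f0 = 261.63                                 # fundamental frequency; the pitch stays here

# Hypothetical recipes: each pair is (multiple of f0, relative loudness).
smooth_recipe = [(1, 1.0), (2, 0.1)]                      # close to a plain sine wave
rich_recipe = [(1, 1.0), (2, 0.5), (3, 0.35), (4, 0.2)]   # more overtones, a 'raspier' tone

def build_note(recipe):
    # Add the fundamental and its quieter overtones into one sound wave.
    return sum(amp * np.sin(2 * np.pi * f0 * mult * t) for mult, amp in recipe)

smooth_note = build_note(smooth_recipe)   # same pitch, plainer tone
rich_note = build_note(rich_recipe)       # same pitch, richer tone
```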

The way synthesizers replicate these tones is by reproducing the additional quiet waves present in the sound of a real instrument. Sophisticated software can analyse samples of real instruments, using Fourier analysis to ‘understand’ which additional frequencies are present in the signal and how loud each one is. Instead of just producing a simple single-frequency wave, the synthesizer can then replicate the specific sound and tone of a real instrument.
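Here is a sketch of that analysis step, using a stand-in signal rather than a real recording: a Fourier transform (numpy’s FFT in this example) reveals which frequencies are present and how strong each one is, which is the ‘recipe’ a synthesizer could then reproduce.

```python
import numpy as np

sample_rate = 44100

# 'recording' stands in for a sample loaded from a real instrument;
# here it is built by hand so the sketch runs on its own.
t = np.arange(sample_rate) / sample_rate
recording = np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 880 * t)

spectrum = np.fft.rfft(recording)                          # Fourier transform of the sample
freqs = np.fft.rfftfreq(len(recording), 1 / sample_rate)   # frequency of each bin in Hz
strengths = np.abs(spectrum)                               # how strongly each frequency is present

# The strongest bins approximate the instrument's harmonic 'recipe'.
strongest = freqs[np.argsort(strengths)[-2:]]
print(strongest)   # roughly [880. 440.] for this stand-in signal
```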

Sound effects

On top of just replicating the tone of real instruments, software can add various effects to either recorded or digitally produced sounds. One of the effects that can be added is distortion.

Distortion is the crunchy, rock sound heard in electric guitars. Think of memorable guitar riffs from famous songs such as ‘Smoke on the Water’ (Deep Purple) or the introduction to ‘Do I Wanna Know?’ (Arctic Monkeys). The loud, fuzzy effect on the guitar is distortion.

Distortion is not just limited to guitars; it can be heard anywhere. If you’re listening to music and turn the volume up louder than your speaker can handle, the song will sound raspy and crunchy: this is the sound wave becoming distorted.

The science behind distortion is very similar to that of the different tones of sound. Essentially, it is an extreme version of the modulation of the smooth sound wave. Instead of having a few additional small amplitude waves on top of the overall sound wave, there are now many more, each with a much larger amplitude. This changes the shape of the sound wave much more drastically than in the case of the violin or other instruments. This new sound wave results in a fuzzier, distorted sound.

The reason this happens when you turn music up too loud is that the speaker cannot handle the large amplitudes of the wave, so it cuts off the high-amplitude sections, flattening the smooth peaks. Changing the shape of the wave this much adds lots of different frequencies to the original sound wave, which we hear as distortion. This is a bad thing in headphones and speakers, but it was originally how distortion was produced for guitars: by overdriving the amplifier and speaker the guitar was played through.
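A minimal sketch of that clipping effect, with made-up numbers: driving a smooth wave far beyond the ceiling a speaker can reproduce, then chopping off everything above it, flattens the peaks and adds the extra frequencies we hear as distortion.

```python
import numpy as np

sample_rate = 44100
t = np.arange(sample_rate) / sample_rate
clean = np.sin(2 * np.pi * 220 * t)   # smooth wave the speaker is asked to play

drive = 4.0      # the signal is turned up far louder than the speaker can handle
ceiling = 1.0    # the largest amplitude the speaker can actually reproduce

# Peaks above the ceiling are simply cut off, flattening the top and bottom of the wave.
distorted = np.clip(drive * clean, -ceiling, ceiling)
```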

Nowadays, distortion can be applied digitally using the same techniques as in synthesizers: by drastically altering the shape of the wave digitally, we can tailor the exact distortion applied to a sound wave.
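One common way to do this digitally is waveshaping; the sketch below is one possible approach rather than a standard recipe. Every sample is passed through a curve such as tanh, and a ‘drive’ parameter controls how harshly the wave is reshaped.

```python
import numpy as np

def soft_clip(signal, drive=3.0):
    # Reshape the wave sample by sample: quiet parts pass through almost unchanged,
    # loud parts are squashed towards +/-1, giving an adjustable, softer distortion.
    return np.tanh(drive * signal)

# Example: apply increasing amounts of distortion to the same clean wave.
t = np.arange(44100) / 44100
clean = np.sin(2 * np.pi * 220 * t)
mild = soft_clip(clean, drive=1.5)
heavy = soft_clip(clean, drive=8.0)
```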

Physics: The sound of the future (and the past)

Science has revolutionized music throughout history, from the invention of the first instruments to the present day, where the physics of waves has allowed the creation of digital instruments and effects. Advances in science inspire advances in creativity. Science has put music at the fingertips of millions, making it more accessible and affordable: all you need is a computer to unleash your creativity. This can be seen in the booming music industry, where anyone can create and produce music from their bedroom and share it with the world. So, in the timeless argument over who the most influential people in the history of music are, I argue it is not the Beatles or Elvis but physicists!

