
Playing Your Song: The Evolution of Dynamic Music in Games

For more than 25 years, developers and composers have been pushing tech forward to unify gameplay and soundtrack.

There is a moment in 1991’s Monkey Island 2: LeChuck’s Revenge that is widely regarded as a breakthrough in video game music. Upon leaving the campfire at the beginning of the game, the mighty pirate Guybrush Threepwood enters the small town of Woodtick. As he moves from screen to screen and interacts with objects in the environment, the music changes seamlessly to match the action unfolding on screen. It may have been just a few small steps for Guybrush, but it was one giant leap forward for video game audio at the time.

Prior to this, video games had largely taken a linear approach to music, with some notable exceptions. Space Invaders (1978) would increase the tempo of the music the closer the enemy ships got to the bottom of the screen. Frogger (1981) would play a celebratory jingle whenever the player successfully crossed to the other side of the road. And Wing Commander (1990) included a reactive score that would cycle through different sections of a track to avoid repetition. Monkey Island 2 was a much more ambitious take on dynamic music, however. The soundtrack was built using an interactive music streaming engine (iMuse for short) that hinted at a more elegant future for video game music, one where the score could react and synchronize more closely with what the player was doing.

Peter McConnell was one of the two LucasArts composers who worked on iMuse, the other being The Secret of Monkey Island composer Michael Land. The two had been tasked with developing new sound drivers for the next slate of LucasArts releases, but had decided to experiment instead with creating an interactive sound system to address their frustrations with the technology of the time.

“We were looking at this problem of ‘Hey, this music is cool, but we could actually make it so that it’s more like a pit orchestra instead of a drop the needle kind of thing,’” McConnell recalled. “Which is what all music was at that point. You play a tune. Something happens. You play another tune. And we were like, ‘Well, why not have the music part of the thing be a little more intelligent and have it respond in a musical way to the user input?’”

What made iMuse so exciting at the time was that it could add or remove instruments from the mix as conversations progressed, on top of being able to recognize what measure a track was in and what beat it was on. Using that information, iMuse could select an appropriate musical fill to bridge a transition and merge separate pieces into one long performance.
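To make the idea concrete, here is a minimal sketch, written in Python rather than the driver-level MIDI code LucasArts actually shipped, of a score that only honors a transition request on the downbeat of a measure the composer has marked as a legal exit point. Every name in it is illustrative, not taken from iMuse itself.

```python
# Minimal sketch of a measure-aware transition in the spirit of iMuse.
# All names and structures here are illustrative, not from the real system.

class InteractiveScore:
    def __init__(self, beats_per_measure, fills):
        self.beats_per_measure = beats_per_measure
        self.fills = fills          # measure number -> transition fill clip
        self.beat_count = 0         # advanced by the sequencer on every beat
        self.pending_target = None  # the piece we have been asked to move to

    def request_transition(self, target_piece):
        """Game code asks for new music; the score decides when to comply."""
        self.pending_target = target_piece

    def on_beat(self):
        """Called by the sequencer once per beat."""
        measure = self.beat_count // self.beats_per_measure + 1     # 1-based
        beat_in_measure = self.beat_count % self.beats_per_measure  # 0 = downbeat
        # Only act on a downbeat, and only in a measure the piece itself
        # has authorized as an exit point.
        if self.pending_target and beat_in_measure == 0 and measure in self.fills:
            self.play(self.fills[measure])   # bridge with the matching fill
            self.play(self.pending_target)
            self.pending_target = None
        self.beat_count += 1

    def play(self, clip):
        print(f"now playing: {clip}")

# e.g. exit at measure 13 using its dedicated transition fill
score = InteractiveScore(beats_per_measure=4, fills={13: "measure_13_fill"})
score.request_transition("next_piece_from_measure_1")
for _ in range(13 * 4):   # the sequencer drives the beats
    score.on_beat()
```

The key inversion is that the game requests music and the score decides when, and how, to comply.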

“It was the idea that there should be language inside of the musical piece that gets interpreted by the ‘sound driver,’” McConnell said. “So that when it gets a sound command, it doesn’t just do something hamfisted, it does something that the piece has told it is okay.”

“This is something that we modelled twenty years later,” said composer Wilbert Roget (Call of Duty: WWII, Mortal Kombat 11), who acted as music arranger on Monkey Island 2’s special edition at LucasArts. “We could say, ‘Well, we’re at measure 13 and we’re trying to get to measure 1 in this other piece. So, in order to end the melody in a convincing way, we’ll play the measure 13 transition.’ And at that time, it was mind-boggling tech and even to this day it’s still some of the most interesting and complex interactive music that we’ve seen in the industry.”

Following the release of Monkey Island 2, there was a notable shift in the industry towards more dynamic music in games. Some examples of this include George “The Fatman” Sanger and Dave Govett’s score for Ultima Underworld: The Stygian Abyss (1992), Chance Thomas’s orchestral score for Quest for Glory V: Dragon Fire (1998), and Grant Kirkhope’s music for the platformer and collectathon Banjo-Kazooie (1998).

“At Rare, we loved Monkey Island,” Kirkhope said. “So I was asked if I could find a way of making it work in Banjo-Kazooie. The tools were just Cubase and MIDI files. I would get a list of areas in the level that the designers thought should have a different version of the theme, then I’d allocate tracks in the MIDI file to each area. I’d then tell the programmer which tracks were for which area and we’d draw circles in the level that you can’t see. So when the character Banjo crossed over any one of those circles it triggered the music to change to the new variant.”
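A hypothetical sketch of that setup, with invented coordinates and track names, might look like the following: each invisible circle maps to the set of MIDI tracks carrying that area’s variant of the theme, and crossing into a circle swaps which tracks are audible.

```python
import math

# Hypothetical sketch of Kirkhope's invisible trigger circles. Coordinates,
# radii, and track names are all invented for illustration.

ZONES = [
    # (x, z, radius, MIDI tracks to unmute for this area's variant)
    (120.0, 40.0, 25.0, {"banjo_lead", "soft_strings"}),
    (300.0, 90.0, 30.0, {"banjo_lead", "steel_drums"}),
]

DEFAULT_TRACKS = {"banjo_lead"}
active_tracks = DEFAULT_TRACKS

def update_music(player_x, player_z):
    """Called each frame: if Banjo stands inside a circle, fade the theme's
    track mix over to that area's variant; otherwise fall back to default."""
    global active_tracks
    for x, z, radius, tracks in ZONES:
        if math.hypot(player_x - x, player_z - z) <= radius:
            if tracks != active_tracks:
                crossfade_to(tracks)
                active_tracks = tracks
            return
    if active_tracks != DEFAULT_TRACKS:
        crossfade_to(DEFAULT_TRACKS)
        active_tracks = DEFAULT_TRACKS

def crossfade_to(tracks):
    print(f"crossfading to variant tracks: {sorted(tracks)}")

update_music(118.0, 38.0)   # inside the first circle: that variant fades in
update_music(500.0, 500.0)  # outside every circle: back to the default mix
```

Because every variant lives in the same MIDI file, the swap never interrupts the underlying performance; only the instrumentation changes.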

You can hear examples of this while exploring the hub world in Banjo-Kazooie. Whenever Banjo arrives at the entrance to a new world, the Gruntilda’s Lair theme changes its mood to fit the new area. The same technique produces the underwater music effect elsewhere in the game, and fades out instruments when Banjo soars high above a level using Kazooie’s flying ability.

These musical approaches usually fall into two categories: vertical and horizontal. Vertical refers to the harmony of a piece, with composers able to add or remove layers from a track to change its mood or intensity based on in-game events.

As an example, almost the entirety of the soundtrack for the first Red Dead Redemption comprises stems recorded at 130bpm in the key of A minor, allowing the composer or sound designer to piece the music together in various ways, covering everything from dramatic gunfights to lonely rides over arid plains. Composer Petri Alanko uses a similar approach on cinematic games like Remedy Entertainment’s Quantum Break and Control, though he isn’t as restrictive about the tempo and key of the individual stems he uses.
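A rough sketch shows why the shared tempo and key matter: any subset of such stems stays in sync and in harmony, so a mixer can choose layers purely by intensity. The stem names and thresholds below are invented for illustration.

```python
# Sketch of vertical layering over stems that share one tempo and key,
# so any combination can play together. Names and thresholds are invented.

STEMS = [
    ("ambient_pads",     0.0),   # always present
    ("acoustic_guitar",  0.3),   # light tension: riding at dusk
    ("percussion",       0.6),   # danger nearby
    ("brass_and_drums",  0.85),  # full gunfight
]

def stems_for(intensity):
    """Return every stem whose threshold the current intensity has reached."""
    return [name for name, threshold in STEMS if intensity >= threshold]

print(stems_for(0.2))  # ['ambient_pads']
print(stems_for(0.9))  # all four layers stacked for a shootout
```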

“With some of my previous projects, I’ve had a massive printout of the pieces and stems on my wall, just to remind me of what is where,” Alanko said. “I also try my best to do ‘interchangeable material’; to provide stems that can be used with other stems, elsewhere in the level or project. For instance, ‘perc stem A_03’ could fit in well with ‘perc stem Z_12’ as well as ‘string_stem_low G_33.’”

In contrast to the vertical approach, horizontal refers to the melody of a piece and its timing. It’s this type of technique that can produce fun musical stingers, such as in Super Mario Galaxy, where players collect a series of notes that play out a classic Super Mario track, or in Tetris Effect, where every block placed on the playing field produces a small musical tone.

“With the horizontal approach, you can have like beat synchronization and say, ‘We’re going from this piece to the next, so let’s wait until the next bar or every two bars until we transition,’” Roget elaborated. “And then with the vertical, you can say, ‘Okay, let’s organize the piece into stems that can be stacked. Maybe we’re in a type of combat situation, so let’s layer in percussion or some other heavy brass or something.’”
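The arithmetic behind that beat synchronization fits in a few lines. This sketch, with an invented tempo and meter, computes where the next bar line falls so a transition can be scheduled there rather than cutting immediately.

```python
# Sketch of bar-quantized transitions: wait for the next bar line instead of
# switching pieces mid-phrase. Tempo and meter values are invented.

def next_bar_time(now, bpm=120.0, beats_per_bar=4, piece_start=0.0):
    """Return the time in seconds of the first bar boundary after `now`."""
    bar_length = beats_per_bar * 60.0 / bpm      # 2.0 seconds at 120bpm in 4/4
    bars_elapsed = (now - piece_start) // bar_length
    return piece_start + (bars_elapsed + 1) * bar_length

# A transition requested 9.3 seconds into the piece is held until the bar
# line at 10.0 seconds, so the switch lands on a downbeat.
print(next_bar_time(9.3))  # 10.0
```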

The two techniques aren’t mutually exclusive, however; they’re often used in combination to create a complex tapestry of sound. Austin Wintory’s soundtrack for Journey, for example, features both vertical and horizontal techniques to create a moving and emotional accompaniment to the player. The score is essentially one long theme, with different instruments representing the player character and the world around them. As the player interacts with ancient ruins and meets other players, instruments like a flute and harp are introduced into the mix as musical representations of each.

Additionally, in the climactic piece “Apotheosis,” sections of the track loop until the player reaches a particular area. When the character arrives at a waterfall at the peak of a mountain, for instance, the music repeats until the player ventures through the middle of the waterfall to the lake on the other side; it’s here that the first violin is reintroduced into the score.

All of this means that individual play sessions of Journey can have slight deviations between their scores depending on the timing of events and who the player encounters on their way towards their goal.

“The issue [with dynamic scores] is reconciling the player agency with some sort of musicality,” Wintory said. “Music is by definition a linear art form, playing out over time. Great music (and great performers) leverage that to take you on emotional adventures. When that passage of time is subject to tinkering, it can destroy the storytelling instantly. My goal is always to find a way to preserve the best of what traditionally linear music can offer (particularly with regards to storytelling) while giving the player maximum agency in the process.”

Wintory points to Jessica Curry’s score for Everybody’s Gone to the Rapture as another excellent example of a score that achieves this balance. As well as more traditionally linear moments, it features granulated material that is fed back to the player in response to in-game variables.

As former Chinese Room audio designer Adam Hay explained, “Each narrative area in Rapture (village, woods, farm, camp, estate) has its own unique bank of instruments and a basic script that defines how frequently these instruments are allowed to trigger a note. The script can have multiple states, so depending on how far through an area’s narrative you are we increase the intensity and density of the mix of procedurally-generated music by bumping the state up to the next level. It gives a subtle sense of rising tension as you uncover more of the story in each area, with more textures and elements appearing in the mix as you progress through the arc.”
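With the banks, probabilities, and tick interval all invented for illustration, Hay’s description boils down to something like this: each area owns an instrument bank, and the current narrative state scales the odds that any instrument is allowed to trigger a note on a given scheduling tick.

```python
import random

# Illustrative sketch of state-driven note density, loosely following Hay's
# description. Instrument banks and probabilities are invented.

AREA_BANKS = {
    "village": ["music_box", "soft_strings", "choir"],
    "farm":    ["piano", "viola", "bells"],
}

# Chance per tick that each instrument may sound, keyed by narrative state.
STATE_DENSITY = {0: 0.05, 1: 0.15, 2: 0.35}

def tick(area, narrative_state):
    """Run once per scheduling tick; return the instruments that trigger."""
    density = STATE_DENSITY[narrative_state]
    return [name for name in AREA_BANKS[area] if random.random() < density]

# Bumping the state up as the story unfolds lets more textures into the mix.
print(tick("village", 0))  # sparse early on
print(tick("village", 2))  # denser as the area's arc nears its end
```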

The inspiration for this approach was sound designer Paul Weir’s 2011 GDC talk on the application of generative music, called “Stealing Sound.” At the time of the talk, Weir was working on a new Thief game at Eidos Montreal. Though the project never saw the light of day, some of the ideas Weir developed for it later found their way into his procedural work on Hello Games’ No Man’s Sky. While the post-rock group 65daysofstatic created a traditional soundtrack for the space exploration game, most of the music in it is actually generative, developed from a collection of contextual rules set within the game’s custom music tool, Pulse.

These rules vary depending on whether the player is in space, in combat, or roaming around on the surface of a planet. To explain this, Weir gave the example of a player in space deciding to look in the direction of a planet. As a result, a value might change somewhere in the audio system, resulting in an arpeggio being played or an increase in the activity level of the music. But not always. This unpredictability ensures the music never becomes stale or repetitive as players explore the game’s vast galaxy.
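A toy version of that kind of rule, which bears no relation to Pulse’s actual internals, might look like this: the glance nudges an activity value upward, and that value in turn sets the odds that anything audible happens at all.

```python
import random

# Toy sketch of a contextual rule in the style Weir describes; none of this
# reflects how Pulse is actually built.

activity = 0.2  # current musical activity level, between 0 and 1

def on_player_looks_at_planet():
    """Looking toward a planet raises activity and *may* trigger an arpeggio."""
    global activity
    activity = min(1.0, activity + 0.1)
    if random.random() < activity:  # ...but not always, by design
        play_arpeggio()

def play_arpeggio():
    print(f"arpeggio triggered at activity {activity:.2f}")

on_player_looks_at_planet()
```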

“The music is triggered at several different points and is assigned different levels of importance,” Weir said. “The majority of the music when exploring is generative, but at key times, such as flying into space for the first time or when you first start exploring, these events are treated as stings, same for discovering specific items or at specific narrative moments. Space stations, the galactic map, abandoned wrecks all have their own pieces. The loading music has around fifty different loops that can play, so there’s a considerable amount of music in the game.”

Adaptive music is becoming an increasingly common way of scoring games. Ape Out, for instance, uses a generative system to produce its soundtrack, with individual actions such as throwing or attacking an enemy contributing to the performance of an invisible jazz percussionist. The result is a frenetic soundtrack that perfectly underscores the chaotic gameplay. Meanwhile, Dan Golding’s score for the recently released Untitled Goose Game uses a stem-based system made up of six of Debussy’s Preludes chopped into sections, which can be paired together in different combinations depending on what state the player is in as they harass the local villagers.

From its humble beginnings as crude, compressed sound samples, video game music has grown in complexity. But that doesn’t mean more linear approaches have overstayed their welcome. Instead, composers are constantly trying to strike a delicate balance between the two, depending on the needs of the project and their own musical intuition.

“I find what’s most important isn’t the technical side but rather the principle of composers trying to effectively drive the emotional contour of the game,” Roget said. “I think that it’s great to be as knowledgeable as possible with the tech aspects to make sure that we’re taking every opportunity to be as musical as we can. But again, I feel that content is king.”

Header image credit: LucasArts
