City of Beats, and using FMOD to create musical gameplay
City of Beats is a music-driven rogue-like shooter for PC and Nintendo Switch, developed by programmer and designer Kai Hillenbrand and composer/audio designer Nicholas Singer. It was published by Freedom Games and released in May 2023, and was nominated for Best Audio for an Indie Game, Creative and Technical Achievement in Music, and Creative and Technical Achievement in Sound Design at the Game Audio Network Guild Awards 2024.
This is a quick summary of how we used FMOD to synchronise music and gameplay, and to set up simple MIDI-style instruments within the engine, mostly built around the destination marker system.
FMOD is commonly used as a tool for creating music that reacts to gameplay, but it can be equally powerful the other way round, allowing the game engine to be triggered by events within the music. Destination markers are normally used to control the position of the playhead within an FMOD event, but by simply placing them to align with certain notes, beats or other musical events on the timeline, and getting Unreal or Unity to listen out for those markers as callbacks, we can create synchronicity that goes well beyond the “OnBar” or “OnBeat” functions.
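To give a rough idea of the mechanism, here is a minimal sketch using the raw FMOD Studio C++ API (the Unreal and Unity integrations wrap this same callback up for you as the “On Timeline Marker” event); the bank filenames and the event path are placeholders, not the actual City of Beats project.

```cpp
// Minimal sketch: receive destination-marker callbacks from a music event.
#include <fmod_studio.hpp>
#include <cstdio>

FMOD_RESULT F_CALLBACK MusicCallback(FMOD_STUDIO_EVENT_CALLBACK_TYPE type,
                                     FMOD_STUDIO_EVENTINSTANCE* /*event*/,
                                     void* parameters)
{
    if (type == FMOD_STUDIO_EVENT_CALLBACK_TIMELINE_MARKER)
    {
        auto* marker = static_cast<FMOD_STUDIO_TIMELINE_MARKER_PROPERTIES*>(parameters);
        // marker->name is the destination marker's label, marker->position is in ms.
        // A real game would queue this up and react to it on the game thread.
        std::printf("Hit marker '%s' at %d ms\n", marker->name, marker->position);
    }
    return FMOD_OK;
}

int main()
{
    FMOD::Studio::System* system = nullptr;
    FMOD::Studio::System::create(&system);
    system->initialize(512, FMOD_STUDIO_INIT_NORMAL, FMOD_INIT_NORMAL, nullptr);

    FMOD::Studio::Bank* master  = nullptr;
    FMOD::Studio::Bank* strings = nullptr;
    system->loadBankFile("Master.bank", FMOD_STUDIO_LOAD_BANK_NORMAL, &master);
    system->loadBankFile("Master.strings.bank", FMOD_STUDIO_LOAD_BANK_NORMAL, &strings);

    // Placeholder event path; the music event contains the destination markers.
    FMOD::Studio::EventDescription* musicDesc = nullptr;
    system->getEvent("event:/Music/Level01", &musicDesc);

    FMOD::Studio::EventInstance* music = nullptr;
    musicDesc->createInstance(&music);
    music->setCallback(MusicCallback, FMOD_STUDIO_EVENT_CALLBACK_TIMELINE_MARKER);
    music->start();

    for (;;)
    {
        system->update();   // pump the Studio system; markers arrive as callbacks
    }
}
```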
There are many other ways to approach this, including audio-reactive tools that analyse spectrum data, but this method gives us more control, accuracy and flexibility, and is easily achieved within FMOD.
Driven by music
We started by assigning “A1” markers to every bass drum hit in the music, and “B1” to every snare. As long as those markers are passed through to the engine (in our case Unreal Engine), we can use the “On Timeline Marker” function to set up a script or blueprint that listens out for those particular markers and reacts accordingly. In our case, we wanted each bass drum hit to trigger an enemy movement, and each snare hit to trigger their weapons. That immediately made those enemies feel integrated into the music. We also linked those markers to lights and other elements in the environment to increase that immersion.
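In blueprint terms this is just a branch on the marker name. A plain C++ equivalent might look like the sketch below, where the two enemy functions are hypothetical stand-ins for whatever your game exposes.

```cpp
#include <string>

// Stand-ins for game-side systems; hypothetical names for illustration only.
void MoveEnemiesOneStep();
void FireEnemyWeapons();

// Called on the game thread with the name of the most recent timeline marker,
// e.g. forwarded from the marker callback shown earlier.
void HandleMusicMarker(const std::string& markerName)
{
    if (markerName == "A1")        // bass drum hit
    {
        MoveEnemiesOneStep();
    }
    else if (markerName == "B1")   // snare hit
    {
        FireEnemyWeapons();
        // The same hook can also drive lights and other environment elements.
    }
}
```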
Hand-placing markers throughout the soundtrack can be laborious, but copy/paste and the visual, flexible nature of FMOD Studio’s UI keep it manageable. To speed things up further, Kai wrote some scripts for FMOD Studio, including one to batch-rename markers.
Some enemies have attacks that are more interesting than simple projectiles, such as the bombardiers, which fire an arcing mortar; for these we used a simple layer in the soundtrack that fades in whenever the enemy is present. Others have a pre-set sequence of movements triggered by a single marker.
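The fade itself lives in FMOD Studio, with the layer’s volume automated on a game parameter; all the engine does is nudge that parameter when the enemy spawns or dies. Something along these lines, with the parameter name as a placeholder:

```cpp
#include <fmod_studio.hpp>

// Fade the bombardier layer in or out by driving a game parameter.
// "BombardierPresent" is a placeholder name; in FMOD Studio its value would be
// automated onto the volume of the extra music layer.
void SetBombardierLayer(FMOD::Studio::EventInstance* music, bool enemyPresent)
{
    music->setParameterByName("BombardierPresent", enemyPresent ? 1.0f : 0.0f);
}
```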
Everything in harmony
Taking things a bit further, we started using markers to trigger other events within FMOD. I added a layer of “harmonic” markers throughout each music track, assigning numbers to the different chords and key areas within the music. I then made a set of events with discrete parameters corresponding to those different harmonies; for instance, one event provided the enemy explosion sounds, which were satisfying chords in various keys. I placed those different chords into different parameter slots, and we could then create scripts within Unreal to read the most recent harmony marker and set the harmony parameter to match.
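In practice that boils down to the marker callback storing the most recent harmony number, and every spawned sound effect copying it into its own parameter before it starts. Here is a rough sketch; the “H” marker prefix and the “Harmony” parameter name are stand-ins for whatever naming convention you settle on.

```cpp
#include <fmod_studio.hpp>
#include <atomic>
#include <cstdlib>

// Most recent harmony number, written from the music event's marker callback.
std::atomic<int> gCurrentHarmony{0};

// Call this from the timeline-marker callback for markers such as "H1".."H7"
// (placeholder naming).
void OnHarmonyMarker(const char* markerName)
{
    if (markerName[0] == 'H')
    {
        gCurrentHarmony = std::atoi(markerName + 1);
    }
}

// Spawn an explosion whose chord matches whatever the music is doing right now.
// "Harmony" is the discrete parameter that selects between the chord slots.
void PlayExplosion(FMOD::Studio::EventDescription* explosionDesc)
{
    FMOD::Studio::EventInstance* instance = nullptr;
    explosionDesc->createInstance(&instance);
    instance->setParameterByName("Harmony", static_cast<float>(gCurrentHarmony.load()));
    instance->start();
    instance->release();   // the instance frees itself once playback finishes
}
```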
We used that technique for most sound effects in the game, which helped to create a sense of cohesion and immersion, and contributed to the game’s reward mechanics. Some were chords, as in the example above, while others were collections of single notes played randomly from within a harmonic group, e.g. the health pickups.
Musical Weapons
The player’s weapons act a bit like instruments, each projectile contributing a note to a series of melodies synced to the background music. Initially we set these up as layers in the music that faded up when the trigger was held, but that approach gives little control over the tail end of each note, and it never sounded very satisfying, no matter how quickly you fade in and out or how much in-engine reverb you apply.
Our solution was to create a ‘library’ of notes in its own event, much like the harmonies above, giving each degree of the scale its own discrete parameter slot and creating a parameter which selects between them. I wrote a series of melodies and translated those note values into numbers, which were placed as markers along the music timeline, just like the harmony markers.
So, when the trigger is held, a script/blueprint in Unreal checks what the last repeater marker was. The “harmony” parameter is then set to that number, so FMOD will trigger the note contained in that slot. Simple!
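The note-triggering side mirrors the harmony setup above. Here is a sketch of the firing function, assuming the scale degree has already been read from the most recent melody marker; the event and parameter names are placeholders.

```cpp
#include <fmod_studio.hpp>

// Fired while the trigger is held. 'scaleDegree' is the number read from the
// most recent weapon-melody marker, tracked the same way as the harmony
// markers above; "Harmony" is the discrete parameter that selects between the
// note slots in the library event.
void FireWeaponNote(FMOD::Studio::EventDescription* noteLibrary, int scaleDegree)
{
    FMOD::Studio::EventInstance* note = nullptr;
    noteLibrary->createInstance(&note);
    note->setParameterByName("Harmony", static_cast<float>(scaleDegree));
    note->start();
    note->release();   // cleaned up automatically once the note has played out
}
```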
Tidying up
At this point we had a couple of thousand destination markers within FMOD and started to notice some performance hits as a result, which is not ideal in a game that relies on perfect audio synchrony, so we decided to offload most of that data into a JSON file. Any sequence of markers that occurred in a regular, steady rhythm, e.g. the player weapon melody markers and the harmonic markers, could be read as a simple string of numbers, whereas more complicated rhythms, e.g. the drum hits driving enemy movement and attack patterns, remained within the FMOD project. This, along with the great compression options available, enabled us to run the game smoothly and in sync on all platforms, including Nintendo Switch.
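As a rough illustration of the offloaded data (not the actual City of Beats format), a steady sequence only needs a start offset, a step length and the list of numbers, and the current entry can then be derived from the music event’s timeline position instead of from a marker callback:

```cpp
#include <fmod_studio.hpp>
#include <vector>

// A regular marker sequence reduced to data, e.g. loaded from a JSON file like:
//   { "startMs": 0, "stepMs": 500, "values": [1, 5, 3, 4, 1, 5, 6, 4] }
// (made-up values; 500 ms per step corresponds to one beat at 120 BPM).
struct MarkerSequence
{
    int startMs = 0;               // timeline position of the first entry
    int stepMs  = 500;             // spacing between entries
    std::vector<int> values;       // harmony or melody numbers, looping
};

// Replacement for the marker callback: work out which entry is "current"
// from the music event's timeline position.
int CurrentValue(const MarkerSequence& seq, FMOD::Studio::EventInstance* music)
{
    if (seq.values.empty())
        return 0;

    int positionMs = 0;
    music->getTimelinePosition(&positionMs);

    if (positionMs <= seq.startMs)
        return seq.values.front();

    const size_t index = static_cast<size_t>((positionMs - seq.startMs) / seq.stepMs);
    return seq.values[index % seq.values.size()];
}
```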
In Summary
FMOD allowed us to experiment and to implement all sorts of adaptive audio techniques, from music that fades layers in and out depending on the player’s proximity to different areas, to the use of alternative stems and randomisation to generate variety and longevity in the soundtrack. But using the destination marker system to trigger events really opened up the possibilities. It’s an extremely powerful and flexible function that can be applied to almost any scenario or genre, whether it’s used up front as a rhythm-action component, or in the background as a more subtle tool to enhance gameplay through synchronisation.