
From Linear to Adaptive - How a Film Composer Fell in Love with Adaptive Music



From Designing Music Now

Introduction


At the Game Developers Conference (GDC) in San Francisco, I was introduced to Elias, an adaptive music tool for game composers. They showed me its newly introduced MIDI capabilities, and I began to think about how it compared to other adaptive music systems I had learned, such as FMOD and Wwise. I was surprised to see a piece of software made 100% for composers and had to check it out. I’m a classically trained film composer who recently entered the game industry. The biggest struggle for a composer, whether he or she writes for film or games, is to figure out moments and moods and how best to portray them in music. Game composers have had to improvise and find ways to keep their tracks from sounding boring and repetitive. Middleware such as FMOD and Wwise was a great start down this path, but Elias seems to me to be the next evolution of adaptive game music. In this first article in a series on adaptive music, I will explore Elias and share my first impressions of it.


 


A Little Background


It was not until earlier this year that I was hired to score a video game: the zombie hyper-reality escape room “Deadwood Mansion” by Glostation. I had a blast composing its three cues of about one minute each, with three layers and eight stingers. It was an eerie and ominous soundtrack built on pads and sound design. I then started thinking about how different composing for film and games actually was.


Deadwood Mansion


Music has been my life since I was seven: a Brazilian boy who loved music and sang, studied to become an orchestra conductor and ended up earning his Bachelor’s degree in Music and Sound for Visual Media in San Francisco, California. It was by meeting students from multiple departments at the university that I had the opportunity to work on projects such as “Scaredy Bat” by Greg Perkins, to score Cannes and Tribeca short films like “Curpigeon” by Dmitry Milkin, and to score my first over-twenty-minute, visually stunning short film, “The Colors of Hope and Wonder” by Juan Diego Escobar Alzate.



I also worked at Strawberry Hill Music, a studio that gave me a lot of experience scoring for linear media, including orchestrating and working with composer Raj Ramayya on multiple projects, such as the Canadian feature “Chokeslam,” and doing voice-over design for the mobile tower defense game “Realm Defense.”



 

Film vs. Game Scoring Struggles


Film composers struggle with a few things. The first struggle is finding the right instrumentation and genre to fit the mood and characters of the movie. You know, if John Williams had scored the Imperial March using a glockenspiel and a flute, Darth Vader would probably not be so scary. The second struggle is how to portray the director’s ideas, presented both directly and indirectly. While some directors are really good at telling the composer what they want to hear in a specific scene or theme, others speak in abstract ways, like “Can you make this scene a little more blue and with a taste of sugar?” The third struggle is placing the cues on important moments, growing, reaching an accent and coming back down again. It is probably the hardest of them all, because the composer must read between the lines and get a sense of the full arc; it is like telling the same story through music and adding it to the visuals. And lastly, the final struggle comes in the form of self-criticism; we never think the music is good enough. Of course, every artist struggles with this one; it is part of who we are, and without it nothing would push us to create better and better art.



Being first and foremost a film composer, I know those struggles all too well. However, when I composed the music for the video game “Deadwood Mansion,” I found that the third struggle is a little different. While the film composer has to look at a scene and think of an arc that has a beginning, middle and end, the game composer has to think about ALL possibilities at once. When you think of Jack Wall’s score for Mass Effect 2, you realize he had to think of moods for scenes, such as romance, action and excitement, but he also had to think about layers and how to make the music increase and decrease in intensity. Cutscenes are more like films, but an in-game cue requires the composer to think about how the score will react to whatever the player might do and whatever the character might be thinking. Whether it was by using pulsing synths as a solid first layer, adding pads as a second layer and solo instruments as a third, Jack still had to think about the implementation: when those layers would come in and how they would fade in and out. This is a struggle that I am sure all game composers go through, and it is possibly the hardest of all four.
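Just to make the layering idea concrete, here is a minimal sketch, in Python, of how a single intensity value can drive the gains of stacked layers. This is not Jack Wall’s actual setup or any engine’s API; the layer names, thresholds and fade width are my own assumptions.

```python
# A minimal sketch (not from any real game or engine) of vertical layering:
# one intensity value drives the gain of each stem, so layers fade in and
# out instead of hard-switching. Layer names and thresholds are assumptions.

LAYERS = [
    ("pulsing_synths",  0.0),  # base layer, always audible
    ("pads",            0.4),  # fades in as intensity passes 0.4
    ("solo_instrument", 0.7),  # fades in near the top of the range
]

FADE_WIDTH = 0.2  # intensity span needed to go from silent to full volume

def layer_gains(intensity):
    """Map a 0.0-1.0 intensity value to a per-layer gain between 0.0 and 1.0."""
    gains = {}
    for name, threshold in LAYERS:
        # Linear fade: silent below the threshold, full once past threshold + FADE_WIDTH.
        gain = (intensity - threshold) / FADE_WIDTH
        gains[name] = max(0.0, min(1.0, gain))
    return gains

print(layer_gains(0.2))  # only the base layer
print(layer_gains(0.5))  # base layer plus pads fading in
print(layer_gains(0.9))  # every layer at full volume
```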


 

The Perfect Loop


Another important part of being a game composer is creating the perfect loop, which is not always the easiest task. Sometimes a loop’s tail simply does not fit the loop’s head, even though you might be using the same instrument, the same time signature and the same notes; sometimes you simply feel like writing a cue in 17/8. And what I figured out is that synthesizers, mostly pads and drones, can be your best friends and worst enemies at this stage.

When scoring the zombie game, I had a few pads I had created with long tails, and other pulsing synths that would keep going after the loop was over. These were my worst enemies, because the moment I cut the loop, they would not blend with the beginning of the segment. Another thing I realized was how much of the issue came from having more than one track bounced together: if some of those loops could cut before the end of the segment and others after it, they would blend perfectly. But because I was using FMOD, I had to blend instruments into fewer layers and create perfect loops of exactly the same length, or else they would not fit. Stingers were my saviors, as always.

I ended up figuring out how to cut some synth tails and match their sound to the beginning of the loops. Elizabeth Hannan explains this in a simple way in her article “Creating Seamless Loops,” comparing a loop to a color wheel, using colors as an analogy for the position of sound waves, and showing that to create a perfect, seamless loop, “the end of the segment needs to perfectly match up with the beginning of the segment.”
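To show what matching the end to the beginning can look like in practice, here is a minimal offline sketch in Python, assuming a mono loop bounce, the third-party soundfile library and a hypothetical file name. It folds the ringing tail back over the head of the loop with an equal-power crossfade so the seam is less audible.

```python
# A minimal sketch, assuming a mono loop loaded as a NumPy array: instead of
# hard-cutting a synth tail, fold the last part of the loop back over the
# first part with an equal-power crossfade. File names are hypothetical.

import numpy as np
import soundfile as sf  # pip install soundfile

audio, sample_rate = sf.read("pad_loop.wav")   # hypothetical loop bounce
fade_len = int(0.5 * sample_rate)              # 500 ms overlap region

tail = audio[-fade_len:]                       # the ringing synth tail
body = audio[:-fade_len]                       # the loop without its tail

# Equal-power curves keep the combined level roughly constant across the seam.
t = np.linspace(0.0, 1.0, fade_len)
fade_out = np.cos(t * np.pi / 2)
fade_in = np.sin(t * np.pi / 2)

# Mix the fading tail into the (faded-in) head of the loop.
body[:fade_len] = body[:fade_len] * fade_in + tail * fade_out

sf.write("pad_loop_seamless.wav", body, sample_rate)
```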

But there was still one thing that bothered me. If I could keep two of my synths on separate tracks and let them cut at different moments, the loop would blend perfectly without me having to work so hard at fitting sound waves together. Somehow, though, there was no way of implementing that in a game that I could think of (maybe through some champion scripting skills); not until I found out about Elias.


 

A New Way of Thinking “Adaptive” with Elias Studio 3


Imagine being alone on an island. You’re given a small hut, a coconut tree and a small stone blade. For days, all you are able to do is eat coconuts and drink their water. Then, all of a sudden, a book falls from the sky, telling you how to build a house out of clay, how to start a fire with rocks and how to build a fishing spear and hunting traps. It might not be the life you’re used to, with a house, food at the grocery store and a gas fireplace that lights up at the touch of a button, but with that book you can improve your life and probably survive longer. Then, a few days later, a second book falls from the sky, telling you how to improve your house, grow your own vegetables and build fishnets that will catch you more fish than a simple spear. If these books keep falling, maybe you’ll get your perfect life back.

I see the beginning of that analogy as how composers developed music for games through the ’80s: with MIDI instruments, implementing it directly into the game, with few tools to help, having to improvise constantly and not always succeeding. They certainly did succeed with game soundtracks in the ’90s, such as “Zelda: Ocarina of Time,” “Pokemon Red and Blue” and “Castlevania: Symphony of the Night.” However, when middleware such as FMOD and Wwise came along, it was as if our first book fell from the sky and gave game composers tools that made the workflow easier and the process less traumatic.


Now the second book has fallen from the sky, and it is called Elias Studio 3.



The interface of the new version, Elias Studio 3.

Looking through reviews and tutorials of its predecessors was enough to get me interested. I decided to give it a shot about a month ago, and I can say for sure that I am in love with it. Not only does it allow you to create more layers, on as many tracks as you want (woodwinds, brass, piano, ukulele, etc.), it also lets you handle implementation by choosing the right stingers, adding multiple versions of a layer to create more variety, reusing parts of a loop and, even more important, it brings MIDI BACK!



The agility settings are a way of choosing when a layer/level change takes effect on a track: per bar or even on custom beats. So, if you want your piano to start only on Bar 3, Beat 2, then when you change the parameter to a new layer/level, the piano will start only on Bar 3, Beat 2.
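For anyone curious, the arithmetic behind that kind of agility setting looks roughly like the sketch below. This is not Elias’ internal code, just my own illustration, with an assumed tempo, time signature and loop length.

```python
# A minimal sketch of the math behind "only change on bar 3, beat 2":
# given a tempo and time signature, find the next musically legal moment
# (in seconds) to switch a layer. Tempo, meter and loop length are assumptions.

BPM = 120
BEATS_PER_BAR = 4          # 4/4 time
SECONDS_PER_BEAT = 60.0 / BPM

def next_switch_time(now_seconds, target_bar, target_beat, loop_bars):
    """Next time (in seconds) the loop reaches the target bar/beat (1-based)."""
    loop_len = loop_bars * BEATS_PER_BAR * SECONDS_PER_BEAT
    # Offset of the target position inside one pass of the loop.
    offset = ((target_bar - 1) * BEATS_PER_BAR + (target_beat - 1)) * SECONDS_PER_BEAT
    # How many full loop passes have already gone by.
    passes = int(now_seconds // loop_len)
    candidate = passes * loop_len + offset
    return candidate if candidate > now_seconds else candidate + loop_len

# If the parameter changes 5.3 s into an 8-bar loop, the piano waits until
# the loop next reaches bar 3, beat 2 (here: 20.5 s, on the following pass).
print(next_switch_time(5.3, target_bar=3, target_beat=2, loop_bars=8))
```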


As for reusing parts of a loop, I think it is very useful for saving space. If you have a ten-bar bass drum loop that repeats every two bars, you can simply bounce the two bars and loop them inside Elias. The same trick works for other tracks and levels as well, which saves a lot of the space audio files can take.
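A quick back-of-the-envelope sketch shows how much that saves per track; the tempo and audio format below are my own assumptions, not numbers from a real project.

```python
# A rough sketch of the space saved by looping a two-bar segment inside the
# middleware instead of bouncing all ten bars. Tempo and format are assumptions.

BPM = 120
BEATS_PER_BAR = 4
SAMPLE_RATE = 44100        # CD quality
BYTES_PER_SAMPLE = 2       # 16-bit
CHANNELS = 2               # stereo

def wav_size_bytes(bars):
    seconds = bars * BEATS_PER_BAR * 60.0 / BPM
    return seconds * SAMPLE_RATE * BYTES_PER_SAMPLE * CHANNELS

print(f"10-bar bounce: {wav_size_bytes(10) / 1e6:.1f} MB")  # ~3.5 MB
print(f"2-bar loop:    {wav_size_bytes(2) / 1e6:.1f} MB")   # ~0.7 MB
```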


Now, as for the MIDI part, I know most composers would think, “Oh, unless I want that 8-bit sound or a Vangelis-like synthesizer, I won’t need MIDI in my track, thank you.” But imagine being able to add thirty tracks and ten layers/levels, use high-quality samples and still keep the data footprint low! With audio, even using compressed files, the cue can get quite big if you add too many layers and tracks; with MIDI, the files are tiny and you can rely on the Elias sample libraries, so after composing thirty cues for a game, a soundtrack of 3 GB of audio might turn into less than 10 MB of MIDI! Of course you have to count the space the sample libraries take up, but it will still be less than a complete audio soundtrack.
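Here is a rough sketch of why the MIDI route scales so well; the cue count, stem count and note density below are assumptions for illustration only, not measurements from a real project.

```python
# A rough order-of-magnitude comparison of audio stems vs. MIDI files for a
# full game soundtrack. All counts here are illustrative assumptions.

CUES = 30
CUE_SECONDS = 120          # roughly two-minute cues
STEMS = 10                 # layers/tracks per cue

# Audio: uncompressed stereo 16-bit / 44.1 kHz stems.
BYTES_PER_SECOND = 44100 * 2 * 2
audio_bytes = CUES * CUE_SECONDS * STEMS * BYTES_PER_SECOND

# MIDI: a note-on/note-off pair is on the order of 8 bytes including delta times.
NOTES_PER_SECOND_PER_STEM = 2
midi_bytes = CUES * CUE_SECONDS * STEMS * NOTES_PER_SECOND_PER_STEM * 8

print(f"Audio stems: ~{audio_bytes / 1e9:.1f} GB")   # several gigabytes
print(f"MIDI files:  ~{midi_bytes / 1e6:.2f} MB")    # well under a megabyte
```

The sample libraries still have to live somewhere, of course, but they are shared across every cue instead of being baked into each one.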


From my DAW to Elias and how I saw the magic happen.

When I decided to experiment with Elias, I wrote a 1:51 cue called “Journey Through the Edge of the World” and then chopped it into layers/levels and tracks to import into Elias.

It took me a little while and a few tutorials to understand how the software worked, but I got the hang of it, and below you can check out how the cue sounded before in my DAW (I’m using Logic Pro X) and how it sounded in Elias as I moved parameters and added stingers.





See the difference? I know you might be thinking you can do all of that with FMOD or Wwise, but you cannot. Not with this many tracks, not with the agility settings, and not without going into the scripting board. Elias was not made to replace your current middleware, but to help you get your tracks into it in a much better and more organic way, without having to repeat the same thing a hundred times and bore the player.


 

The MIDI setup


I did have some issues setting up MIDI to work with Elias, but after checking how they did it in their demo session, I figured it out and thought it would be a good idea to share. Below is my process for getting MIDI to play in Elias, since I know other composers might have a hard time figuring it out like I did.



Please note: you can create MIDI in whatever DAW you use; however, if you want to preview how it will sound using Elias’ sample libraries, you will need an .sfz instrument player. The best choice is to download the Elias Sampler for free when you get the package and add it to your DAW. Then you can compose using their instruments, and when you import the .mid files into Elias, they will play just as they did in your DAW.

First, select “Generators” in the bottom menu. Once you’re there, you can add new generators or choose from the ones in your session. In this case I used the Nylon Guitar (Renaxxance).



In the bottom-left menu, you can find the patches to add. You can add one patch, or all the patches for a specific library.



You can select a specific MIDI channel for each patch, or simply select Omni and mix them all into one sound. If you do that, I would advise mixing the volumes properly. Then go back to your “Loop Tracks” section and select the track to add your MIDI generator to.
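If the channel idea is confusing, the sketch below (using the third-party mido library, nothing from Elias, and a hypothetical file name) counts the notes on each channel of a .mid file. A patch set to a specific channel only responds to that channel’s notes, while a patch set to Omni responds to all of them at once.

```python
# A minimal sketch of what per-channel routing means: the same .mid file
# carries several channels, and each patch listens to only one of them.
# The file name is hypothetical.

import mido  # pip install mido

midi_file = mido.MidiFile("journey_loop.mid")

notes_per_channel = {}
for track in midi_file.tracks:
    for msg in track:
        if msg.type == "note_on" and msg.velocity > 0:
            notes_per_channel[msg.channel] = notes_per_channel.get(msg.channel, 0) + 1

# A patch set to channel 2 would only play the notes counted under channel 2;
# a patch set to Omni would respond to every channel below.
for channel, count in sorted(notes_per_channel.items()):
    print(f"channel {channel + 1}: {count} notes")
```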



And in the bottom-left menu again, you will find the details, such as agility, fades, levels, progression and, most importantly, the Generator and the MIDI channel you selected.



Because the product is still in beta, Elias is constantly updating and fixing bugs, but if you want to get ahead of the game and learn the software before everyone else, I would strongly encourage you to. Right now they offer a 90-day trial for whoever signs up for their subscription package.


 

Conclusion


To sum up, I’m a classically trained film composer who recently entered the game industry and, like every composer, have had and still have my struggles. The biggest struggle for a composer, whether they write for film or games, is to figure out moments and moods and how to portray them. Game composers have had to improvise and find ways to keep their tracks from sounding boring and repetitive, and middleware such as FMOD and Wwise were like gifts from the Programming-God, making their workflow easier and more adaptive. However, after trying Elias for about a month and getting to know the direction it is heading, I encourage ALL game composers to try it out. Elias is not really a competitor to other middleware solutions, but more like a middleware companion. Its features are unique and open an even bigger range of possibilities for us to make our tracks more diverse and interesting.

I would not be surprised if film composers try Elias and like it so much that they figure out ways of using it for films and linear media as well. Who knows? There are so many variables…


 
