How to make the most of Reaper as a Sound Design tool – Part 1: Getting Started

I have recently contributed a two-part article series to the A Sound Effect blog on how to make the most of Reaper as a sound design tool.

The first article looks into getting started using Reaper and the initial set up. You can find it here.

The second will be up in about a week’s time and will cover more of the workflow and some good habits to be taken from the start. Keep an eye out!

🙂

Loudness and metering in game audio

This post is not a tutorial on loudness and metering in game audio. It is rather about sharing my findings on something I am currently researching, in the hope that it can help those of you who are in a similar position to me. I will definitely revisit this post at a later stage of my current project to share my experiences and conclusions.

Since this is a work in progress, or rather a learning in progress, feel free to comment and let me know about any better/other ways to see or do these things.

I’ve been working on my current project for a few months now and, although I’ve been wondering about loudness and metering earlier in the process, the time has only recently come for me to make decisions on the matter, and hence look deeper into it.

First, I found this amazing resource which helped me understand more about all of it very quickly. This article from Stephen Schappler is a real gem and I strongly recommend you have a read. I will mention some of the things he shared in his article here, as well as develop according to my own experience.

This interview with Gary Taylor from Sony is equally very instructive, going into further detail about the recommended specs of Sony’s Audio Standards Working Group (ASWG).

 

Industry standards (or lack thereof) and game audio solutions

There are currently no standards set for loudness measurements in game audio, resulting in wide variations and discrepancies in loudness from one game to another. The differences in gaming set ups and devices also present a challenge in terms of developing those standards.

One way to start looking into this is to refer to the BS.1770 recommendations to measure loudness and true peak audio level.

To put it simply, these algorithms measure Loudness Level at three different time scales:

  • Integrated (I) – Full program length
  • Short Term (S) – 3 second window
  • Momentary (M) – 0.4 second window

What these mean for game audio will probably differ from what they mean in TV: there is no fixed program length in interactive media, and 3- and 0.4-second windows may prove too short to take any accurate measurement, again owing to the dynamic and interactive nature of the medium.
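To make the three time scales concrete, here is a deliberately simplified Python sketch of the ‘M’ and ‘S’ measurements. It just takes the mean-square level of a trailing window, skipping the K-weighting filter and gating that real BS.1770 meters apply, so treat it as an illustration of the windowing only, not a compliant meter.

```python
import math

def window_loudness_db(samples, sample_rate, window_s):
    """Mean-square level of the last `window_s` seconds, in dB.
    Simplified: real BS.1770 applies K-weighting and gating first."""
    n = int(sample_rate * window_s)
    tail = samples[-n:]
    mean_square = sum(s * s for s in tail) / len(tail)
    return -math.inf if mean_square == 0 else 10 * math.log10(mean_square)

# A full-scale sine has a mean-square of 0.5, i.e. about -3 dB.
sine = [math.sin(2 * math.pi * 440 * t / 48000) for t in range(48000)]
momentary = window_loudness_db(sine, 48000, 0.4)   # the 0.4 s "M" window
short_term = window_loudness_db(sine, 48000, 3.0)  # the 3 s "S" window (clamped to signal length)
```

An Integrated (‘I’) measurement would then gate and average these windowed values over the whole session, which is exactly why Gary Taylor’s 30-minute minimum matters: the longer the representative capture, the more meaningful the integrated figure.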

This is what Gary Taylor recommended about adapting the BS.1770 measuring terms to game audio (in this interview):

We recommend that teams measure their titles for a minimum of 30 minutes, with no maximum, and that the parts of any titles measured should be a representative cross-section of all different parts of the title, in terms of gameplay.

As BS.1770 also indicates, it would be wise to consider the Loudness Range (LRA) and the True Peak Level. In order to do so, you would need good tools (accurate Loudness Meter) and a good environment (calibrated and controlled).

In terms of numbers, let’s look at the R128 and A/85 broadcast recommendations, which we could assume present a similar target when working on console and PC games, where your environment and set up would be the same as, or similar to, your TV set up.

Those recommendations are:

R128 (Europe)

  • Program level average: -23 LUFS (+/-1)
  • True peak maximum: -1 dBTP

A/85 (US)

  • Program level average: -24 LKFS (+/-2)
  • True peak maximum: -2 dBTP

 

However, these numbers may not apply to the mobile games industry, and different terms would need to be discussed in order to set standard levels for portable devices. Some work has already been done on that matter by Sony’s ASWG, who are among the first (if not the first) to consider standardising the game audio loudness metering process and providing recommendations. Here are their internal loudness recommendations for their 1st party titles:

Sony ASWG-R001

  • Average loudness for console titles: -23 LUFS (+/-2)
  • Average loudness for portable titles: -18 LUFS
  • True peak maximum: -1 dBTP
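In practice, checking a measurement against any of these specs boils down to a target-plus-tolerance comparison. A small hypothetical helper (the example values plug in the Sony ASWG-R001 console numbers above):

```python
# Hypothetical helper: check a measured (integrated LUFS, true peak dBTP)
# pair against a loudness spec with a symmetric tolerance.
def meets_spec(integrated_lufs, true_peak_dbtp,
               target_lufs, tolerance, max_true_peak_dbtp):
    loudness_ok = abs(integrated_lufs - target_lufs) <= tolerance
    peak_ok = true_peak_dbtp <= max_true_peak_dbtp
    return loudness_ok and peak_ok

# Sony ASWG-R001, console: -23 LUFS (+/-2), true peak maximum -1 dBTP
print(meets_spec(-22.4, -1.3, target_lufs=-23, tolerance=2, max_true_peak_dbtp=-1))  # True
print(meets_spec(-19.0, -0.5, target_lufs=-23, tolerance=2, max_true_peak_dbtp=-1))  # False
```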

Gary Taylor mentioned in his interview that studios such as Media Molecule and Rockstar are already conforming to Sony’s specs, both in terms of average loudness and dynamic range. This seems to indicate that progress is slowly but surely being made in terms of game audio loudness standardisation.

How to proceed?

The recommended process is to send the audio out from your game directly into your DAW and measure loudness with a specialised plugin. Be careful to make sure your outputs and inputs are calibrated and that the signal remains 1:1 across the chain.

Gary Taylor’s plugin recommendations to measure loudness:

As far as analysis tools, I personally have yet to find anything close to the Flux Pure Analyzer application for measuring loudness, spectral analysis, true peak, dynamic range and other visualisation tools. As far as loudness metering generally, Dolby Media Meter 2, Nugen VizLM, Waves WLM, and Steinberg SLM-128 (free to Nuendo and Cubase users) are all very good.

I have yet to experiment with those plugins and decide on my favorite tools. I happen to have the Waves WLM so will give that a try first, and plan to compare with the demo version of Nugen VizLM and see if I want to buy. I will update this article with feedback from my experience when ready.

Wwise and FMOD now also support BS.1770 metering, which is extremely convenient for metering directly within the audio engine.

In Fabric, there are Volume Meter and Loudness Meter Components which allow you to meter one specific Group Component. You could for instance apply those to a Master Group Component to monitor signals of the overall game.

[Image: Loudness Meter component in Fabric]

 

However, despite these tools being available within the audio engines, I think it is worth measuring the direct output of your game from your DAW with the help of a mastering plugin. I see this as a way to ‘double-check’; I’m a big fan of making sure everything works as it is meant to, and listening to the absolute final result of the product seems like a valid way to do this.

Finally, I unfortunately don’t have the luxury of working in a fully calibrated and controlled studio environment. If you are in a similar position, I’d strongly recommend considering renting a studio space towards the final stages of the game’s production to perform some more in-depth mixing and metering.

I hope this was useful even though this info is based mostly on research rather than pure experience. I will most definitely revisit this topic once my remaining questions are answered 🙂

 

Additional documentation:

 

Audio processing using MaxMSP

If you follow me on Twitter, you will have seen a few recent tweets about my latest experiments with sci-fi bleeps and bloops.

I created a MaxMSP patch that allows me to process sound files in such a way that the original file is nearly unidentifiable, and the results sound nicely tech-y and sci-fi.

My process was that, over time, I created a few simple individual patches, each performing one sort of processing:

  • Phaser+Delay
  • Time Stretcher
  • Granulator
  • Phaser+Phaseshift
  • Ring Modulator
  • Phasor+Pitch Shift

I decided to assemble those patches together in such a way that I could play with multiple parameters and multiple sounds at the same time.

In order to do so, I mapped the various values and parameters of my patch to a MIDI controller (KORG nanoKONTROL2), and selected a few sounds I know work well with the different items of the patch, to be chosen from a dropdown menu.

This is what the patch looks like:

[Image: the assembled patch]

All the different ‘instruments’ are contained in subpatches. They are all quite simple but create interestingly complex results when put together.
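As an illustration of just how simple these building blocks can be, ring modulation (one of the ‘instruments’ listed above) boils down to multiplying the input signal by a sine carrier. Here is a conceptual Python sketch of that idea, not the MaxMSP patch itself:

```python
import math

def ring_modulate(samples, carrier_hz, sample_rate=48000):
    """Multiply the input by a sine carrier: the classic metallic/robotic
    effect. A conceptual sketch of the ring modulator subpatch."""
    return [s * math.sin(2 * math.pi * carrier_hz * t / sample_rate)
            for t, s in enumerate(samples)]

# A 440 Hz tone ring-modulated at 30 Hz: the energy moves to
# sum and difference sidebands (410 Hz and 470 Hz).
tone = [math.sin(2 * math.pi * 440 * t / 48000) for t in range(4800)]
processed = ring_modulate(tone, 30)
```

Chaining a few of these simple processes, as the patch does, is what makes the source material so quickly unrecognisable.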

The subpatches:

[Image: the subpatches]

Organised nicely in Presentation Mode, I can interact with the different values with my midi controller:

[Image: the patch in Presentation Mode]

The mapping system:

[Image: the MIDI mapping system]

I can then record the result to a wav file on disk, which I am free to edit in Reaper afterwards, selecting the nice bits and making cool sound effects with these original sources.

Record to file:

[Image: recording to file]

This process can go on almost indefinitely, as I can feed the processed sound back into the patch and see what comes out of it.

Here is a little demo of the patch and its ‘instruments’:

 

And some bleeps and bloops I made using this patch:

 

You can visit the Experiments page to hear more tracks 🙂

 

 

Game Audio Asset Naming and Organisation

 

Whether you are working on audio for an indie or a AAA title, chances are you will have to deal with a large number of assets, which will need to be carefully named and organised.

A clear terminology, classification and organisation will prove crucial not only for yourself, to find your way around your own work, but also for your team members, whether they are part of the audio department or of the front end team helping you implement your sounds into the game.

I would like to share my way of keeping things neat and organised, in the hope that it will help the less experienced among you start off on the right foot. I don’t think there is only one way to do this though, and those of you with a bit of experience might have a system that already works, and that’s perfectly fine.

I will go over creating a Game Audio Design Document and dividing it into coherent categories and subcategories, using a terminology that makes sense, event naming practices, and folder organisation (for sound files and DAW sessions) on your computer/shared drive.

Game Audio Design Document

First, what is an Audio Design Document? In a few words, it is a massive list of all the sounds in your game. Organised according to the various sections of the game, it is where you list all the events by their name, assign priorities, update their design and implementation status, and note descriptions and comments.

The exact layout of the columns and category divisions may very well vary according to the project you are currently working on, but here is what I suggest.

COLUMN 1: Scene or Sequence in your game engine (very generic)

COLUMN 2: Category (for example the object type/space)

COLUMN 3: Subcategory (for example the action type/more specific space)

COLUMN 4: Event name (exactly as it will be used in the game engine)

COLUMN 5: Description (add more details about what this event refers to)

COLUMN 6: Notes (for instance does this sound loop?)

COLUMN 7: 3D positioning (which will affect the way the event is implemented in game)

COLUMNS 8-10: Priority, Design, Integration (to color code the status)

COLUMN 11: Comments (so that comments and reviews can be noted and shared)

It would look something like this:

[Image: example Audio Design Document]
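If you work outside Google Sheets, even a plain CSV can hold the same structure. A minimal sketch with hypothetical values (the column names simply mirror the layout described above):

```python
import csv, io

# Column layout mirroring the Audio Design Document described above.
FIELDS = ["scene", "category", "subcategory", "event", "description",
          "notes", "3d", "priority", "design", "integration", "comments"]

# One hypothetical row, using the naming convention from later in this post.
rows = [
    ["base", "innerbase", "overall", "base_innerbase_overall_ambience",
     "Inner base room tone", "loops", "no", "high", "done", "wip", ""],
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(FIELDS)
writer.writerows(rows)
print(buf.getvalue())
```

The point is not the file format but the discipline: one row per event, with status and ownership visible to everyone.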

 

This document is shared among the members of the audio team so that everyone can refer to it for the details of any asset. You could even have a ‘name’ or ‘who?’ column to indicate who is responsible for a specific asset if working in a large audio team.

It is also shared across the art team if the art director is your line manager, and across any member of the front end team involved in audio implementation.

This list may also not be the only ‘sheet’ of the Audio Design Document (if you are working in Google Sheets, or the equivalent in another tool). Other sheets could include one created especially for the music assets, another for bugs or requests to keep track of, another for an audio roadmap, and so on. Basically, it is a single document to which all team members can refer in order to keep up to date with the audio development process. You can equally add anything that has to do with design decisions, references, vision, etc.

While big companies may very well have their own system in place, I find this type of document especially useful when working in smaller companies where such a pipeline has not yet been established.

I’d like to point out as well that, when creating such a document, it is important to remain flexible throughout the development process, especially if you join the project at an early stage, when sections/names/terminology in the game are bound to change. Throughout those changes, it is important to update the doc regularly and stay organised, otherwise it can rapidly become quite chaotic.

Terminology

In terms of terminology, this is again something that can be done in many ways, but I’d say one of the most important things is that, once you’ve decided on a certain terminology, you stay consistent with it. And be careful to name the events in your audio engine exactly the way you named them in your design document. Otherwise you will very rapidly get confused between similarly named versions of the same event, and won’t know which one is the correct one to use.

What I like to do first is use no capital letters, all lowercase, so that it doesn’t get confusing if events need to be referred to in the code. Programmers don’t need to ask themselves where the capital letters were, which may seem like a small thing, but when there are 200+ events, it is appreciated.

Then there is the matter of the underscore ‘_’ versus the slash ‘/’. That may depend on the audio engine and/or game engine you are using. For instance, using Fabric in Unity, all my events are named with slashes for the simple reason that Unity then automatically divides them into categories and subcategories in all dropdown menus. This becomes very handy when dealing with multiple tools and hundreds of events.

Then the organisation of your audio design document would pretty much tell you how to name your event. For instance:

category_subcategory_description_number  (a number may not always be required)

base_innerbase_overall_ambience

character1_footsteps_grass_01

etc
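If you want to enforce the convention automatically, a small validation script can catch badly named events before they reach the game engine. Here is a sketch of the rule described above (all lowercase, underscore-separated parts, optional two-digit variation number); the exact pattern is an assumption you would adapt to your own terminology:

```python
import re

# All-lowercase, underscore-separated parts, optional two-digit
# variation number at the end -- matching the convention above.
EVENT_NAME = re.compile(r"^[a-z0-9]+(_[a-z0-9]+)+(_\d{2})?$")

def is_valid_event_name(name):
    return bool(EVENT_NAME.match(name))

print(is_valid_event_name("character1_footsteps_grass_01"))  # True
print(is_valid_event_name("Character1_Footsteps_Grass_01"))  # False: capitals
```

Running such a check over the Audio Design Document (or over the event list exported from the middleware) is a cheap way to keep the two in sync.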

If you dislike the long names you can find abbreviations, such as:

ch1_fs_gr_01

I personally find they can become quite confusing when sharing files, but if you do want to use those, simply remember to be clear on what they mean, for instance by writing their abbreviated and full name in the doc, and make sure that there is no confusion when multiple team members are working with those assets.

Folder organisation

Whether you are working as a one person department on your own machine or you are sharing a repository for all the audio assets, a clear way of organising these will be crucial. When working on a project of a certain scale (which doesn’t need to be huge), you will rapidly get dozens of GB of DAW sessions, video exports, and files of previous or current versions.

I suggest you separate your directories for your sound files, DAW sessions and other resources. Your sound files directory should be organised in the same way you organised your Audio Design Document. This way, it is easy to know exactly where to find sound(s) constituting specific events.

I also suggest that you have a different (yet similar) directory for previous versions. You may call it ‘PreviousVersions’ or something equivalent, with an identical hierarchy to the ‘current version’ one. This way, if you need to go back to an older version, you know exactly where to find it and can access it quickly. You can name those versions by number (keep the same terminology, and add a V01, V02 at the end).
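The ‘identical hierarchy’ idea is easy to automate. A hypothetical sketch that computes where an older version of a file should live, mirroring the current tree under PreviousVersions (the folder names here are assumptions for illustration, not a prescribed layout):

```python
from pathlib import Path

def archive_previous(sound_file: Path, root: Path, version: int) -> Path:
    """Compute where an older version of `sound_file` should live,
    mirroring the current hierarchy under PreviousVersions and
    adding a _V## suffix to the file name."""
    relative = sound_file.relative_to(root / "SoundFiles")
    stem = f"{sound_file.stem}_V{version:02d}{sound_file.suffix}"
    return root / "PreviousVersions" / relative.parent / stem

root = Path("GameAudio")
current = root / "SoundFiles" / "base" / "innerbase" / "base_innerbase_overall_ambience.wav"
print(archive_previous(current, root, 1))
# e.g. GameAudio/PreviousVersions/base/innerbase/base_innerbase_overall_ambience_V01.wav
```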

Finally, for your DAW sessions, you may decide to go for something a little different in terms of hierarchy, but I find that maintaining a similar order is very useful for staying organised and being able to quickly go back to sessions you may not have touched in a while.

I also highly recommend that you ‘Save As’ your sessions, both to back them up and any time you make changes to a version of a sound. First, corrupted sessions do happen, so you’ll be extremely happy, when one does, that you haven’t lost weeks of work. But also, if your manager prefers an earlier version of your sound, with some modification, you can easily go back to exactly that version and start again from there, while still keeping the latest version intact.

So, if my asset hierarchy and division in my Audio Design Document looks like the one in the image above, my folder hierarchy would look something like this:

 

And finally you can create a folder for video exports, for instance, and keep your video screen captures there, again organised in coherent folders. The principle remains the same for any other resources you may have.

I hope this was helpful, happy organisation 🙂

 

 

 

 

Getting started in Game Audio

This post is for those of you who are passionate about sound, and are wondering how to become a sound designer for videogames, where to start, how to enter the industry, what software and tools you need to know, who to talk to, etc.

I get these kinds of questions a lot, and although there is no magic recipe, no step by step instructions that will guarantee you a successful career in videogames, there are some things that are useful to know, and that can help you develop an attractive curriculum.

What equipment to use and/or start with?

No two sound designers will use the same equipment, but I can tell you a bit about the type of equipment you would need and the workflow. The info that I give here is pretty much minimal requirement. You can most certainly take this much further, but here is what I consider essential, in terms of hardware and software.

The hardware concerns equipment you need in order to record your own sounds:

  • microphone (and xlr cable) and/or portable recorder
  • audio interface
  • decent computer
  • good headphones

And the software, which you also need to record, but also to edit and mix:

  • a DAW (Digital Audio Workstation)

Then, when working on an actual game project, you’ll need to implement your sounds into the game. The type of software needed is called audio middleware; it communicates with the game engine and acts as a bridge between the audio integration and the game events. Some large companies use their own in-house audio middleware (and game engine), but I’m not going to get into this. On the market, whether your game is made with Unity, Unreal or any other game engine, there are a few options in terms of audio middleware, usually compatible with any of the game engines (although not always). Three of them are worth mentioning: Wwise, FMOD and Fabric.

In my opinion, the best one out there is by far Wwise (read the Audio Middleware Comparison post to understand why). If you are working on a commercial title you need to consider licenses, but they usually all have some sort of deal (if not free) for Indie titles, students, or simply to use on non-commercial projects. This middleware is what gives a lot of creative freedom in the interactive design and integration.

To know more about audio integration, a good way to get introduced to the logic behind it is to watch tutorials, such as the Wwise tutorials. The advanced ones can be overwhelming, but the overview ones will be very useful in getting a better understanding of audio integration and how to design sounds with that kind of logic in mind.

You also need sound libraries. They are part of the workflow: especially when working with low budgets and tight schedules, it can be challenging to record all the sounds you need yourself. Using sound banks is a good way to begin, starting from good quality sound files and familiarising yourself with the editing process, which is one of the most creative parts of the design.

Be careful though: I strongly suggest never using a sound taken directly from a sound library as-is, but rather transforming it, processing it, and layering it with other sounds in order to create your own assets. The reason is that those sounds are recognisable, and it reflects badly on the quality and originality of the game if the audio content is not unique.

Some of the good and affordable sound libraries out there include: Boom, Blastwave, Soundsnap, and many, many more, which are easy to find with a bit of research.

In terms of Digital Audio Workstation, my favorite is by far Reaper. It is very powerful and the license costs barely anything, as opposed to its competitors. Some would recommend Nuendo, Cubase, ProTools, Logic, etc. These are all professional DAWs and will work nicely for sound design. Which one to opt for is mostly a matter of habits and the type of workflow that suits you best (and the budget you have..).

An audio interface will help your computer deal with DAW sessions heavy in effects and plugins, but you could do without one for a while if you are just starting out and not recording yet. There are some very decently priced entry level audio interfaces from Steinberg (UR22), Focusrite (Scarlett 2i2 or 2i4), and many more. Once you get more serious and do a lot of recording, it might be worth investing in a good audio interface with quality preamps.

If low on finances, you can start recording with a portable recorder instead of getting expensive microphones and an audio interface. I own a Sony PCM-M10 and it is a very reliable and useful piece of equipment. Other equivalents, such as the Zoom recorders, are also worth looking into. You can visit the Gear section of this blog to learn more about the kind of equipment I use.

Game audio designing tricks

  • Variety

In game audio, you always want to avoid repetition, since hearing the same sounds over and over again, regardless of the quality of these sounds, will most certainly result in the player muting the audio. One way to create variety in game music is to compose a series of music segments that will play in sequence, that could also be layered together in a generative way.

For instance, you could have one loop of music serving as a ‘basic layer’, on top of which you could have music stingers or cues (with a few variations of each). The possibilities for music integration are endless. One of the key tricks of game music is to integrate the segments in such a way that the music is generative both horizontally and vertically. What this means is that, for instance, instead of having a single basic music layer which loops, imagine this loop actually being made of a few segments which can succeed each other in any order, or according to set conditions. This is your horizontal generative music. Then, at any moment (or rather, depending on your meter and bars and set conditions), music segments and stingers (of which you would have a few variations) are layered additively on top of the ongoing basic layer. This is your vertical generative music.
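The horizontal/vertical idea can be sketched in a few lines. The segment and stinger names below are hypothetical, and a real implementation would schedule on bars and beats inside the middleware rather than print to a console; this only shows the selection logic:

```python
import random

segments = ["loop_a", "loop_b", "loop_c"]              # horizontal: any order
stingers = ["stinger_01", "stinger_02", "stinger_03"]  # vertical: layered on top

def next_bar(previous=None):
    """Pick the next base segment (never repeating the last one) and,
    30% of the time, layer a random stinger on top of it."""
    segment = random.choice([s for s in segments if s != previous])
    layers = [segment]
    if random.random() < 0.3:
        layers.append(random.choice(stingers))
    return layers

bar = None
for _ in range(8):
    layers = next_bar(bar)
    bar = layers[0]
    print(layers)
```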

In terms of sound effects, the key is to have more than one single sound per game event. For instance, if a weapon is fired, you would have at least 3 (to put a number on it, but ideally 5 or more) variations of this specific weapon, to be triggered randomly every time it is fired. This avoids the player being annoyed by hearing the same sound over and over again. That’s variation in its simplest form, but you could also divide your weapon fire sound into 3 or even 4 parts (trigger, fire layer 1, fire layer 2, shell falling), and integrate these sounds (each of them with variations) in such a way that they combine randomly, resulting in almost never hearing the exact same combination in game. The audio middleware (such as Wwise) will let you do that. It also provides ‘randomisers’ on pitch, volume and other DSP effects so that you can create even more variations out of the sounds you already have.
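Here is a hedged Python sketch of that weapon example: one random variation per layer, plus simple pitch and volume randomisation. This only models the combinatorial idea that middleware like Wwise handles for you; the clip names and ranges are made up.

```python
import random

# Hypothetical weapon event: layered parts, each with a few variations.
PARTS = {
    "trigger": ["trigger_01", "trigger_02", "trigger_03"],
    "fire":    ["fire_01", "fire_02", "fire_03", "fire_04"],
    "shell":   ["shell_01", "shell_02", "shell_03"],
}

def fire_weapon():
    """Pick one random variation per part, and randomise pitch/volume
    per layer -- the 'randomiser' idea described above."""
    return [
        {
            "clip": random.choice(variations),
            "pitch_semitones": random.uniform(-1.0, 1.0),
            "volume_db": random.uniform(-2.0, 0.0),
        }
        for variations in PARTS.values()
    ]

print(fire_weapon())  # 3 x 4 x 3 = 36 clip combinations, before randomisation
```

Even with these modest numbers, the player essentially never hears the exact same shot twice.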

  • Coherence, unity, identity

When you design sounds for a game, you need to consider a certain idea of ‘sonic identity’. I suppose you could say the same for other media, but I find this to be especially relevant in games, since they are made of various sections which the player can visit at any time, from anywhere. Coherence and sonic identity are what will make your audio stand out. This can be achieved through designing, editing, processing and mixing techniques.

A good example of a game featuring an amazing sonic identity is LIMBO. The sound integration is seamless, and the whole atmosphere of the game is glued together by sound that is so coherent with itself and with the environment. A style was decided on and was successfully explored and maintained throughout the game.

How to get better at creating music/soundscapes for games?

Play a lot of games and listen. Try to notice what sort of game parameters affect the music (danger, discoveries, success/failure, etc etc). If you are not currently working on a game, imagine scenarios:

From the start music, you can either go to level 2 or die (your segments and transitions will need to play seamlessly no matter the direction), the music on level 2 will be different, then you can go to level 3 or die, same principle. On top of this you could have music stingers for when the player picks something up, or when an enemy is approaching. You could have a ‘stress’ or ‘combat’ layer that would blend with or replace the original music. There are plenty of possibilities, which can get more and more complex. It is a good exercise to go through the entire process, even with a hypothetical game.

You could also start from an existing game, analyse it, find the patterns and game parameters and re-do some music for it. Test it out in Wwise. Then it’s all about thinking outside the box, being creative and imagining ways to implement audio in a unique and original way.

Essential reads to learn sound design techniques

The Sound Effects Bible – Ric Viers

The Foley Grail – Vanessa Theme Ament

Game Sound: An Introduction to the History, Theory and Practice of Video Game Music and Sound Design – Karen Collins

Getting into the industry and networking

Getting into the industry is the hard part. There are many talented people for very few positions. This means that on top of your own skills, you’ll need to be very proactive in your hunt for projects. Work with film and game students to build a portfolio. Redesign the sound of gameplay videos and cinematics. Look for Kickstarter projects and offer your services.

It involves a lot of hard work at first, but getting a decent portfolio is the first step towards a serious career plan.

Online networking is a good way to stay aware of the latest industry events, which you should attend as much as possible; make yourself known, and make sure you have something to show when asked. An online portfolio is one efficient way to do this.

 

In short, networking, practicing your sound design skills by re-designing sound on existing videos, collaborations with students and Kickstarters, being nice and social, and finally being proactive and organised are some of the helpful actions you can take if you want to be a game audio designer.

I hope this is helpful to some of you. Start by reading a lot about it and watch tutorials. Google is your friend. And play games!

 

Game audio – an introduction

Hi!

This is my very first post on this blog, so I thought I’d give a little intro on what my job is about.

On 8 March 2016, International Women’s Day, I had the amazing opportunity to speak at a conference about diversity in games in Dublin, organised by Coding Grace and sponsored by Digit Games. [https://www.codinggrace.com/events/iwd-diversity-games/63/]

The event was very successful and the speakers were captivating, all women in the games industry, from various departments:

  • Jen Carey (QA, development)
  • Tracey McCabe (development)
  • Jas Panaich (Live Ops)
  • Charlene Putney (Writer)
  • Elaine Reynolds (development, management)
  • and myself – audio

We all had approximately 10 minutes on stage, so it had to be short and sweet. I decided to introduce the audience to game audio and the kind of workflow it involves.

I thought it would be a good idea to give a little summary of my talk here, giving an actual introduction on game audio.

Here it is.

 

[Image: Slide 1]

I started by playing some sound; I suggest you do the same and listen to this short one-minute excerpt before reading further.

What you’ve just heard is a soundscape, an ambience in an imaginary game, that transformed over time according to a game parameter. In this case, this parameter could have been, for instance, proximity to an enemy, or danger, as the player progresses through the game.

In any case, what happened there, was sound providing information to the player about the game and what was happening.

And that’s basically the role of the game sound designer: to find ways to communicate information about the game to the player, in a way that does not only accompany the visuals, but enriches them and adds to the gaming experience itself.

When this is done well, hopefully, this results in the player not wanting to play the game without sound.

There are tons of examples of games with fantastic audio out there.

I’m thinking for instance of

Journey (ThatGameCompany)

Limbo (Playdead)

Sword and Sorcery (Superbrothers)

Transistor (Supergiant Games)

Rayman Legends (Ubisoft)

and the old classics…

The Legend of Zelda: Ocarina of Time

Super Mario World

etc, etc, etc, etc………..

[Image: Slide 2]

All of these games, and many more, I’d personally rather not play at all than play without sound.

I am sure that, if you have ever played Journey, you would agree that playing this game without the soundtrack would not be as good an experience.

My point here is that games with great audio are acclaimed for it, even winning awards! So, to the developers out there who think audio is secondary or less important than graphics (yes, they exist), you’d better reconsider; sound might just be what makes your game stand out! There, I said it.

I’d even go further than that.

To me, playing a game without sound, or with sound so bad that it is ‘better’ muted (thanks to low budgets and bad decisions), is just as disappointing as playing a broken game or a game with missing assets.

I remember playing Fallout 4, and suddenly there was a missing asset: a whole house had disappeared, but only its exterior frame, so you could still see the furniture inside. There were no walls, no structure. I thought ‘this is a pretty accurate analogy of what bad game audio feels like: broken’.

I knew something was meant to be there, it didn’t make sense without it, and yet I could still play the game without it. Same with missing textures, or missing VFX. But no developer would compromise on VFX, yet somehow audio doesn’t get the same treatment.

Working in game audio (especially mobile games), you’ll most certainly find yourself in a position where you’ll have to fight the statistics. Some will remind you that a certain proportion of players play without sound, hence it makes sense to make audio a second priority. To that, I only want to say this: relying on these statistics means perpetuating them.

So, some gamers play without sound? The way I see it, it’s either because their environment doesn’t allow them to play with sound, for instance playing on mobile while commuting, OR the sound is so bad they’d rather mute it.

Buuuuut, what happens when you have something like this? (Paired with good audio, obviously).

[Image: deadspace_ios_headphones.png]

This is from Dead Space, which won BAFTA Games Awards for Use of Audio, Original Score, and Best Sound Design in 2008 and 2009.

My point is, if you aim for quality, you have much better chances of achieving it than if you don’t. That sounds obvious doesn’t it? Not to everyone apparently. Spread the word.

Ok, enough with audio propaganda and let’s move on.

What makes good game audio then?

[Image: Slide 3]

There are some obvious tricks to that, and some less obvious ones.

The obvious ones include a good production chain: recording, editing, processing and mixing. But that’s true for any audiovisual production (films, etc.).

The less obvious ones concern games especially, and that is what my job is all about: audio integration.

Among the key elements of game audio integration, some of the most important ones certainly are Variety and Variability as well as Interactivity and Adaptability.

Variety is about having as little repetition as possible. This means designing multiple variations of one single sound event, or even designing so that the different elements composing one sound event are rendered in layers and combined in game in such a way that you never really hear the same thing twice. And variability is, similarly, about finding ways to avoid sounds of a similar nature being heard identically over and over again, for instance by randomising volume, pitch, or other DSP effects.

Interactivity is responsiveness to the player’s actions, and adaptability is responsiveness to the game world’s developments outside of the player’s control. So either I, as the player, pick something up (for example), which results in a sound cue informing me of the consequence of my action, or the game itself changes, like enemies approaching, or time progressing from daytime to nighttime, which results in a change in ambient sound and/or music.

Very often, the key to accomplish all of this will be to integrate the game audio into different layers.

[Image: Slide 4]

For instance, in the audio example you have listened to earlier, there were many different layers, all rendered separately so that their combination in-game would result in something different every time it is triggered, but also evolve in an unpredictable and generative way over time. This avoids the feeling of repetition you would get with one simple loop playing.

What this means, is that game audio is not only about design, but also, and maybe mostly, about the integration of the sound into the game.

In order to do that well, it takes a good amount of creativity as well as problem solving skills, but also good tools.

[Image: Slide 5]

Audio middleware such as the ones shown here is crucial for game audio integration. It allows the sound designer to tie sound events to game events, and to trigger specific and elaborate audio behaviors according to any game parameter or event. It is also vital for performing in-game mixing and profiling. And this is where it all gets really creative.

And finally, with lots of teamwork (working closely with programmers), and surely lots of fun, this hopefully results in great game audio that will enhance the gaming experience.

There, I hope this was useful to some of you, or simply interesting. This was an introduction to game audio, a very generic overview of what I do, but you will find on this blog more specific tips and tricks, advice, tutorials, equipment chats and plenty more.

Please don’t hesitate to get in touch if you have any questions!

 

email : anne.sophie@digitgaming.com

LinkedIn : https://ie.linkedin.com/in/anne-sophie-mongeau-50483185

Twitter: @annesoaudio