Wednesday 21 November 2012

Innovating audio environments in games

Many professional sound designers feel that the future of game audio lies in the sophistication of environmental audio. Reverb within games has grown steadily more sophisticated since the early days of EAX. 'Environmental Audio Extensions' (EAX) was created to enable a more realistic and accurate audio environment, and was implemented in games such as Doom 3 and Prey. More recently, attempts to produce refined audio systems that replicate environmental physics have taken center stage.

One way that game developers approach reverb within games is by applying a single preset algorithm to a subset of the sound mix. This has been developed further by creating reverb regions that call different reverb presets based on the area the player is currently in, so the reverb changes between predetermined locations, each with predefined reverb settings. What game developers are now trying to achieve is a sound system in the virtual world that replicates the same physics as the real world. (Damian Kastbauer) http://www.gamasutra.com/view/feature/132645/the_next_big_steps_in_game_sound_.php?print=1

How can reverb be improved within games?

One solution for improving reverb in game audio is for the game engine to calculate the reverb of a sound in real time. This could be achieved by analysing the geometry around a sound at the moment it is played, or through the use of convolution reverb.

Convolution reverb is a process for digitally simulating the reverberation of a physical or virtual space. It applies a mathematical convolution operation between the incoming audio signal and a pre-recorded sample of the impulse response (IR) of the space being modeled.
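At its core the operation is simple to sketch. Below is a minimal, illustrative Python example, assuming a mono dry signal and a mono impulse response already loaded as NumPy arrays at the same sample rate; the function and its wet/dry mix are my own, and a game engine would process audio in small streamed blocks rather than whole files.

import numpy as np
from scipy.signal import fftconvolve

def convolution_reverb(dry, ir, wet_mix=0.4):
    """Convolve a dry signal with a recorded IR and blend wet against dry."""
    wet = fftconvolve(dry, ir)           # full length: len(dry) + len(ir) - 1
    wet = wet[:len(dry)]                 # trim the tail back to the dry length
    peak = np.max(np.abs(wet))
    if peak > 0:                         # normalise so the wet path cannot clip
        wet = wet / peak
    return (1.0 - wet_mix) * dry + wet_mix * wet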

Simon Ashby, founder and VP of strategy at AudioKinetic, believes that convolution reverb is certainly the most appropriate way of reproducing realistic environmental acoustics. He explains that one reason developers may avoid this method is "the time and expertise to code such advanced DSP effects." Another reason he gives relates to the technology of our time: "convolution reverbs consume a lot of runtime memory and CPU resources." http://www.develop-online.net/features/1208/Optimising-Convolution-Reverb

One company attempting to make more realistic environmental reverb possible is AudioKinetic, who have created a convolution reverb that adjusts memory and CPU usage based on available resources, while minimising the impact on reverb quality.

Ashby describes two approaches to optimising runtime performance: "time-domain truncation and frequency-domain truncation." http://www.develop-online.net/features/1208/Optimising-Convolution-Reverb

Time-domain truncation is achieved by reducing the length of the IR. Ashby says, "A good approach to shorten the IR length is to determine the noise floor level of the scene where the IR will be used and then reduce the IR end time to the point where the reverb tailgate artifact is inaudible." http://www.develop-online.net/features/1208/Optimising-Convolution-Reverb

Frequency-domain truncation is the removal of low-energy frequency content from the IR, which reduces the amount of work the convolution has to do at runtime.

* An IR is a recorded sample of a room's response to a short impulse sound, which is applied to the incoming audio signal. Typically, rooms with long reverb times generate longer IRs and use more resources at runtime, whereas smaller rooms generate shorter IRs and consume fewer runtime resources. http://www.develop-online.net/features/1208/Optimising-Convolution-Reverb
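To make the time-domain approach concrete, here is a hedged Python sketch of the kind of truncation Ashby describes: scan the IR from the end and cut the tail once its level stays below the scene's noise floor. The -60 dB default and the window size are my own illustrative choices, not values taken from Wwise.

import numpy as np

def truncate_ir(ir, noise_floor_db=-60.0, window=512):
    """Trim a (float) IR tail once its short-term RMS falls below the noise floor."""
    peak = np.max(np.abs(ir))
    threshold = peak * 10.0 ** (noise_floor_db / 20.0)
    # Walk backwards in fixed-size windows; stop at the first audible one.
    for end in range(len(ir), 0, -window):
        block = ir[max(0, end - window):end]
        if np.sqrt(np.mean(block ** 2)) > threshold:
            return ir[:end]              # keep everything up to this window
    return ir[:window]                   # whole IR sits below the noise floor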

One game that led the way in environmental sound innovation was Crackdown, which received a BAFTA for its audio. Raymond Usher described the system as,

"A revolutionary audio shader that uses the surrounding geometry to dynamically color and time delay the direct and indirect audio reflections." http://blindsecondlife.blogspot.co.uk/2007/11/crackdown-audio.html

Crackdown was a huge step forward in creating reverb that considers the surroundings of a virtual world. The video below demonstrates the audio system used in Crackdown, called 'Audio Shader'. At 4:15, a section on reverb and reflections gives an insight into the system behind the audio: a variety of lines, shapes and words visualise the analysis of the surrounding geometry.

The innovation in Crackdown's sound did not end with its complex reverb system, as Raymond Usher explains,

"We also hired an explosives expert to do controlled detonations for us. If you've seen the explosions in Crackdown, they're pretty massive. We took that as a challenge to make the biggest sounding explosions ever in a video game. By using a unique layering system, our recordings from the explosive session, coupled with the audio shader system... we definitely encourage you to turn it up." http://interviews.teamxbox.com/xbox/1885/The-Audio-of-Crackdown/p2/

Future Innovation

One innovative idea being discussed by both game developers and audio engineers is the ability to simulate the voice as it travels through the body, out of the mouth and into the air. Using highly complex analysis of the human body, as well as the study of sound through virtual air, the outcome would be an intensely realistic sound affected by the character's body. It is strongly believed that sound should originate from distinct positions in 3D space, just as in reality. Realistically wave-tracing audio should require much less computation than realistically ray-tracing graphics. If wave-tracing were adopted as the means of implementing realistic audio in a game, it would produce the effects of perceived volume and position, as well as frequency attenuation, reverberation and even the Doppler shift effect. http://idcmp.linuxstuff.org/2008/10/wave-tracing-ray-tracing-for-sound.html

What is Ray-Tracing?

Ray-tracing is a technique used on the computer graphics side of video games. It generates an image by tracing the path of light through pixels in an image plane and simulating the effects of its encounters with virtual objects.

What is Wave-Tracing?

Wave-tracing audio is a theory that mimics the idea of ray-tracing in graphics. As explained in the Regular Expressions blog, in regular ray-tracing, rays of light are traced backward from a pixel of the camera, to an object, and eventually to a light source. If that can be done with light, why can't it be done with sound?

This is an interesting theory, and one that might be achieved by tracing vibrations instead of rays. Instead of light sources within the game, the focus would be on the air and the air's friction.
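Even without a full wave-tracing solver, two of the effects such a system would produce, distance attenuation and Doppler shift, fall out of simple geometry. Below is a small Python sketch under that assumption, treating the source as a point in open air; the names are mine, and real wave-tracing would also account for occlusion, reflections and air absorption.

import numpy as np

SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 degrees C

def attenuation(source_pos, listener_pos, ref_distance=1.0):
    """Inverse-distance gain: amplitude halves with each doubling of distance."""
    d = np.linalg.norm(np.asarray(source_pos) - np.asarray(listener_pos))
    return ref_distance / max(d, ref_distance)

def doppler_ratio(source_pos, source_vel, listener_pos, listener_vel):
    """Pitch ratio heard by the listener for a moving source/listener pair."""
    direction = np.asarray(listener_pos) - np.asarray(source_pos)
    direction = direction / np.linalg.norm(direction)
    v_source = np.dot(np.asarray(source_vel), direction)       # toward listener
    v_listener = np.dot(np.asarray(listener_vel), -direction)  # toward source
    return (SPEED_OF_SOUND + v_listener) / (SPEED_OF_SOUND - v_source)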

Prototype

Another, more recent game that adopted the initiative of developing a reverb system replicating real-life surroundings is Prototype. In Prototype, all the ambience tracks were sent through a procedural reverb system. Scott Morgan explains,

"Through a system of ray casting, the physical space of the listener was analyzed in real time, and the reverb parameters set to align with the size of the space that the listener was in."http://www.gamasutra.com/view/feature/132645/the_next_big_steps_in_game_sound_.php?print=1

An example of this real-time analysis within Prototype is a scenario in the game where you enter a tunnel in Central Park. The system detects an enclosed space of a certain size and dynamically sets the reverb parameters. In real time, the park's birds and other ambient sounds are passed through the bigger reverb to give the illusion that the sounds are no longer arriving directly at the listener, but are reflected first, in an attempt to replicate the real world. (Scott Morgan)
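Morgan's description suggests a fairly direct implementation: cast rays outward from the listener, take the average hit distance as an estimate of the enclosing space, and map that onto the reverb parameters. The Python sketch below is speculative, since Prototype's actual code is not public; 'cast_ray' stands in for a hypothetical engine query returning the distance to the nearest geometry (or None for no hit), and the mapping constants are invented.

import math

def estimate_reverb_decay(listener_pos, cast_ray, num_rays=32, max_distance=100.0):
    """Return a reverb decay time (seconds) scaled to the surrounding space."""
    hits = []
    for i in range(num_rays):
        # Evenly spaced directions around the horizontal plane, for simplicity.
        angle = 2.0 * math.pi * i / num_rays
        direction = (math.cos(angle), 0.0, math.sin(angle))
        distance = cast_ray(listener_pos, direction, max_distance)
        hits.append(distance if distance is not None else max_distance)
    mean_distance = sum(hits) / len(hits)
    # Larger surrounding geometry -> longer decay, clamped to a sane range.
    return min(max(0.3, mean_distance * 0.08), 6.0)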


References and relevant links

http://designingsound.org/2010/02/charles-deenen-special-the-future-of-sound-design-in-video-games-part-1/ (2010; not especially recent)

http://www.prosoundeffects.com/blog/2012/06/gaming-sound-effects-generative-audio (2012)

http://www.develop-online.net/features/1653/AUDIO-SPECIAL-The-generation-game

http://www.gamasutra.com/view/feature/130733/designing_a_nextgen_game_for_sound.php?print=1

http://www.develop-online.net/features/1685/In-Depth-Square-Enixs-Luminous-Studio-engine

http://www.gamasutra.com/view/feature/4257/the_next_big_steps_in_game_sound_.php

http://designingsound.org/2010/02/the-next-big-steps-in-game-sound-design/

http://en.wikipedia.org/wiki/Environmental_Audio_Extensions

http://interviews.teamxbox.com/xbox/1885/The-Audio-of-Crackdown/p2/

http://idcmp.linuxstuff.org/2008/10/wave-tracing-ray-tracing-for-sound.html

http://www.cs.princeton.edu/~funk/sig98.pdf

http://bjaberle.com/2011/02/the-future-of-game-audio/

http://www.gamasutra.com/view/feature/132645/the_next_big_steps_in_game_sound_.php?print=1

Wednesday 7 November 2012

Inspiring

An interesting film about the production of three indie games: Super Meat Boy, Braid and Fez.

YouTube Trailer


Indie Games


Interactive Music Systems in Games



Computer games rely on the player making decisions within the game, so music needs to react to those decisions in real time. As technology has advanced, new approaches to interactive music systems have been created, and they have become just as important as graphics and gameplay. West B. Latta explains,


"Games are an interactive medium, and as such, the presentation of musical soundtracks must also be able to adapt to changing gameplay. To get a truly immersive experience, the music in games must change on-the-fly according to what is happening in the game, while still retaining a cinematic quality." http://www.shockwave-sound.com/Articles/C01_Interactive_Music_in_Games.html

The earliest games were limited to simple 8-bit sounds that played when triggered, or a short musical loop in the background. Classic games such as Zelda and Super Mario started to show early signs of interactive music systems, but were still bound by the technology of their time: in Super Mario Bros., the music's tempo increased as the player's time ran out.

An interactive music system reacts to the player's choices within the game, such as finding hidden objects or entering certain areas. Project Bar-B-Q 2003 discussed interactive audio systems, asking both what an interactive audio system is and what it should be:

"An audio system that is designed to have its pre-determined sonic behavior influenced in response to real-time events, and is comprised of an Interactive Audio Engine and Interactive Audio Data". http://www.projectbarbq.com/bbq03/bbq03r5.htm

Below is a flow diagram of an interactive audio system in games:


http://www.projectbarbq.com/bbq03/bbq03r5.htm

Some argue that music in video games does not require an interactive element and that a musical score will suffice. However, an interactive music system offers benefits that a simple cinematic score cannot. Here are some of Project Bar-B-Q's conclusions:


Why Interactive Audio?
  • It enhances the user experience.
  • It empowers the user via participation and choice, facilitating the market trend toward active consumption.
  • It provokes and inspires user involvement.
  • It creates a unique personality for products.
  • It enables users to perform new types of activities.
  • It creates a participatory education experience.
  • It's potentially cheaper to implement.
  • It allows simplification of the system and cost reduction.
  • It allows audiences to experience interactive audio outside of its original context.
http://www.projectbarbq.com/bbq03/bbq03r5.htm

Andrew Clark said,

"It would be really cool if game music could complement onscreen action with the same kind of subtlety, depth, and expression. The complication is that, in games, the timing, pacing, contexts, and outcomes of the onscreen action are constantly in flux, depending on the actions of the player." http://www.gamasutra.com/view/feature/129990/defining_adaptive_music.php?print=1

One of the first interactive music systems used in games was 'Direct Music Producer', a component of Microsoft DirectX. It allowed composers to create music and sound effects that would be selected by the user's choices in the game, letting the gamer experience variation in the music and sounds. 'Stainless Steel Studios' was a games company that adopted 'Direct Music Producer' in games such as 'Empires: Dawn of the Modern World'. Sound designer Scott Morgan was drafted in to create the interactive music system for 'Empires: Dawn of the Modern World' due to his experience with Microsoft and 'Direct Music Producer'. Here is Morgan's demonstration of 'Direct Music Producer':

Middleware

In modern video game production, there are two leading pieces of audio middleware integrated into games: FMOD and Wwise.

FMOD

Through its event system, the composer can utilise multichannel audio files, or 'stems'. This allows individual instruments or sections to be added or subtracted based on game states, or on any other dynamic information fed into FMOD, such as health, location, or proximity to certain objects or enemies. FMOD takes a more 'logic-based' approach, allowing the designer to define various cues, segments and themes that transition to other cues, segments or themes based on any user-defined set of parameters. FMOD also allows for beat-matched transitions and time-synchronised 'flourish' segments.
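As a rough illustration of this stem-based approach, outside of FMOD itself and with invented stem names and ranges, a single 0-to-1 intensity parameter can drive the volume of each stem through its own crossfade window:

# Each stem fades in linearly across its own slice of the intensity range.
STEM_RANGES = {
    "ambient_pad":  (0.0, 0.1),   # reaches full volume almost immediately
    "percussion":   (0.3, 0.6),   # fades in as tension rises
    "full_strings": (0.6, 0.9),   # reserved for high-intensity moments
}

def stem_volumes(intensity):
    """Map a 0..1 game parameter to a volume per stem."""
    volumes = {}
    for stem, (fade_start, fade_end) in STEM_RANGES.items():
        t = (intensity - fade_start) / (fade_end - fade_start)
        volumes[stem] = min(max(t, 0.0), 1.0)
    return volumes

# e.g. stem_volumes(0.45) -> ambient_pad 1.0, percussion 0.5, full_strings 0.0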

FMOD has been used in games such as:

- Splinter Cell: Chaos Theory
Depending on the player's level of 'stealth and stress', different intensities of music are brought in. This is called a 'vertical' approach to music system design.

- Tomb Raider: Legend
Troels Folmann used a system he devised called 'micro-scoring': crafting a vast number of small musical phrases and themes that are strung together in a logical way based on the player's actions throughout the course of the game. For example, the player may explore a jungle area with an ambient soundtrack playing; as they interact with an artifact or puzzle, a seamless transition is made to a micro-score specific to that game event.
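Here is a hedged Python sketch of how such micro-scoring might be wired up, with invented state and phrase names. Game events queue a state change, but the switch only happens at the next phrase boundary so the join stays musical.

import random

# Pools of short phrases per game state (all names are illustrative).
PHRASES = {
    "explore": ["jungle_amb_a", "jungle_amb_b", "jungle_amb_c"],
    "puzzle":  ["puzzle_sting", "puzzle_loop_a"],
    "combat":  ["combat_hit", "combat_loop_a", "combat_loop_b"],
}

class MicroScore:
    def __init__(self):
        self.state = "explore"
        self.pending = None               # state change awaiting a boundary

    def on_game_event(self, new_state):
        self.pending = new_state          # never switch mid-phrase

    def next_phrase(self):
        """Called by the audio engine at each phrase/bar boundary."""
        if self.pending is not None:
            self.state, self.pending = self.pending, None
        return random.choice(PHRASES[self.state])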

Wwise

Wwise is the product of AudioKinetic, and allows multichannel stems to be integrated into a "logic-based" approach to music design. With these features, composers can create a series of themes with time-synchronised transitions triggered by game events or states. It also allows other parameters to fade various musical stems in and out of the mix, creating smooth transitions and a more professional finish. Wwise incorporates both the horizontal and the vertical approach to music design. (West B. Latta)

Wwise has been used in a huge number of games, such as:
- Batman: Arkham City
- Halo Wars and Halo 4
- Mass Effect 2 & 3
- Star Wars: The Force Unleashed 2
- Assassin's Creed 2, 3 & Brotherhood

Wwise also includes features such as a cross-platform sound engine, a real-time game simulator, a plug-in architecture and the SoundFrame API.


Video Example of Interactive Music System


Totally Games / LucasArts’ “X-Wing” series

"The “X-Wing” (PC DOS) series, which debuted in 1993, featured MIDI versions of John Williams and John-Williams-esque orchestral music. Lucas Arts’ patented iMUSE music engine handled sophisticated run-time interactions between dramatic onscreen action and a database of music loops, cues, and transitions. (Evolving versions of iMUSE were also used on a number of later Lucas Arts projects.)" http://www.gamasutra.com/view/feature/129990/defining_adaptive_music.php?print=1


Here is an example of an interactive music system being used in conjunction with the timing of an online game of Battlefield 3. At first you hear only sounds from within the game, such as gunfire and explosions. However, at 0:48 in the video the enemy team has only five remaining lives and is about to be defeated. The moment the number reaches five, the music starts, building in texture until the game concludes. The music then continues in full orchestration during the score screen.


Interactive music features in every Battlefield 3 game type, and works in a similar manner: as a team approaches the end of a game, the music starts. Each team hears different music, either victory music or defeat music. The increase in musical texture gives players a sense of urgency, and the added adrenaline can often push them further in the game.

Another example of Battlefield 3's interactive audio implementation is the dialogue in online games. Taking the 'conquest' game type as an example, dialogue is heard when a flag has been captured, but its intensity depends on how many flags have been lost in total. If your team holds three flags and the fourth has just been lost, the event is announced with little expression. However, if you had just lost all of the flags, the dialogue would be screamed out, suggesting that you need to work harder.
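A speculative Python sketch of that behaviour, with invented tiers and line names (DICE's actual implementation is not public): the total number of flags lost simply selects how intense a line variant is played.

# Thresholds on total flags lost -> increasingly urgent line variants.
VO_TIERS = [
    (1, "flag_lost_calm"),        # one flag down: matter-of-fact delivery
    (3, "flag_lost_urgent"),      # several down: raised, urgent delivery
    (4, "flag_lost_desperate"),   # all flags lost: shouted warning
]

def pick_flag_lost_line(flags_lost):
    """Choose the most intense line whose threshold has been reached."""
    chosen = VO_TIERS[0][1]
    for threshold, name in VO_TIERS:
        if flags_lost >= threshold:
            chosen = name
    return chosen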

Tuesday 6 November 2012

The Sounds of Batman: Arkham City


Direct Music Producer

'Direct Music Producer' is a piece of software that allows the user to create interactive music for games. It allows for huge variation in the sounds and music played in a game, which can be designed to be different every time. Here is a video created by Scott Morgan explaining his use of 'Direct Music Producer'.