%!TEX root = ./HDG_report.tex
\section{Background}\label{background}

In academia and the musical arts, methods of creating procedural audio have been the subject of research for decades. In the games industry, use of such techniques has been more limited. In the following sections we give a brief outline of related work within music and sound effects, as well as a closer look at the use of granular synthesis.

\subsection{Music}

Our approach is not directly related to procedural music techniques, but the intent of increasing adaptivity and variation is shared.

The classic paradigm is algorithmically composed music, i.e.\ music that is controlled by the computer at a highly detailed level. In games, given their interactive nature, an algorithmic composition is never completely detached from game events: there is usually a loose coupling, making the player indirectly involved in the composition. Examples include the ``riffology'' technique employed by Peter Langston in \emph{Ballblazer} (LucasArts 1984) \cite{langston1989eedie} and, more recently, Brian Eno's use of cellular automata for the music heard in \emph{Spore} (Maxis 2008).

A number of games have attempted to make music an integral part of the gameplay, giving the user a higher level of control. The music is created in direct correspondence to the user's actions, effectively turning the game into a kind of instrument.

Some examples of this are:

\begin{itemize}
\item \emph{Otocky} (Sedic 1987)
\item \emph{Rez} (Sega 2001)
\item \emph{Electroplankton} (Nintendo 2005)
\end{itemize}

Yet another approach is to stream macro-level music components according to game events using some form of centralized control entity. The first example of this is LucasArts' interactive music system, iMUSE \cite{imuse1994patent}, which was used to great effect in their adventure games from the nineties. In the first-person shooter \emph{Left 4 Dead} (Valve 2008), the AI director monitors and alters gameplay, including the mix and arrangement of the music. The upcoming fourth installment of the stealth game \emph{Thief} (Eidos Montreal) incorporates a randomised file player with basic logic, called \emph{GRAMPS}, to create generative music \cite{weir2011gdc}. A survey of procedural game audio, with an emphasis on music, can be found in \cite{collins2009procedural}.
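To make the idea of a centralized control entity concrete, the following is a minimal sketch (not based on any shipped system; all names, segment labels and intensity thresholds are our own illustrative assumptions) of macro-level music streaming: game events update an intensity value, and a director swaps music segments only at segment boundaries, as in horizontal re-sequencing systems of the iMUSE family.

```python
# Hypothetical sketch of macro-level adaptive music control.
# A central "director" maps game intensity to a music segment and
# defers the actual switch to the next segment boundary.

SEGMENTS = {            # segment name -> intensity range it covers
    "calm":    (0.0, 0.3),
    "tension": (0.3, 0.7),
    "combat":  (0.7, 1.0),
}

def pick_segment(intensity):
    """Return the segment whose intensity range contains the value."""
    for name, (lo, hi) in SEGMENTS.items():
        if lo <= intensity <= hi:
            return name
    return "calm"

class MusicDirector:
    """Queues the target segment; switches only at segment boundaries,
    so transitions stay musically aligned."""
    def __init__(self):
        self.current = "calm"
        self.queued = None

    def on_game_event(self, intensity):
        target = pick_segment(intensity)
        if target != self.current:
            self.queued = target      # defer switch to next boundary

    def on_segment_boundary(self):
        if self.queued is not None:
            self.current, self.queued = self.queued, None
        return self.current
```

In use, a combat event (`on_game_event(0.9)`) queues the \texttt{combat} segment, but the music only changes when the currently playing segment ends.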

\subsection{Sound effects}
This is the domain in which our approach belongs. Farnell \cite{farnell2009designing} describes theory and practice for synthesising commonly used game sound effects such as explosions, gunshots, footsteps, fire, and running water. A number of the practical examples given there are used by Knight in \cite{knight2011dissertation}, utilizing the Boot Camp demo in the 3D game engine Unity \cite{knight2011video}. A recent example of a commercial application of sound effect synthesis is the third-person shooter \emph{Crackdown 2} (Ruffian 2010), where modal synthesis is used to create dynamic impact sounds \cite{lloyd2011synthesis}.
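The core idea of modal synthesis, as used for impact sounds, can be sketched in a few lines: an impact is modelled as a sum of exponentially decaying sine partials, one per resonant mode. The mode frequencies, amplitudes and decay rates below are illustrative values of our own, not taken from any shipped game.

```python
import math

def modal_impact(modes, duration=0.5, sr=44100):
    """Synthesise an impact sound as a sum of exponentially decaying
    sinusoids (basic modal synthesis). Each mode is a
    (frequency_hz, amplitude, decay_rate) triple."""
    n = int(duration * sr)
    out = [0.0] * n
    for freq, amp, decay in modes:
        for i in range(n):
            t = i / sr
            out[i] += amp * math.exp(-decay * t) * math.sin(2 * math.pi * freq * t)
    return out

# A small metallic "clink": a few inharmonic partials, fast decay.
samples = modal_impact([(523.0, 0.6, 18.0),
                        (1307.0, 0.3, 25.0),
                        (2861.0, 0.1, 40.0)])
```

Varying the mode parameters per collision (e.g.\ scaling amplitudes by impact velocity) is what makes the resulting sounds dynamic rather than repeated samples.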

\subsection{Granular synthesis in games}

To our knowledge, not much research has been done on granular synthesis in games. Reflections on the possibilities can be found in \cite{paul2011granulation} and \cite{paul2007gdc}.

Among the possible usages considered are:

\begin{itemize}
	\item Time warping speech for variation.
	\item Changing the formants of voices.
	\item Crossfade morphing.
	\item Spatialization in 5.1/7.1 surround.
	\item Sliding/scraping effects for contact physics.
	\item Adding variation to existing sounds.
	\item Stretching a sound for a granular drone.
	\item Tempo-sync of grains for gating effects.
	\item Merging ambiences by selecting grains from multiple files.
	\item Making an ambience sound more musical and abstract.
\end{itemize}
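Several of the uses listed above (adding variation, granular drones, merging ambiences) rest on the same basic operation: overlap-adding short windowed grains drawn from one or more source buffers. The following is a minimal sketch of that operation; the function name and parameter values are our own illustrative choices.

```python
import math
import random

def granulate(source, out_len, grain_len=1024, hop=512, seed=0):
    """Build an output buffer by overlap-adding Hann-windowed grains
    taken from random positions in the source buffer -- the basic
    granular synthesis operation."""
    rng = random.Random(seed)
    window = [0.5 - 0.5 * math.cos(2 * math.pi * i / (grain_len - 1))
              for i in range(grain_len)]
    out = [0.0] * out_len
    pos = 0
    while pos + grain_len <= out_len:
        start = rng.randrange(0, len(source) - grain_len)
        for i in range(grain_len):
            out[pos + i] += source[start + i] * window[i]
        pos += hop  # 50% overlap: Hann windows sum to unity
    return out

# Stretch a 0.1 s source into a 1 s "granular drone".
source = [math.sin(2 * math.pi * 440 * i / 44100) for i in range(4410)]
drone = granulate(source, out_len=44100)
```

Randomising the grain start positions decorrelates successive grains, which is what yields variation from a single asset; drawing grains from several source files instead would give the ambience-merging behaviour mentioned above.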

Many of the above ideas are concerned with processing a game's final assets to create variations and reduce the memory footprint. Few of them seem to have been implemented in a commercial game yet. One concrete attempt at using granular synthesis in games is retargetting contact sounds in physics-driven animations \cite{picard2009retargetting}; some of the steps taken are similar to those in our method, but again, theirs is a top-down approach concerned with creating variations of finalized sounds.