Monday, 15 July 2013

MFP UPDATE - Gesture and Interactive Soundscape

As I am investigating the most effective relationship between gesture and audio, and their combined ability to create an immersive presence within audio-only games, I am building different types of systems in Max/MSP to test these possibilities. This post explains the most recent patch in development: a gesture-audio system that allows the player to use gestures to interact with sounds in the virtual world. It is still in its early stages of development, but below are some screenshots of the system and the mechanics being implemented to make it work.

Firstly, it is important to note that the gesture capture method is the same in each patch, to avoid biasing the results. I am utilising a common webcam and colour tracking.
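
For anyone curious about the logic outside of Max/MSP, here is a rough sketch in Python/OpenCV of the kind of webcam colour tracking I am describing. To be clear, the actual patch does this with Max's own video objects; the HSV range below is a placeholder for a blue marker, not a value taken from the patch:

import cv2

# Assumed HSV range for a blue marker -- tune to whatever colour is tracked.
LOWER = (100, 150, 50)
UPPER = (130, 255, 255)

cap = cv2.VideoCapture(0)                      # the common webcam mentioned above
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)      # isolate the tracked colour
    m = cv2.moments(mask)
    if m["m00"] > 0:                           # marker found: compute its centroid
        x = int(m["m10"] / m["m00"])
        print("tracked x position:", x)        # this x drives the boundary logic below
    cv2.imshow("mask", mask)
    if cv2.waitKey(1) == 27:                   # press Esc to quit
        break
cap.release()
cv2.destroyAllWindows()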

As the game requires the player to return a 'pitched'* ball as it approaches their static body position, I felt it would be best to start by creating impact sounds for their surroundings, which they could reach out towards and interact with (sonically) using their hands/bat. With the player hitting a 'pitched' ball back and forth with a virtual bat, I started to think of the surroundings of a baseball ground, batting cage, cricket ground etc. With the user currently confined to a static body position, this reminded me of batting cages, where chain-link fences usually surround the player to prevent the ball from flying off into the distance. So, with this in mind, I started to create a sonic boundary around the player, which can be interacted with when the player gestures towards the edges of the webcam image (a tracked x-value of less than 20, to be precise). Currently only one side of the x-axis is implemented, but the idea works just fine, so further development should establish a full sonic boundary around the player.

Below is an image of how the interactive sonic boundary is being implemented in Max/MSP. I am making use of a non-repetitive sound design technique, which can be seen in the URM system on the bottom right of the image. This generates a non-repeating random number that sets the playback speed of the sfplay~ object (the sound file). The chosen sound file is a bat hitting a chain-link fence, so each trigger plays a differently pitched version of the sound. More audio files will be added to improve the variety of the audio selection as development continues.
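
As a rough outside-of-Max illustration of the non-repetitive technique, here is a small Python sketch that behaves like the URM: it deals out playback speeds in random order, with no repeats until the pool is exhausted. The speed values themselves are placeholders, not the ones used in the patch:

import random

class Urn:
    """Deals values in random order with no repeats until all have been used,
    similar in spirit to the URM in the patch."""
    def __init__(self, values):
        self.values = list(values)
        self.pool = []

    def next(self):
        if not self.pool:                      # refill once every value is spent
            self.pool = random.sample(self.values, len(self.values))
        return self.pool.pop()

# Placeholder playback speeds: every trigger plays the chain-link sample
# at a fresh speed, so it never sounds identical twice in a row.
speeds = Urn([0.85, 0.9, 0.95, 1.0, 1.05, 1.1, 1.15])
for hit in range(5):
    print("play fence sample at speed", speeds.next())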


The sfplay~ (audio file) is triggered when the tracked gesture moves to less than 20 on the x-axis, towards the left. However, in the first version of this system, if the player went too far left and left the screen (so 0), then came back onto the screen (rising from 0 but still less than 20), the sound file was triggered again. This broke the illusion of a virtual sonic environment, as their hand was triggering the sound on the return. So, in an attempt to solve this issue, I implemented a system that acknowledges the direction of the changing numbers (left & right), which can be seen in the middle of the image (the box that says 'left'). When the direction changes to left, it produces a 0, which in turn produces a bang. If this 0 is matched by a gesture-tracked number of less than 20, it triggers the sound; but it won't play again until the data has completely changed, i.e. the hand has gone back into the virtual space.
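
In pseudocode terms, the fix behaves roughly like the Python sketch below, using the same numbers as the patch (a boundary zone below 20, and 0 meaning the hand has left the frame). The function name and test values are my own, for illustration only:

THRESHOLD = 20        # boundary zone at the left edge of the webcam image
armed = True          # becomes True again once the hand is back in the space
prev_x = None

def on_tracked_x(x):
    """Receives each new tracked x value from the colour tracker."""
    global armed, prev_x
    moving_left = prev_x is not None and x < prev_x    # direction detection
    if armed and moving_left and x < THRESHOLD:
        print("bang: play chain-link fence sample")    # trigger the sfplay~
        armed = False                                  # disarm until the data...
    elif x >= THRESHOLD:
        armed = True   # ...has completely changed, i.e. back in the virtual space
    prev_x = x

# Hand drifts left off-screen (down to 0), then returns: only one trigger fires.
for x in [120, 60, 19, 5, 0, 4, 18, 40, 120]:
    on_tracked_x(x)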


* The pitched ball refers to the current method of using sound to convey a ball travelling closer to and further away from the player. It uses a C-major scale, ascending and descending, to reflect the ball's distance from the player. It works, but attempts to find an alternative are still underway.
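
As a rough sketch of how such a distance-to-pitch mapping can work, here is a small Python example that steps through a C-major scale (as MIDI note numbers) according to the ball's distance. The eight-step range and note values are illustrative assumptions, not the patch's actual settings:

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]    # C4 to C5 as MIDI note numbers

def ball_note(distance, max_distance=8.0):
    """Map the ball's distance to a scale step: low notes when the ball is
    close, high notes when it is far (invert to taste)."""
    step = int(round((distance / max_distance) * (len(C_MAJOR) - 1)))
    step = max(0, min(step, len(C_MAJOR) - 1))
    return C_MAJOR[step]

# The ball approaching the player: the pitch descends through the scale.
for d in [8, 7, 6, 5, 4, 3, 2, 1, 0]:
    print("distance", d, "-> MIDI note", ball_note(d))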
