Sound/Social is an interfaceless collaborative music creation environment. Using motion tracking and software that delocalises and redistributes sound within a space, users are encouraged to explore the environment in new ways.
When someone walks into the space they activate a simple sound loop that follows them as they move around. As people move closer together, they begin to hear their sound loops interact and play off one another.
As a user walks through the environment, their own sound loop stays relatively consistent, but what they hear from other people's loops changes, creating a unique musical composition for each user in the space, all from a set of simple repeated sound loops.
For trialling the installation I have chosen the Portland Square Building at Plymouth University. The building has a sophisticated sound system with a network of speakers that I can control from a single source. There are also existing surveillance cameras in the building that I will use to assign and control the sounds within the environment.
Portland Square is one of the most active buildings in the university, with three large lecture halls and multiple floors of workspaces and offices, including the i-DAT office. It is an ideal location as the three atria are very open, have good acoustics and a high flow of foot traffic.
The installation will only be active at quieter times during testing, but if all goes well I hope to be allowed to have it running during busier periods as well.
For the installation I utilise four speakers in Atrium C of Portland Square. These speakers are sent signals by a patch I have created, which calculates the volume each speaker should play based on the location assigned to each individual sound loop.
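The per-speaker volume calculation can be sketched roughly as follows. This is an illustrative Python sketch, not the actual Max MSP patch logic: it assumes an inverse-distance weighting and a hypothetical square speaker layout, both of which are my own assumptions for demonstration.

```python
import math

def speaker_gains(source, speakers, rolloff=1.0):
    """Distance-based gains for each speaker (illustrative sketch only;
    the real installation does this inside a Max MSP patch).
    source: (x, y) position assigned to a sound loop.
    speakers: list of (x, y) speaker positions.
    Returns gains normalised so they sum to 1."""
    # Inverse-distance weighting; a small epsilon avoids division by zero
    # when a loop sits exactly on top of a speaker.
    weights = [1.0 / (math.dist(source, s) ** rolloff + 1e-6)
               for s in speakers]
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical layout: four speakers at the corners of a 10 m x 10 m atrium.
speakers = [(0, 0), (10, 0), (0, 10), (10, 10)]
gains = speaker_gains((2, 2), speakers)
```

Because the speaker list is just a parameter, the same calculation scales to any number of speakers, which reflects how the patch itself was built to grow.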
These sound loops move around in the space based on motion tracking data gathered in Max MSP from cameras in the environment. The motion tracking takes in a medium-resolution camera feed and uses modified cv.jit objects to first detect each person in the space, and then separate the values so they can be interpreted individually by the sound distribution side of the patch.
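Conceptually, the "separate the values" step means keeping each detected person bound to the same sound loop from frame to frame. The sketch below is a hypothetical stand-in for that step: the real patch uses modified cv.jit objects inside Max MSP, whereas this is a minimal nearest-neighbour centroid tracker in Python, with the `max_dist` threshold being my own assumed parameter.

```python
import math

class CentroidTracker:
    """Minimal sketch of per-person ID assignment (hypothetical;
    the installation itself does this with cv.jit objects in Max MSP).
    Each new detection is matched to the nearest previously tracked
    centroid, so a sound loop stays bound to the same person."""

    def __init__(self, max_dist=50.0):
        self.next_id = 0
        self.tracks = {}          # person id -> last known (x, y)
        self.max_dist = max_dist  # beyond this, treat as a new person

    def update(self, detections):
        assigned = {}
        unmatched = dict(self.tracks)
        for point in detections:
            if unmatched:
                # Match to the closest centroid still unclaimed this frame.
                tid, pos = min(unmatched.items(),
                               key=lambda kv: math.dist(kv[1], point))
                if math.dist(pos, point) <= self.max_dist:
                    assigned[tid] = point
                    del unmatched[tid]
                    continue
            # No close match: a new person has entered the space.
            assigned[self.next_id] = point
            self.next_id += 1
        self.tracks = assigned
        return assigned
```

Each ID returned by `update` would then drive one sound loop's position in the sound-distribution side of the patch.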
The laptop I use to run the patch can only comfortably handle four speakers with a maximum of eight sounds moving around the environment. I have, however, built the patch so that both the number of speakers and the number of objects can easily be increased for larger-scale projects with better hardware.
View my full documentation for the project here. This includes my thought process, the Max MSP patch I created and a more in-depth description of the project as a whole. You can also view my Development Blog over on Wordpress.
Sound/Social currently only has one library of musical loops, which I created using Andre Michelle's Flash Tone Matrix. Each individual loop is four, eight or twelve seconds long.
The movement and interaction with others in the space is what turns the simple, repetitive beats into something far more dynamic and pleasing to listen to.
View other videos in my Sound/Social YouTube playlist.
If you would like to come along and try out Sound/Social for yourself, I will be exhibiting the project on the ground floor of the University of Plymouth's Portland Square Building at some point in early June. The installation will span all three atria, and the system utilises the twelve ground-floor speakers.
All you need to do to participate is show up, walk around and meet the other participants. Your loop will be automatically assigned to you and all you're expected to do is have fun performing music simply by interacting with others.
Update: apologies, Sound/Social is not being exhibited anywhere any time soon!
I simply wouldn't have been able to realise my ideas without the help of many other people. I would like to thank Dan Livingstone, David Strang, Chris Saunders, Guido Bugmann, the Plymouth Uni security team and everyone who has participated in the project.