By Yiming Li
The inspiration for the project came from works I had seen previously and from some of my personal experiences. At its core, it is a project about sound with an element of playability. I aimed for an immersive experience along the way, while utilising as many previously learnt methods as possible.
Regarding sound installations, I have seen many works, such as Brendan O’Connor’s ‘Sound Balloon’. To a certain extent, this provided inspiration for the overall setup of the project. He edited the sound logic for each balloon, which to me felt more like a ‘sequencing’ process, although he relied more on Arduino boards and Piezo discs.
His project, however, is more of a presentation that combines multiple channels of audio into one. I hoped instead to explore an ideal 5.1-surround or stereo situation.


Having tried some simple arranging projects myself, I realised that what a lot of people have been doing in recent years is ‘arranging visualisation’ – making the otherwise tedious act of arranging more engaging. I wanted to bring this idea into my own project.
As for the specifics of the production, I used a number of methods; for me personally it was more like an experiment in linking multiple pieces of software together.

I drew up some plans before starting. I wanted the piece to be a ‘sound space’ where the audience could have fun playing with sound: little balls that each make a sound, laid out like a musical arrangement, which I could manipulate and apply different effects to. Some visually and aurally interesting presentation was also used to better fit the theme of the project.



I started by recording different clips in GarageBand with a MIDI keyboard, covering a variety of instruments, playing styles and tones. They were all set to the same tempo and pitch. I also did some preliminary trimming and editing of the audio in AU; in fact, AU saw a lot of use throughout the production of the project.



I initially built the sound space in Unity. I set up the whole space in black to look like a rehearsal room, with two spotlights pointing at each end – one side a ‘storage area’ for the balls, the other a ‘play area’ where the sounds actually play. The audience can interactively move the spheres from one side to the other, selecting different kinds of instruments, tones and styles.


Adjusting the appropriate 3D audio curve for each blob – that is, its audible range – is more like mixing, and I kept listening to the effect of each value on each blob. Unity comes with a sound plugin that allows a more logical volume-decay mechanism across the blobs, so that as you move away from a blob you clearly hear the tone of the sound change rather than a simple drop in volume.
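As a rough illustration of the kind of per-blob setup this involved, here is a minimal C# sketch of a custom 3D rolloff curve on a Unity AudioSource; the component name and curve values are my own assumptions, not the project’s actual script.

```csharp
using UnityEngine;

// Minimal sketch: configure a blob's AudioSource as fully 3D with a
// hand-tuned custom decay curve instead of Unity's default rolloff.
[RequireComponent(typeof(AudioSource))]
public class BlobAudioSetup : MonoBehaviour
{
    [SerializeField] float minDistance = 1f;   // full volume inside this radius
    [SerializeField] float maxDistance = 12f;  // inaudible beyond this radius

    void Awake()
    {
        var source = GetComponent<AudioSource>();
        source.spatialBlend = 1f;                     // fully 3D positioned sound
        source.minDistance = minDistance;
        source.maxDistance = maxDistance;
        source.rolloffMode = AudioRolloffMode.Custom; // use our own decay curve

        // Volume falls off quickly at first, then tails away gently.
        var curve = new AnimationCurve(
            new Keyframe(0f, 1f),
            new Keyframe(0.3f, 0.4f),
            new Keyframe(1f, 0f));
        source.SetCustomCurve(AudioSourceCurveType.CustomRolloff, curve);
    }
}
```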


I also wrote some scripts to help the whole interaction happen more smoothly: making each blob play its audio in time, muting it while it sits in the ‘storage area’ and letting it sound in the ‘play area’. I did a lot of debugging to keep the lengths of the audio clips consistent to within milliseconds, and to work around a Unity looping quirk that can cause clips to drift out of sync after playing many times. There were also basic interactions such as grabbing balls and placing them in different spots, and an aiming detector that displays a UI with a ball’s name, which I won’t go into here.
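One common way to get this kind of sample-accurate looping in Unity is to schedule every pass against the audio DSP clock rather than relying on the built-in loop flag; the sketch below shows that approach, with two alternating AudioSources so the loop seam stays gapless. All names and values are illustrative assumptions, not the project’s actual code.

```csharp
using UnityEngine;

// Sketch of a loop-sync workaround: every blob (re)schedules its clip
// against AudioSettings.dspTime, so all loops share one beat grid.
[RequireComponent(typeof(AudioSource))]
public class BlobLooper : MonoBehaviour
{
    public static double gridStart;   // shared DSP-clock origin, set once at scene start
    public double loopLength = 4.0;   // seconds; identical for every clip
    public bool inPlayArea;           // toggled by the storage/play area triggers

    AudioSource[] sources;            // two sources alternated so loops are gapless
    double nextLoopTime;
    int next;

    void Start()
    {
        // Clone the configured AudioSource so two passes can overlap at the seam.
        var original = GetComponent<AudioSource>();
        var clone = gameObject.AddComponent<AudioSource>();
        clone.clip = original.clip;
        clone.spatialBlend = original.spatialBlend;
        sources = new[] { original, clone };

        // Fall back to "half a second from now" if no shared grid was set.
        nextLoopTime = gridStart > 0 ? gridStart : AudioSettings.dspTime + 0.5;
    }

    void Update()
    {
        // Mute while stored, audible in the play area.
        foreach (var s in sources) s.mute = !inPlayArea;

        // Schedule the next pass ~100 ms ahead so there is never an audible gap.
        if (AudioSettings.dspTime + 0.1 > nextLoopTime)
        {
            sources[next].PlayScheduled(nextLoopTime); // sample-accurate start
            nextLoopTime += loopLength;
            next = 1 - next;
        }
    }
}
```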
A small aside about designing the blobs themselves. I wanted to attach some interesting textures to them to make the scene less monotonous. I ran a few experiments, such as using the material-override option on the Video Player to give the spheres dynamic materials that move with the music, and making several styles of FX video in AE for this purpose and attaching them to the spheres. In the end, I chose to hand-make some self-luminous, emissive materials, and the result is not bad.
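For the ‘moves with the music’ idea, one simple audio-reactive alternative can be sketched by driving a Standard-shader emission colour from the blob’s own spectrum data. This is my own illustration of that direction, not what the project shipped; the property names and scaling factors are assumptions.

```csharp
using UnityEngine;

// Sketch: sample the blob's audio spectrum each frame and use a rough
// low-frequency loudness estimate to drive the material's emission glow.
[RequireComponent(typeof(AudioSource), typeof(Renderer))]
public class PulsingEmission : MonoBehaviour
{
    public Color baseColor = Color.cyan;
    public float intensity = 8f;      // how strongly the music drives the glow

    AudioSource source;
    Material material;
    readonly float[] spectrum = new float[256];

    void Start()
    {
        source = GetComponent<AudioSource>();
        material = GetComponent<Renderer>().material; // instanced copy per blob
        material.EnableKeyword("_EMISSION");          // enable Standard-shader emission
    }

    void Update()
    {
        // Sum the low bins of the spectrum as a crude loudness measure.
        source.GetSpectrumData(spectrum, 0, FFTWindow.Hamming);
        float level = 0f;
        for (int i = 0; i < 32; i++) level += spectrum[i];

        material.SetColor("_EmissionColor", baseColor * level * intensity);
    }
}
```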


At this point, the building part of the project was roughly finished. As for the visual presentation, I decided to incorporate VR after learning about VR-related works in previous classes and being recommended this approach in a chat with my tutor. Again, I won’t go into the technical Unity details here. The result was good: VR brought some immersion and interactivity to the project. The next video is a live recording of myself before the presentation.

Reflecting on the project, I think I kept getting caught up in the ‘playability’ of the whole piece without realising it, which led to a slight deviation from the title. But I don’t think this is all bad, as it lets the project keep improving in the direction of ‘musical arrangement’, even if that direction is a very different one. Nowadays we can certainly find professional, sophisticated arranging software that meets the needs of musicians, but there are very few avenues for the general public to arrange music. If there is any way to let people with no musical background produce their own songs while keeping enough playability and interactivity, the approach demonstrated in this project is not a bad way to think about it.