
bUSrOUTes

Updated: Feb 6, 2023

As a result of the Covid-19 pandemic, remote performances have become more normalised. They offer an opportunity for a different experience of live music altogether, but also present new challenges. As technology advances, so does the way we experience live events.


Project bUSrOUTes was born when UK jazz legend Evan Parker was no longer able to travel internationally to perform. We devised a solution that allowed him to perform from a UK venue, using domestic broadband to connect and play with artists outside the UK. Using open-source software, we achieved a low-latency, multi-channel, bilateral audio feed, with an ensemble in Hamburg, Germany and Evan Parker’s ensemble in Faversham, UK performing simultaneously.


The feedback we received from the performers was that it felt like they were performing in the same room together. The only issue was that the projected video feed between the two countries had a considerable delay, so visually the performance was out of sync with the sound.


After participating in a Screen South VR workshop, we began experimenting with motion capture and game engines, creating avatars to animate the performers in real time over a VPN. Thanks to the real-time rendering, we achieved a latency comparable to that of the audio.

We built a small ‘Pepper’s Ghost’ rig that allowed us to project the avatars holographically onto a stage where our local UK musician could perform alongside European musicians in real-time with very low latency.


Our next step is to achieve much lower latency over longer distances between performers, support more complex performance behaviours, extend the time between calibrations, and potentially remove the need for expensive motion capture devices by capturing human movement with simple camera technology that is cheaply available to everyone worldwide.



Benefits of Remote Performances

  • Smaller carbon footprint (no need to travel by air or land)

  • Available to a wider audience

  • Benefits audiences who can’t physically travel to a venue

  • Exposure to talent anywhere in the world

  • Supports performers who find it difficult to be on a live stage in front of an audience

  • Avoids visa issues

  • Reduces cost and travel time

  • More efficient rehearsals

  • A wider network for collaboration

  • Educational advantages

  • Alter egos can be represented through an avatar (benefiting non-binary individuals)


SonoBus


SonoBus is a free, open-source application for streaming high-quality, low-latency, peer-to-peer audio between devices over the internet or a local network. It works as a virtual mixer that allows performers to connect to the same server and play together with minimal latency. During the various lockdowns, SonoBus offered performers across the globe a way to play together despite the restrictions in place.


SonoBus allows us to connect musicians playing in various locations around the world into the same audio feed, which can then be amplified locally for an audience.
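
To give a rough sense of where the latency in a networked audio link comes from, here is a small, illustrative calculation in Python. The buffer size, network delay and jitter figures are assumptions chosen for the example, not settings or measurements from SonoBus or from our performances.

```python
# Illustrative one-way latency budget for networked audio (example figures only).
SAMPLE_RATE = 48_000      # Hz
BUFFER_SAMPLES = 256      # audio buffer size at each end (assumed)

buffer_ms = BUFFER_SAMPLES / SAMPLE_RATE * 1000  # ~5.3 ms per audio buffer
network_ms = 15.0         # assumed one-way network delay between two cities
jitter_ms = 10.0          # assumed safety margin to absorb network jitter

total_ms = 2 * buffer_ms + network_ms + jitter_ms  # send buffer + receive buffer
print(f"Estimated one-way latency: {total_ms:.1f} ms")  # ~35.7 ms in this example
```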




Pepper’s Ghost


One of the oldest traditions in the performing arts is the use of illusion to stage seemingly impossible feats, and the Pepper’s Ghost effect is a classic example; magic has charmed and transported audiences for centuries. The illusion relies on a hidden ‘ghost room’ containing the figure to be projected, which can either be black or a mirror image of the main background. A sheet of glass is positioned in front of the audience, set at a 45-degree angle to both the audience and the ghost. At this angle the background remains clearly visible, but the glass also partially reflects an image of the ghost. To the audience, it appears as though there is a transparent ghost in the scene in front of them.


Credit: https://magic-holo.com/en/peppers-ghost-the-innovation-from-the-19th-century/


Test Rig


We constructed a Pepper's Ghost maquette test rig comprising a toughened-glass screen set at a 45-degree angle, with a scaffold taking the weight of the glass. An LED monitor placed below the glass projects its image up onto it, creating the reflected illusion of a hologram on the performance stage. For the effect to work, the room needs to be dark, with lights positioned on stage and a black backdrop behind the stage.
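
To illustrate the reflection geometry that makes the rig work, here is a small sketch that mirrors a point across a 45-degree glass plane: a pixel on the flat monitor below the glass maps to a point on a vertical virtual plane behind it, which is where the audience perceives the hologram. The coordinate conventions and the 0.5 m monitor distance are assumptions for the example, not measurements from our rig.

```python
import numpy as np

def reflect_point(p, plane_point, plane_normal):
    """Mirror point p across the plane through plane_point with the given normal."""
    n = plane_normal / np.linalg.norm(plane_normal)
    signed_dist = np.dot(p - plane_point, n)
    return p - 2 * signed_dist * n

# Glass tilted at 45 degrees through the origin: its normal sits halfway
# between "up" (+y) and "forward" (-z).
glass_point = np.array([0.0, 0.0, 0.0])
glass_normal = np.array([0.0, 1.0, -1.0])

# A pixel on the monitor lying flat 0.5 m below the glass pivot.
monitor_pixel = np.array([0.0, -0.5, 0.3])

print(reflect_point(monitor_pixel, glass_point, glass_normal))
# -> [ 0.   0.3 -0.5]: the horizontal monitor plane (y = -0.5) becomes a
#    vertical virtual plane (z = -0.5) standing behind the glass.
```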




MVN Animate/MVN Awinda


As the motion capture suits we are using work by combining sensors with a radio receiver, there is no need for an expensive motion capture studio with high-spec cameras. The motion capture suits from Xsens come in different varieties and work in conjunction with Xsens’ own software, MVN Animate. Each sensor is labelled with the body part it is configured for and is attached to a strap fastened around that body part.



Credit: Stefan Ostersjo & Nguyen Thanh Thuy



Blender


Blender is a free and open-source 3D computer graphics software tool used for creating animated films, visual effects, art, 3D models, motion graphics, interactive 3D applications, virtual reality, and, formerly, video games.


As Blender is free to use, with plenty of online tutorials for learning the software, it was the perfect tool for creating our own 3D models of the instruments played during the performance.





Instrument Models


Blender models work seamlessly with the Unity games engine as long as they are exported correctly. The instruments that Stefan and Thuy play are both traditional Vietnamese instruments with a very distinct appearance, so a degree of accuracy was important for the 3D models. We showed the models to Stefan and Thuy prior to the performance; both were satisfied with the appearance of the instrument models and agreed that they looked authentic and accurate.
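
As an indication of the kind of export step involved, the snippet below uses Blender's built-in Python API to export the selected instrument model to FBX with axis and scale options commonly chosen for Unity. The file name and specific option values are illustrative assumptions, not a record of the exact settings we used.

```python
# Run from Blender's Scripting workspace (Blender bundles the "bpy" module).
import bpy

# Select the instrument model(s) in the viewport before running.
bpy.ops.export_scene.fbx(
    filepath="dan_tranh.fbx",               # hypothetical output file
    use_selection=True,                      # export only the selected objects
    apply_scale_options='FBX_SCALE_UNITS',   # keep 1 Blender unit = 1 metre in Unity
    axis_forward='-Z',                       # axis conventions that map cleanly to Unity
    axis_up='Y',
)
```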





Networking


In order to receive the visual data from the performers in their motion capture suits, the data has to be sent over the internet. Initially we attempted to create a multiplayer server that each performer could join so they could perform together in a virtual environment. We successfully ‘spawned in’ two avatar models that could move along the X, Y and Z axes; however, when we tried to animate the avatars using the motion capture devices, the network script could not handle the motion capture data, which was too complex for it.

MVN Animate has a network streamer function that streams the motion capture data to a target host IP address. This led to the idea that if both the sender machine (the performers) and the receiver machine (the audience side) were connected to the same virtual network, we could receive the motion capture data from the performers in Unity and then project the image onto the screen on stage. To achieve this, we needed to set up a VPN with a static IP address that both we and the performers could connect to simultaneously, while allowing data to be sent through a specific port at that address. After thorough research and testing of various VPNs, we found that LogMeIn Hamachi allows motion capture data to be sent through internet ports. It also has a remote-desktop function, which is perfect for setting up and calibrating the performers’ motion capture suits.
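
For a concrete sense of what ‘receiving streamed data on a port’ looks like on the receiver side, here is a minimal Python sketch of a UDP listener. It does not decode Xsens’s actual network-streamer packet format, which the Unity plug-in handles for us; it simply confirms that datagrams are arriving at the receiver’s address. The IP and port values are placeholders, standing in for the VPN address and whatever port is configured in MVN Animate.

```python
import socket

LISTEN_IP = "0.0.0.0"   # listen on all interfaces, including the VPN adapter
LISTEN_PORT = 9763      # placeholder; must match the port set in the network streamer

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)  # UDP datagrams
sock.bind((LISTEN_IP, LISTEN_PORT))
print(f"Listening for motion capture packets on {LISTEN_IP}:{LISTEN_PORT} ...")

while True:
    packet, sender = sock.recvfrom(4096)  # one datagram per read
    # In the real pipeline the Xsens Unity plug-in turns these packets into
    # joint positions and rotations; here we only confirm data is flowing.
    print(f"{len(packet)} bytes received from {sender}")
```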




Data Transmission


Sending and receiving the data via a VPN allows the performers to be located anywhere in the world and lets us set the stage in any location, as long as there is an internet connection. The basic workflow of the data transmission is as follows: the performers first link up to MVN Animate, where they calibrate their avatar models with the motion capture suits. They then enter the IP address of the receiver computer, which is provided by the VPN, into the network streamer in MVN Animate, along with the port address used by Xsens’ Unity stream-reader script, so that we can receive the motion capture data in Unity. With the receiver computer connected to the LED screen, we project the real-time motion capture animations from Unity up onto the Pepper’s Ghost glass screen.



Unity


Xsens has developed MVN Animate plug-ins that work seamlessly with game engines such as Unity and Unreal. The plug-in can be installed straight from their website and contains the generic avatar model along with all the motion capture and streaming scripts, making the initial setup straightforward. We decided to apply a wire-frame shader to the generic model to give it a unique, abstract aesthetic that represents the music being generated during the performance. We felt the instrument models should have a similar appearance to the avatars without looking exactly the same, so we applied a wire-frame shader with a slightly different textural aesthetic, keeping in theme with the avatar models.



First International Performances


Hamburg

Our initial bUSrOUTes performance used an audio feed via SonoBus and a live-streamed video feed using Zoom. This performance was characterised by a special feature: the evening concerts took place simultaneously in front of audiences in Hamburg and the UK, as Evan Parker and Matthew Wright were connected via an audio feed from Faversham to the ensemble in Hamburg. This should not be seen merely as a concession to the coronavirus pandemic but, more importantly, as an exploration of new concert formats and collaborative practices based on recent technological developments. Despite an incredible performance, we felt there was room for improvement with the video feed, as there was a noticeably significant delay between the video and the audio. This led us to research real-time game engines and avatars in place of real-time video feeds for future visual performances.





Sweden

Our first real-time avatar and human performance combined ancient traditional Vietnamese music with a more experimental, improvised interpretation. Stefan Östersjö and Nguyen Thanh Thuy played traditional Vietnamese instruments, the Dan Ty Ba and the Dan Tranh, with Matt Wright live-remixing their audio feeds on electronics and turntables. Stefan and Thuy were located at their home in Sweden, with Matt in Folkestone, UK. They were able to live stream their motion capture data to us in real time with a delay of around 40 to 60 ms. This delay was barely noticeable, and the audio-visual synchronisation was acceptable but not perfect.


For future performances, we would like to reduce the audio-visual latency and refine the interactions with the instrument models by implementing a prop sensor that keeps the instruments in sync with the performers’ bodies. Xsens already has a prop sensor that works with its motion capture kits; however, it is only configured for props held with one hand, such as a sword, not a two-handed prop such as an instrument. This is something we are working with Xsens to help develop.




Conclusion


We are very satisfied with the outcome of the initial performance and are excited about the possibilities this cutting-edge technology opens up. As far as we are aware, we are the first arts organisation to attempt a combined local and remote performance using humans and avatars that are animated in real time rather than pre-recorded, and we are working to build on these breakthroughs with emerging technology and further research.


This project has spawned a variety of new ideas for applying this concept to future performances, such as interpretive dance or other visual art mediums. There is still much more to develop and refine, both technically and conceptually, which is an exciting prospect for us. As bUSrOUTes is not limited by location, there are many possibilities for collaborations among different artists and disciplines.



Technical Improvements


Unreal Engine

To develop and refine this project further, we would first like to switch to the Unreal engine. It offers much higher graphical quality, especially in terms of lighting on the avatars. It also offers node-based visual scripting, which makes it user-friendly, lets us work more quickly, and gives us access to additional plug-ins, providing more versatility when working with a broader range of artists.



Mylar Screen

Our future full-scale performances will use lightweight mylar fabric for the holographic avatar projection, combined with a full stage for the human performers.



Pose detection

Motion capture devices are expensive, so we will be researching the use of simple camera technology, cheaply available to everyone worldwide, to capture body movement.
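
As an indication of the kind of camera-based approach we have in mind, the sketch below uses the freely available MediaPipe and OpenCV Python libraries to estimate body landmarks from an ordinary webcam. It is an illustrative starting point under those assumptions, not the pipeline we have adopted.

```python
# Requires: pip install mediapipe opencv-python
import cv2
import mediapipe as mp

pose = mp.solutions.pose.Pose()   # pre-trained single-person pose estimator
camera = cv2.VideoCapture(0)      # any ordinary webcam

while camera.isOpened():
    ok, frame = camera.read()
    if not ok:
        break

    # MediaPipe expects RGB images; OpenCV captures BGR.
    results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

    if results.pose_landmarks:
        # 33 landmarks (nose, shoulders, elbows, wrists, hips, knees, ...)
        # with normalised x/y coordinates: enough to drive a simple avatar.
        nose = results.pose_landmarks.landmark[0]
        print(f"nose at x={nose.x:.2f}, y={nose.y:.2f}")

camera.release()
```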





What’s Next


We intend to tour our research project in a variety of venues and locations, opening it up for the wider public to experience.


Our next bUSrOUTes performance will involve further development of remote real-time audio performance over a considerably longer distance, some 3,500 miles.

The Evan Parker Trance Map ensemble will involve 13 musicians located in Kent, UK and Brooklyn, New York. The challenge will be to achieve the lowest latency we have managed to date.

Tickets are available by clicking here for Kent, UK and here for Brooklyn, NY.


We will also test other artistic mediums, such as real-time interactive digital visuals and human dance, creating collaborative performances with other artists both locally and internationally.





Collaborators



The Six Tones


The Six Tones is a platform for an encounter between traditional and experimental cultures in Asia and the West. Since 2006, the core of this practice has been an ongoing project of mutual learning between musicians from Vietnam and Sweden. The group plays traditional Vietnamese music in hybrid settings for Western stringed instruments and traditional Vietnamese instruments, improvises in traditional and experimental Western idioms, and commissions new works in collaboration with artists in Asia as well as in other parts of the world.


Credit: http://www.thesixtones.net/?page_id=248



Matt Wright


Matt Wright is a composer, producer and sound designer based in Kent, UK. His output stretches from scores for early music ensembles and contemporary chamber groups to digital improvisation, experimental turntablism and website installations, alongside collaborations with dance, theatre and film. As a performer he works with turntables, laptops and surround sound installations to create post-DJ, multichannel music embracing hip hop, avant and improvised traditions.


Credit: https://www.matt-wright.co.uk/



Evan Parker


Evan Parker is the British jazz saxophone revolutionary who transformed the language and techniques of the instrument in the late 1960s and has since become one of the most admired and influential saxophone improvisers on the planet. Parker has been rewriting the book on the sounds that can be made with a saxophone for almost half a century, developing a remarkable post-Coltrane technique that has allowed him to play counterpoint on what was designed as a single-line instrument, generate electronics-like textures acoustically, and build a personal soundscape that avoids conventional tunes but has its own arresting lyricism. Parker has worked with comparable revolutionaries like John Zorn and Anthony Braxton, and played in experimental electro-acoustic groups and contemporary-classical ensembles, but he has also brought a sharp edge to more orthodox jazz line-ups led by Stan Tracey, Kenny Wheeler and the Rolling Stones’ Charlie Watts, and to the celebrated South African orchestra Brotherhood of Breath. He has also recorded with singer-songwriter Robert Wyatt and TV comic Vic Reeves, and has lent his inimitable sound to the more pop-oriented contexts of Scott Walker, David Sylvian and Jah Wobble.


Credit: Caroline Forbes



Electro-Acoustic Ensemble


The Electro-Acoustic Ensemble was formed in 1990 as a sextet to explore the possibilities of real-time signal processing in an improvising context.

It has grown to an astonishing eighteen-piece chamber orchestra, which performed in Lisbon in August 2010 at the Gulbenkian Foundation for Jazz Em Agosto.


Each stage of the evolution of the group has been documented by ECM records working with producer Steve Lake. As in all other areas of life, the computer now has a vital place at the heart of music making. As the technology develops I hope to keep pace with an ensemble that can exploit the new possibilities that arise.


Credit: https://evanparker.com/electroacoustic.php



Hi3 Network


Hi3 Network is led by London South Bank University in partnership with Canterbury Christ Church University, Creative Folkestone, Maidstone Studios and Screen South. We first discovered the Hi3 Network through attending the ‘Immersive StoryLab’ workshop at Screen South in Folkestone. During the workshop, we conceptualised ideas for how we could implement virtual reality technology to create something new and exciting that also directly benefits the public. These ideas helped us form the visual representation of bUSrOUTes and inspired us to research remote performances using emerging VR technology.




Screen South


Screen South is a project-led organisation focused on promoting new ideas, engaging audiences, and delivering excellence and opportunity for all.


They work with partners across a range of digital and creative projects in arts, heritage and on screen, bringing hidden stories, talent and ideas to the public gaze, and back that up by nurturing good relationships with people and partners locally, regionally, nationally and internationally. They focus on nurturing a diverse range of new talent, with training and progression, as well as generating new productions, showcases and exhibitions.


We were given the opportunity to use one of their rooms at Glassworks, Folkestone, UK to record our first real-time VR performance with Sweden, allowing us to create a stage and Pepper’s Ghost rig.




