Ensemble is a VR experience that places the user in a realistic concert environment with an audience and ensemble. The musician is able to practice performing a piece in front of a simulated audience, to help combat stage fright.
Architecture
The frontend and all visual aspects are implemented using Unity3D and compatible C# scripts. Autodesk Maya is used for modeling and animations.
The backend is coded in Python as a separate companion app. This allows the visual side of the project to run at full capacity without risking computational exhaustion from the numerous probability calculations. The backend repository can be found at: https://github.com/dartmouth-cs98/20w-ensemble-vr-score-following
In your Oculus headset, click on Settings, then See All, and then Experimental Features. Scroll down and enable Hand Tracking as well as Auto Enable Hands or Controllers.
Deployment via Unity3D
Clone this repository into any folder. Then, open the project in Unity! An Oculus Quest is required to run the application, so plug an Oculus Quest into your computer via a USB Type-C cable and enable USB debugging on it.
Go to Build Settings and ensure that the build target is Android. Go to Player Settings and ensure that Quest is set as the target device. Then click Build and Run, save the .apk into any folder you would like, and enjoy!
Connect the Oculus to your computer and use Android File Transfer to move the .apk onto your Oculus.
Run the .apk on your Oculus!
How to Navigate the Game
Once the game is started, point the ray shooting out of your right hand at the "Play" button and pinch your index finger and thumb together.
There is also a tutorial you can use to practice the interactive hand gestures.
Once you begin, you will be taken to the backstage room. There, flip through the music book pages until you find a piece you want. Currently, there are only 3 pieces to choose from (Pachelbel's Canon, Let It Go, and Beethoven's Symphony No. 9).
Once you have flipped to the right piece, turn right and pinch on the door to go to the main stage.
Pinch on the green play button on the podium to start the music.
Practice to your heart's content!
The Beethoven 5th Symphony (the last piece in the book) is a short 10-second snippet, so choose it for a quick demonstration!
Notes on Score-Following
In the Oculus game, you can change the accompaniment from static (doesn't follow you) to score-following. To use score-following, you must have the Python server running. Then, you must type the server's IP address into the Oculus game itself. Currently, the score-following mode works, but the response is delayed by roughly 0.5 seconds, which makes for an awkward experience.
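The exact wire protocol is defined by the score-following repo, but the round trip the Oculus app performs can be sketched as a simple UDP exchange. The message fields below (`frame`, `beat`) are hypothetical placeholders, not the project's actual protocol:

```python
import json
import socket
import threading

def serve_once(sock):
    """Toy stand-in for the Python server: echo back a score position."""
    data, addr = sock.recvfrom(1024)
    request = json.loads(data.decode())
    reply = {"beat": request["frame"] * 0.5}  # placeholder position estimate
    sock.sendto(json.dumps(reply).encode(), addr)

def query_server(server_addr, frame):
    """What the Unity client would do: send audio features, await a position."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as client:
        client.settimeout(1.0)
        client.sendto(json.dumps({"frame": frame}).encode(), server_addr)
        data, _ = client.recvfrom(1024)
        return json.loads(data.decode())

server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))           # OS picks a free port
addr = server.getsockname()
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

position = query_server(addr, frame=4)
print(position)                          # {'beat': 2.0}
server.close()
```

UDP (rather than TCP) keeps per-message overhead low, which matters given the ~0.5 s delay already observed.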
Use the link to the score-following repo above and follow the directions in its README to set up the Python server.
Improve the user interface and user experience by making users aware that they are interacting with objects (e.g., a sound when you pinch, cues showing where to pinch).
Currently, the game only supports hand interactions in the form of touching objects with the fingertips. However, objects are often out of reach of the player's play area, so we need a way to interact with objects from a distance. This involves ray projections and hand-gesture recognition to understand when a user wants to interact with an object and when they do not.
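Unity's `Physics.Raycast` handles the actual ray casting, but the geometry behind pointing at a distant object is worth seeing. This Python sketch tests whether a ray from the hand hits a spherical interaction target (all names and values are illustrative):

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Return True if a ray (origin + t*direction, t >= 0) hits a sphere.
    Unity's Physics.Raycast does this (and more) natively; this sketch
    only shows the math behind pointing at a distant object."""
    # Vector from ray origin to sphere center
    oc = [c - o for o, c in zip(origin, center)]
    # Normalize the pointing direction
    norm = math.sqrt(sum(d * d for d in direction))
    d = [x / norm for x in direction]
    # Project oc onto the ray to find the closest approach
    t = sum(a * b for a, b in zip(oc, d))
    if t < 0:
        return False  # object is behind the hand
    closest_sq = sum(x * x for x in oc) - t * t
    return closest_sq <= radius * radius

# Pointing straight ahead at a button 5 m away with a 0.2 m radius
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0.1, 5), 0.2))  # True
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (1, 0, 5), 0.2))    # False
```

The gesture-recognition half (deciding *when* the user intends to interact) would gate whether this ray is cast at all, e.g. only while a pinch is held.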
In order to provide a realistic and immersive experience to musicians, it is critical to have reliable 3D models. This sprint focuses on building the scaffolding for the 3D guitar model, focusing on the guitar body. This also includes learning the basics of Maya.
The current instrument models are crude and overly simplified. This task is to add more faces and vertices to the mesh, plus more accurate textures, to better mimic the real instruments.
Write a script to change the tempo of an audio source in Unity without affecting pitch. This should prove useful if we decide to adopt a tempo adjustment approach in our accompaniment.
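Note that Unity's `AudioSource.pitch` changes speed and pitch together; keeping pitch constant requires time-stretching. A minimal Python sketch of the core idea, a naive overlap-add (OLA) stretch, is below. A production version (WSOLA or a phase vocoder) would also align grain boundaries to avoid artifacts:

```python
import math

def time_stretch(samples, rate, grain=256, overlap=128):
    """Naive overlap-add (OLA) time stretch: steps through the input at a
    different rate than the output, so duration changes by `rate` while the
    waveform inside each grain -- and hence the perceived pitch -- does not."""
    hop_in = int((grain - overlap) * rate)   # step through the input
    hop_out = grain - overlap                # step through the output
    out = [0.0] * (int(len(samples) / rate) + grain)
    pos_in, pos_out = 0, 0
    while pos_in + grain <= len(samples):
        for i in range(grain):
            # Triangular window cross-fades overlapping grains
            w = 1.0 - abs(2.0 * i / grain - 1.0)
            out[pos_out + i] += w * samples[pos_in + i]
        pos_in += hop_in
        pos_out += hop_out
    return out[:int(len(samples) / rate)]

# Stretch a test tone to half speed: twice as many samples out
tone = [math.sin(2 * math.pi * 0.05 * n) for n in range(4096)]
slow = time_stretch(tone, rate=0.5)
print(len(tone), len(slow))  # 4096 8192
```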
Write and test a script that plays a tone given a key, duration, and instrument. It should be fairly simple to lessen the load on the Oculus Quest. Make use of the audio components in Unity.
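A sketch of the tone-generation half in Python (the Unity version would write these samples into an `AudioClip` instead of returning a list; the instrument table and its harmonic weights are invented for illustration):

```python
import math

SAMPLE_RATE = 44100

# Hypothetical timbres: each "instrument" is a list of harmonic amplitudes.
INSTRUMENTS = {
    "sine": [1.0],
    "organ_like": [1.0, 0.5, 0.25],
}

def note_frequency(midi_key):
    """MIDI key number to frequency in Hz (A4 = key 69 = 440 Hz)."""
    return 440.0 * 2 ** ((midi_key - 69) / 12)

def render_tone(midi_key, duration, instrument):
    """Return raw samples for a tone of the given key, duration, and timbre."""
    freq = note_frequency(midi_key)
    harmonics = INSTRUMENTS[instrument]
    n = int(SAMPLE_RATE * duration)
    return [
        sum(a * math.sin(2 * math.pi * freq * (k + 1) * t / SAMPLE_RATE)
            for k, a in enumerate(harmonics))
        for t in range(n)
    ]

samples = render_tone(69, duration=0.1, instrument="sine")
print(len(samples), round(note_frequency(69)))  # 4410 440
```

Precomputing tones like this, rather than synthesizing on the fly, is what keeps the load on the Quest light.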
Building on top of the initial setup and objects, this issue involves adding scripts that handle muting and pausing of the scene, focused on helping the user adjust to the interface.
This issue is to continue studying the tempo-tracking literature to find a paper with a tested algorithm, and then figure out how best to implement that algorithm.
For week 1, I used a toy scene to work on the concert hall architecture. The toy scene is separate from the frontstage scene, so I will have to integrate my work from the toy scene into the frontstage scene.
Set up real-time Oculus integration so that when I edit a script or add an object, I can see the changes in real time. This will be useful for all the upcoming issues, especially hand gesture detection.
Right now, each component of the backstage acts independently from the rest of the game. This task is to have the music selection, instrument selection, and back-to-front stage doorknob integrate with the game.
Several features support the user's immersion while playing along with Ensemble. Basic motion features allow the user to move around the stage. These actions are incorporated into the 'pause' menu, which lets users 'restart' or 'move backstage'. The pause menu itself is triggered by a button that is easily visible to the user.
Have music playing in the background that pauses when the user triggers it; "replay" and "start" buttons then pop up and perform their respective functions.
This issue is to use the BPM value obtained from the tempo tracking script and translate it into a speed multiplier that can be applied directly to a .wav or .mp3 file.
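The conversion itself is a ratio of the detected tempo to the tempo the recording was made at. A minimal sketch (function and parameter names are illustrative):

```python
def speed_multiplier(detected_bpm, reference_bpm):
    """Playback speed for the accompaniment file: >1 speeds up, <1 slows down.
    reference_bpm is the tempo the .wav/.mp3 was originally recorded at."""
    if detected_bpm <= 0 or reference_bpm <= 0:
        raise ValueError("BPM values must be positive")
    return detected_bpm / reference_bpm

# A performer playing at 132 BPM against a 120 BPM recording -> 1.1x speed
print(speed_multiplier(132, 120))  # 1.1
```

In Unity, this multiplier maps directly onto a playback-speed setting, with the caveat noted in the tempo-change issue that naive speed changes also shift pitch.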
Several instruments have been roughly modeled. This issue focuses on refining those instruments (the guitar, specifically its strings) so that they are ready to be staged within a week.
Write a script that holds a collection of note playback objects dedicated to specific instruments and calls those objects when prompted to do so. This functions as the controller for accompaniment playback.
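The controller pattern can be sketched as follows; in Unity both classes would be MonoBehaviours wrapping AudioSources, and all names here are illustrative:

```python
class NotePlayback:
    """Stand-in for a per-instrument playback object; the real version
    would emit audio instead of logging the note event."""
    def __init__(self, instrument):
        self.instrument = instrument
        self.log = []

    def play(self, key, duration):
        self.log.append((key, duration))


class AccompanimentController:
    """Holds one playback object per instrument and dispatches note
    events to the right one when prompted."""
    def __init__(self, instruments):
        self.players = {name: NotePlayback(name) for name in instruments}

    def play_note(self, instrument, key, duration):
        self.players[instrument].play(key, duration)


controller = AccompanimentController(["violin", "cello"])
controller.play_note("violin", 69, 0.5)
controller.play_note("cello", 48, 1.0)
print(controller.players["violin"].log)  # [(69, 0.5)]
```

Keeping one playback object per instrument means the score-following logic only ever talks to the controller, not to individual audio sources.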
This is an example of a chair in which the virtual audience sits. This has to be set up correctly to portray an accurate representation of how a real audience is seated in a performance.
I am able to fully understand and implement one of the many score following/tempo tracking algorithms described in online research papers
I am able to show Ensemble to musician peers, and see that this application not only works but is relevant and desirable.
I release this application on the VR market for the masses to download and try out.
Challenges
A lot of the research papers online use difficult technical jargon and concepts, which I am unfamiliar with.
This application may be difficult to release officially because it is targeted at such a specific audience of soloist musicians.
It will be hard to user-test our application because not many people have access to VR headsets. It will be even harder to find musician peers with VR headsets.
During Technigala, we found that the user interaction with the backstage was unintuitive. This issue is to figure out a way to make it more seamless and easy for the user.
Currently, the backstage scene functionality is completed. A user is able to select a piece, select an instrument, move around in the scene, and move on to the front stage. However, the interactions feel clunky and unintuitive. This issue is to touch up the interactions with the music book, instruments, movement tiles, etc. to make a more immersive and intuitive interaction experience.
First, familiarize yourself with the basics of MIDI, WAV, and how Unity handles music. Then, import sample classical music files into the main stage scene in Unity. This scene begins after the backstage. When the music is over, an ending UI pops up to tell the user that they can either replay or choose new music.
To help with merge conflicts, this issue is to split each of our scenes into multiple portions so that more than one person can work on a room at once. For example, the front stage could be split into a front stage models scene and a front stage UI scene.
In order to provide a realistic and immersive experience to musicians, it is critical to have reliable 3D models. This sprint focuses on finishing the design of the 3D guitar model. This also includes having a firm grasp of the basics of Maya.
Create the beginning scene, linked to one of the buttons on the start screen.
Here, the program will guide you through how to navigate through different scenes and user interfaces, focusing especially on shooting rays and pointing.
This issue focuses on the initial setup of the scene: creating its objects and the order of the tutorial.
This issue is so that we can use an external program to calculate tempo/score tracking, and feed the information to the Oculus, rather than doing calculations on the Oculus itself.
Currently, the backstage scene functionality is completed. A user is able to select a piece, select an instrument, move around in the scene, and move on to the front stage. However, the implementation looks clunky and unappealing. This issue is to create a more realistic backstage environment with more detail, to be more immersive.
This issue is to see if it is viable to animate character models using Unity's rigging tools, rather than using external software like Blender to animate the characters. If possible, this could streamline the syncing of the animations to the music.
Once the user is done playing, several buttons pop up to allow the user to play again or choose another piece or instrument. This is similar to the main stage UI, but it must detect that the music is over.
Establish a set of variables that hold throughout the different scenes in the project. These should keep track of the number of audience and orchestra models desired in the front stage, the type of score-following model to be used, etc.
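In Unity this is typically a static class or a `DontDestroyOnLoad` object; the same idea can be sketched in Python as a singleton whose state survives across "scene loads". The field names below are illustrative, not the project's actual settings:

```python
class GameSettings:
    """Singleton-style settings shared across scenes. Any scene that
    constructs GameSettings() gets the same underlying instance."""
    _instance = None

    def __new__(cls):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            # Illustrative defaults, not the project's real values
            cls._instance.audience_count = 50
            cls._instance.orchestra_count = 12
            cls._instance.score_following_mode = "static"
        return cls._instance


# Both "scenes" see the same object, so a choice made backstage
# is still visible on the front stage.
backstage = GameSettings()
backstage.score_following_mode = "score-following"
front_stage = GameSettings()
print(front_stage.score_following_mode)  # score-following
```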
This issue is for an alternative approach to the interactable music. Since score-following is a very complex subject, implementing a simpler tempo-tracking algorithm may be more reasonable.
Currently, the backstage environment consists of a basic wall with brick texturing, one table, and one door. However, a real backstage is much more complex. This task is to add features such as signs, props, other people, and general texture details to make a more realistic backstage environment.