
20w-ensemble's Introduction

Ensemble

  • Ensemble is a VR experience that places the user in a realistic concert environment with an audience and ensemble. The musician is able to practice performing a piece in front of a simulated audience, to help combat stage fright.

Architecture

  • The frontend and all visual aspects are implemented using Unity3D and compatible C# scripts. Autodesk Maya is used for modeling and animations.
  • The backend is written in Python as a separate companion app. This allows the visual side of the project to run at full capacity, without the computational load of the numerous probability calculations. The backend repository can be found at: https://github.com/dartmouth-cs98/20w-ensemble-vr-score-following

Setup

Deployment via Unity3D

  • Clone this repository into any folder, then open the project in Unity. An Oculus Quest is required to run the application, so plug an Oculus Quest into your computer via a USB Type-C cable and enable USB debugging on the headset.
  • Go to Build Settings and ensure that the build target is Android. Go to Player Settings and ensure that Quest is set as the target device. Then click Build and Run, save the .apk into any folder you would like, and enjoy!

Deploying the .apk
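If you already have a built .apk and just want to push it to the headset without going back through Unity, the standard adb sideloading workflow is roughly the following. This is a sketch that assumes a connected Quest with developer mode enabled; `ensemble.apk` is a placeholder for whatever filename you saved during Build and Run.

```shell
# Confirm the headset is visible over USB (developer mode + USB debugging on).
adb devices

# Sideload the build; -r replaces any previously installed copy.
# "ensemble.apk" is a placeholder for the filename you chose above.
adb install -r ensemble.apk
```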

How to Navigate the Game

  • Once the game is started, point the ray shooting out of your right hand at the "Play" button and pinch your index finger and thumb together.
  • There is also a tutorial you can use to practice the interactive hands.
  • Once you begin, you will be taken to the backstage room. There, flip through the music book pages until you find a piece you want. Currently, there are only three pieces to choose from (Pachelbel's Canon, Let It Go, and Beethoven's Symphony No. 9).
  • Once you have flipped to the right piece, turn right and pinch on the door to go to the main stage.
  • Pinch on the green play button on the podium to start the music.
  • Practice to your heart's content!
  • The Beethoven 5th Symphony (last piece in the book) is a short, 10-second snippet, so choose this for a quick demonstration!

Notes on Score-Following

  • In the Oculus game, you can change the accompaniment from static (doesn't follow you) to score-following. To use score-following, the Python server must be running, and you must enter the server's IP address in the Oculus game itself. Currently, score-following mode works, but the response is delayed by around 0.5 seconds, which makes for an awkward experience.
  • Use the link to the score-following repo above and follow the directions in its README to set up the Python server.
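Part of that ~0.5 second delay is network round trip to the Python server, so it helps to bound the network contribution separately from the algorithm. The sketch below measures raw UDP round-trip time, using a local echo thread as a stand-in for the real server; all names are illustrative, and the actual game protocol may differ.

```python
import socket
import threading
import time

def run_echo_server(sock):
    """Echo one datagram back to its sender (stand-in for the score-following server)."""
    data, addr = sock.recvfrom(1024)
    sock.sendto(data, addr)

def measure_rtt(host="127.0.0.1"):
    """Send one probe datagram to a local echo server and time the round trip."""
    # Server socket on an OS-assigned port.
    server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server.bind((host, 0))
    port = server.getsockname()[1]
    threading.Thread(target=run_echo_server, args=(server,), daemon=True).start()

    # Client: send a probe and wait for the echo.
    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    client.settimeout(2.0)
    start = time.perf_counter()
    client.sendto(b"ping", (host, port))
    client.recvfrom(1024)
    rtt = time.perf_counter() - start
    client.close()
    server.close()
    return rtt

print(f"round-trip: {measure_rtt() * 1000:.2f} ms")
```

Anything left over after subtracting RTT from the observed 0.5 s points at the audio capture and inference pipeline rather than the network.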

Authors

  • James Lee
  • Bryan Shin
  • Marshall Peng
  • Soohwan Park
  • Myles Holt
  • Ryan Hyun

Acknowledgments


20w-ensemble's Issues

Start Screen Revamp: UI

Work on a new design and layout for Start Screen UI. Right now, it is not very aesthetically pleasing.

Audience Clapping Sync

This issue is to create and sync the audience clapping sounds with the start and end of the performance.

User Interface Epic

From the start interface, to settings, to the end interface, this epic is responsible for guiding the user through the program from start to end.

Implement Various Hand Gestures for Interactions

Currently, the game only supports hand interactions in the form of touching objects with the fingertips. However, objects are often out of reach of the player's play area, so we need a way to interact with objects from a distance. This involves ray projections and hand-gesture recognition to understand when a user wants to interact with an object and when they do not.

Guitar Body Modeling

In order to provide a realistic and immersive experience to musicians, it is critical to have reliable 3D models. This sprint focuses on building the scaffolding for the 3D guitar model, focusing on the guitar body. This also includes learning the basics of Maya.

Refine Instrument Models

The current instrument models are crude and overly simplified. This task is to add more faces and vertices to the meshes, along with more accurate textures, to better mimic the instruments.

Implement Tempo Change

Write a script to change the tempo of an audio source in Unity without affecting pitch. This should prove useful if we decide to adapt a tempo adjustment approach in our accompaniment.
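Unity's `AudioSource.pitch` changes playback speed and pitch together, so a pitch-preserving tempo change needs some form of time-stretching (granular overlap-add, or a phase vocoder for higher quality). As a sketch of the underlying DSP idea only, not the Unity script itself, here is a minimal overlap-add stretch in Python; the function name and parameters are illustrative.

```python
import math

def time_stretch(samples, rate, grain=1024, overlap=256):
    """Naive granular overlap-add time stretch.

    rate > 1 speeds the audio up (shorter output), rate < 1 slows it down.
    Grains are copied at their original playback rate, so pitch is roughly
    preserved; only the stride through the input changes.
    """
    hop_out = grain - overlap
    hop_in = max(1, int(hop_out * rate))  # read stride through the input
    out = []
    pos = 0
    while pos + grain <= len(samples):
        g = samples[pos:pos + grain]
        if out:
            # Linear cross-fade over the overlapping region to hide grain seams.
            for i in range(overlap):
                w = i / overlap
                out[-overlap + i] = out[-overlap + i] * (1 - w) + g[i] * w
            out.extend(g[overlap:])
        else:
            out.extend(g)
        pos += hop_in
    return out

# Example: slow a short 220 Hz tone to half speed; the cycle length
# (and therefore the pitch) inside each grain is unchanged.
sr = 8000
tone = [math.sin(2 * math.pi * 220 * n / sr) for n in range(sr // 2)]
slow = time_stretch(tone, rate=0.5)
```

A production version would use windowed grains and waveform-similarity alignment (WSOLA) to avoid phasing artifacts, but the stride-versus-copy separation above is the core of any tempo-without-pitch approach.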

Implement General Note Playback Script

Write and test a script that plays a tone given a key, duration, and instrument. It should be kept fairly simple to lessen the load on the Oculus Quest. Make use of the audio components in Unity.
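The core of such a script is mapping a key to a frequency: in equal temperament, MIDI note n corresponds to 440 · 2^((n−69)/12) Hz. A minimal Python sketch of the idea follows (the real playback script would be a Unity C# component, and would swap the sine for an instrument sample); function names here are our own.

```python
import math

def midi_to_freq(note):
    """Equal-temperament frequency; MIDI note 69 = A4 = 440 Hz."""
    return 440.0 * 2.0 ** ((note - 69) / 12.0)

def render_tone(note, duration, sample_rate=44100, amplitude=0.5):
    """Render a plain sine tone for a MIDI note and a duration in seconds."""
    freq = midi_to_freq(note)
    n_samples = int(duration * sample_rate)
    return [amplitude * math.sin(2 * math.pi * freq * i / sample_rate)
            for i in range(n_samples)]

# Middle C (MIDI 60) for a tenth of a second.
samples = render_tone(60, 0.1)
```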

Study Tempo-Tracking

This issue is to continue studying the tempo-tracking literature to find a paper with a tested algorithm, and then figure out how best to implement it.
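A simple baseline worth having before diving into the literature is estimating tempo from inter-onset intervals (IOIs). The sketch below is our own illustration of that baseline, not an algorithm from any particular paper; it assumes onsets fall mostly on the beat.

```python
import statistics

def estimate_bpm(onset_times):
    """Estimate tempo from a list of note onset times in seconds.

    Uses the median inter-onset interval, which is robust to a few
    missed or spurious onsets.
    """
    intervals = [b - a for a, b in zip(onset_times, onset_times[1:])]
    return 60.0 / statistics.median(intervals)

# Onsets every 0.5 s correspond to 120 BPM.
print(estimate_bpm([0.0, 0.5, 1.0, 1.5, 2.0]))  # 120.0
```

Real tempo trackers refine this with onset detection from audio and probabilistic smoothing of the tempo estimate over time, which is where the literature search comes in.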

Integrate Architecture from Toy Frontstage

For week 1, I used a toy scene to work on the concert hall architecture. The toy scene is separate from the frontstage scene, so I will have to integrate my work from the toy scene into the frontstage scene.

Real-Time Oculus Running

Set up real-time Oculus integration so that when I edit a script or add an object, I can see the changes in real time. This will be useful for all the upcoming issues, especially hand gesture detection.

Integrate Backstage Interaction with Rest of Game

Right now, each component of the backstage acts independently from the rest of the game. This task is to have the music selection, instrument selection, and back-to-front stage doorknob integrate with the game.

Main Stage Basic UI

There are several features that support the user's immersive experience while playing along with Ensemble. There are basic motion features that allow the user to move around the stage. These actions are incorporated within the 'pause' menu, which allows users to 'restart' or 'move backstage'. The pause itself is triggered by another button that is easily visible to the user.

  • Have music playing in the background that pauses when the user triggers it; "replay" and "start" buttons then pop up and perform their functions.

Refine Instrument Models

Several instruments have been roughly modeled. This issue focuses on refining those instruments (the guitar in particular, especially its strings) so that they are ready to be staged in a week.

Integrate the Note Playback with Accompaniment

Write a script that holds a collection of note playback objects dedicated to specific instruments and calls those objects when prompted to do so. This functions as the controller for accompaniment playback.

Tutorial Design/UI,UX

Refine the tutorial design, including the background: not a stage in the middle of nowhere, but one set on a field.

Audience seating model

This is an example of a chair in which the virtual audience sits. This has to be set up correctly to portray an accurate representation of how a real audience is seated in a performance.

Bryan Shin | Personal Success

Bryan Shin | Personal Goals

What does success for this class mean to you?

  • I am able to fully understand and implement one of the many score-following/tempo-tracking algorithms described in online research papers.
  • I am able to show Ensemble to musician peers, and see that this application not only works but is relevant and desirable.
  • I release this application on the VR market for the masses to download and try out.

Challenges

What are some challenges you foresee?

  • A lot of the research papers online use difficult technical jargon and concepts, which I am unfamiliar with.
  • This application may be difficult to release officially because it is targeted at such a specific audience of soloist musicians.
  • It will be hard to user-test our application because not many people have access to VR headsets. It will be even harder to find musician peers with VR headsets.

Animate the Instruments

This issue is to animate and sync the movement of the instruments with the orchestra member models.

Refine User Interaction in Backstage

During Technigala, we found that the user interaction with the backstage was unintuitive. This issue is to figure out a way to make it more seamless and easy for the user.

Refine User Interactions with the Backstage

Currently, the backstage scene functionality is complete. A user is able to select a piece, select an instrument, move around in the scene, and move on to the front stage. However, the interactions feel clunky and unintuitive. This issue is to touch up the interactions with the music book, instruments, movement tiles, etc., to make a more immersive and intuitive interaction experience.

Tutorial

This Epic is for the tutorial scene that teaches a user the basic steps to navigating the application.

MIDI Implementation

First, familiarize yourself with the basics of MIDI, WAV, and how Unity handles music. Then, import sample classical music files into the main stage scene in Unity. Playback begins after the backstage scene. When the music ends, an ending UI pops up letting the user either replay or choose new music.
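For the MIDI basics: a Standard MIDI File begins with an "MThd" chunk, consisting of a 4-byte tag, a 4-byte big-endian length (always 6), and then 16-bit format, track-count, and division fields. A minimal parse of just that header, as a self-contained sketch:

```python
import struct

def parse_midi_header(data):
    """Parse the 14-byte MThd header chunk of a Standard MIDI File."""
    tag, length = struct.unpack(">4sI", data[:8])
    if tag != b"MThd" or length != 6:
        raise ValueError("not a Standard MIDI File header")
    fmt, ntracks, division = struct.unpack(">HHH", data[8:14])
    return {"format": fmt, "tracks": ntracks, "division": division}

# Synthetic header: format-1 file, 2 tracks, 480 ticks per quarter note.
header = b"MThd" + struct.pack(">IHHH", 6, 1, 2, 480)
print(parse_midi_header(header))
```

Track chunks ("MTrk") with the actual note events follow the header; for Unity playback, a MIDI import asset or pre-rendered WAV files sidestep parsing those by hand.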

Split Up Scenes

To help with merge conflicts, this issue is to split up each of our scenes into multiple portions so that more than one person can work on a room at once. For example, the front stage could be split up into a front stage models scene and a front stage UI scene.

Soohwan Park | Personal Success

Soohwan Park | Personal Goals

What does success for this class mean to you?

  • I am very comfortable with Unity and C# to the level where I can dive into VR game development afterwards
  • I get Hand Tracking working because that's one of the newest experimental features on Oculus that will be widely used from here on out
  • I know what a user-friendly UI looks like in VR after seeing users smoothly run through the full extent of the environment
  • I know how to manipulate music in VR

Challenges

What are some challenges you foresee?

  • Potential merge conflicts hindering our progress
  • Getting behind on where we want to be and not having a solid trajectory

Finish Guitar Modeling

In order to provide a realistic and immersive experience to musicians, it is critical to have reliable 3D models. This sprint focuses on finishing the design of the 3D guitar model. This also includes having a firm grasp of the basics of Maya.

Start Creating Tutorial Scene

Create the beginning scene, linked to one of the buttons on the start screen.
Here, the program will guide you through how to navigate the different scenes and user interfaces, focusing especially on shooting rays and pointing.
This issue focuses on the initial setup of the scene: creating the objects and the order of the tutorial.

Further Refine Backstage Appearance

Currently, the backstage scene functionality is complete. A user is able to select a piece, select an instrument, move around in the scene, and move on to the front stage. However, the implementation looks clunky and unappealing. This issue is to create a more realistic and detailed backstage environment, to be more immersive.

Instrument Database Expansion

Generate and load more samples into the project. Currently, there are only stringed instrument samples. Use many different VSTs for this.

Texturize Architecture

The current architecture looks dull due to the solid colors. Adding textures will make the architecture look more realistic.

Study Score-Following

This issue is to study the literature on score-following so that I can next try to implement the theories.

Implement Basic Character Animation in Unity

This issue is to see if it is viable to animate character models using Unity's rigging tools, rather than using external software like Blender to animate the characters. If possible, this could streamline the syncing of the animations to the music.

Ending UI

Once the user is done playing, several buttons pop up to allow the user to play again or choose another piece or instrument. This is similar to the main stage UI, but it must detect that the music is over.

Persistent Changes

Establish a set of variables that hold throughout the different scenes in the project. These should keep track of the number of audience and orchestra models desired in the front stage, the type of score-following model to be used, etc.

Ryan Hyun | Personal Success

Ryan Hyun | Personal Goals

What does success for this class mean to you?

  • I contribute to a meaningful product that can enhance other people's quality of life.
  • I learn how to use new technologies and become better at managing my work.
  • I become more industrious and able to handle unfamiliarity and stress in a technical or personal context.

Challenges

What are some challenges you foresee?

  • My lack of experience with many technologies could mean lots of learning on the fly.
  • I may want to bring the project in directions that may contrast with the views of other members of the team.
  • In a time crunch, I tend to have trouble getting things done, so I need to finish my work on time and be efficient.

Study Tempo-Tracking

This issue is for an alternative approach to the interactable music. Since score-following is a very complex subject, implementing a simpler tempo-tracking algorithm may be more reasonable.

Refine the Backstage Virtual Environment

Currently, the backstage environment consists of a basic wall with brick texturing, one table, and one door. However, a real backstage is much more complex. This task is to add features such as signs, props, other people, and general texture details to make a more realistic backstage environment.

Generate and Upload Sound Files

Create sound files for necessary instruments such as strings, winds, etc. Upload these to Unity as assets for use in note playback.
