

polaris~ - Developing open-source multisensory AR instruments / experiences.

If an AR system can be thought of as one that combines real and virtual processes, is interactive in real time, and is registered in three dimensions, why do the majority of AR applications rely primarily on visual displays of information? I propose a practice-led compositional approach to developing multisensory AR experiences, arguing that, as a medium combining real and virtual multisensory processes, AR must be explored with a multisensory approach.

This project uses the open-source Project North Star HMD from Leap Motion alongside bone-conduction headphones to deliver spatialised audio-visual experiences via Unity. My experience of creating and developing this project is documented on my website under polaris~ in the projects section.

This repository is a fork of Project Esky that includes:

  • Project Esky: a software companion for the Project North Star open-source AR headset that allows developing Unity scenes with MRTK/Leap assets.
  • LibPdIntegration: a wrapper for libpd that allows Pure Data patches to be embedded in Unity.
  • Automatonism: a library of Pure Data Vanilla patches that emulate the modules of a modular synthesizer.
  • A set of example scripts and scenes that use the above to demonstrate possible interactions between head/hand tracking and patch parameters in Pd, with the chief aim of creating a set of expressive multisensory AR instruments / experiences.

Features

Hardware features

  • Six degrees-of-freedom head tracking (3D position and orientation) via the Intel T261
  • 90 fps hand tracking with a 170° field of view via Ultraleap
  • Single-piece optical combiner allowing up to 110° horizontal FoV
  • 2x 120 Hz displays (one per eye) for a combined resolution of 2880x1600
  • 2x 3-metre cables (1x miniDP, 1x USB-A 3.1)
  • Spatial audio AR (hearing localised virtual sound whilst still hearing your real acoustic environment) via Unity3D and Aftershokz Aeropex bone conduction headphones.

Engine (Unity3D / Project Esky) features

  • The ability to create 3D scenes containing 'GameObjects', which can have visual attributes such as 3D meshes, material colours and textural properties; physical attributes such as edges, position, mass and velocity; and real-time parameterisation via C# scripting.
  • Thanks to Project Esky, the headset is represented as a GameObject with real-time position / orientation.
  • Thanks to Leap Motion, hands (all the way down to individual finger joints) are represented as GameObjects with real-time position / orientation relative to the headset (see the sketch after this list).
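
To make this concrete, here is a minimal, hypothetical sketch of a script that reads the headset pose from the scene camera and iterates over the tracked hands exposed by the Ultraleap Unity plugin. It assumes a LeapProvider component is present in the scene (as in the Esky/MRTK example rigs); exact type and property names vary between plugin versions, so treat it as a sketch rather than a drop-in component.

```csharp
// Hypothetical sketch: logging head and hand poses in a polaris~/Esky scene.
// Assumes the Ultraleap Unity plugin is installed and a LeapProvider exists in
// the scene; property names may differ between plugin versions.
using UnityEngine;
using Leap;
using Leap.Unity;

public class PoseLogger : MonoBehaviour
{
    public LeapProvider leapProvider;   // assign in the Inspector

    void Update()
    {
        // Esky anchors the tracked headset to the scene camera, so the camera
        // transform gives the real-time head position / orientation.
        Transform head = Camera.main.transform;

        // Each tracked hand (with per-finger joint data) is exposed per frame.
        Frame frame = leapProvider.CurrentFrame;
        foreach (Hand hand in frame.Hands)
        {
            Debug.Log($"{(hand.IsLeft ? "Left" : "Right")} palm at {hand.PalmPosition}, " +
                      $"head at {head.position} (rotation {head.rotation.eulerAngles})");
        }
    }
}
```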

Audio (LibPdIntegration/Pd) features

  • LibPdIntegration uses native Unity3D audio spatialisation, which means a GameObject can output the signal of a Pd patch whilst moving, rotating and scaling. Because the AudioListener is anchored to the real-time headset position, these effects are perceived immediately: for example, the volume of a Pd patch whose signal is emitted from a GameObject in space is automatically scaled by its distance from the participant's head (quieter as it moves further away, louder as it is brought closer).
  • LibPdIntegration can 'instance' Pd patches, meaning one patch can be used on multiple GameObjects while each copy keeps its own internal state (for example, independent random processes), because they are technically different 'instances' of the patch.
  • Pure Data enables extended audio techniques through an extensive library of algorithmic 'objects' that create and manipulate audio signals.
  • LibPdIntegration allows real-time parameter control in Unity of any object in a Pd patch via "receive" objects and a corresponding C# send method (see the sketch after this list).
  • The combination of "Play Mode" toggling in Unity and the quick visual patching style of Pure Data means that audio-visual interactions can be prototyped very rapidly.
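
As a hedged illustration of the "receive"-object workflow above, the sketch below maps the distance between a hand transform and a sounding GameObject onto a filter cutoff inside a Pd patch. LibPdInstance and SendFloat come from LibPdIntegration; the "cutoff" receiver name, the 0–1 m range and the 200–2000 Hz mapping are illustrative assumptions rather than part of polaris~ itself.

```csharp
// Illustrative sketch: driving a [receive cutoff] object in a Pd patch from hand
// proximity. Assumes a LibPdInstance (from LibPdIntegration) and a spatialised
// AudioSource sit on the same GameObject; the receiver name and mapping range
// are invented for this example.
using UnityEngine;

public class HandToCutoff : MonoBehaviour
{
    public LibPdInstance pdPatch;   // the instanced Pd patch on this GameObject
    public Transform hand;          // e.g. a palm transform from the Leap rig

    void Start()
    {
        // Full 3D spatialisation so Unity attenuates and pans the patch's signal
        // relative to the AudioListener anchored to the headset.
        GetComponent<AudioSource>().spatialBlend = 1f;
    }

    void Update()
    {
        // Closer hand -> brighter sound: map 0–1 m of distance to 2000–200 Hz.
        float distance = Vector3.Distance(hand.position, transform.position);
        float cutoff = Mathf.Lerp(2000f, 200f, Mathf.Clamp01(distance));
        pdPatch.SendFloat("cutoff", cutoff);   // picked up by [receive cutoff] in the patch
    }
}
```

With "Play Mode" toggling in Unity and Pd's visual patching, a mapping like this can be auditioned and tweaked very quickly.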

Caveats

Although there is a great deal more that could be done to ensure equity of computational power around the planet, polaris~ strives to use completely open-source, free and, where possible, cross-platform elements, in order to make cutting-edge AR technologies accessible to those who often can't afford the mainstream consumer alternatives. The only exceptions to this so far are:

  • polaris~ requires a computer to create the scenes in Unity.
    • Cheaper wearable compute packs are currently being prototyped by CombineReality.
  • Project Esky is Windows-only, because there is no up-to-date macOS Ultraleap (hand-tracking) driver.
    • Ultraleap are working on this for their V5 release (some time in 2021), but as far as I know it would still require a Project Esky rewrite.
  • The firmware code for the head-tracker is not open-sourced by Intel, and the product range has just been discontinued. Whilst this firmware is not necessary for the project, it could have allowed further interaction.
    • Some very clever people are working on implementing open-source modular Luxonis cameras to replace the head-tracking sensor that the project currently relies on Intel for.
  • My audio AR solution, currently a set of wireless Aftershokz Aeropex bone conduction headphones, is pricey.
    • I am very interested in designing a bone-conduction transducer mounted to the headset itself, similar to how the Vive VR headset has speakers attached to its main body. Alternatively, cheaper bone-conduction headphones can be used.

Quickstart

Coming Soon

Setting up a new instrument/experience project

Deciding on your hands

How to use LibPdIntegration

Using LibPdIntegration with Automatonism

Inspiration and Similar Projects

  • Listening Mirrors: an audio AR interactive installation by my PhD supervisors
  • Laetitia Sonami: pioneer in early glove-based interactive music systems
  • Atau Tanaka: interactive gestural synthesis using muscle sensors
  • Keijiro Takahashi: specifically their work on audio-reactivity in Unity.
  • Tekh:2: their work on XR instruments using granular synthesis in Unity.

Acknowledgements

  • Noah Zerkin (CombineReality) for their help in understanding some of the specific workings of the North Star headset.
  • Damien Rompapas (BEERLabs / ThinkDigital) for explaining and debugging Project Esky with me.
  • Bryan Chris Brown (CombineReality) for moderating the very friendly Discord server and for their considerable explanations of the benefits of working with the North Star headset.
