
llv's Introduction

LLV

LLV enables you to record and play back Live Link frames sent by Epic Games' ARKit face capture iOS app.

Issues and discussion

Check out the wiki for more information on creating issues.

Quick start

Check out this video on how to set up LLV for use with the MetaHuman example project.

Usage

Create or use recordings

Record

Listen for 256 incoming frames on all interfaces on the standard port 11111 and write the recording to a file named dao.gesichter.

python llv.py record --frames 256 --output dao.gesichter
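For reference, here is a minimal sketch of what such a recorder could look like, assuming the frames arrive as UDP datagrams on that port and that each received packet is stored as one base64-encoded line; llv's actual internals may differ.

# Minimal sketch of a recorder: receive N UDP datagrams and store each one
# as a base64-encoded line. Illustration only, not llv's actual code.
import base64
import socket

def record(frames=256, port=11111, output='dao.gesichter'):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(('', port))  # all interfaces
    with open(output, 'w') as out:
        for _ in range(frames):
            data, _addr = sock.recvfrom(2048)  # frames stay well under 2 KiB
            out.write(base64.b64encode(data).decode('ascii') + '\n')
    sock.close()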

Replay

Play back one of the example recordings, sending it to a host machine at 10.0.0.69 on the implicit standard port 11111 at 60 frames per second.

python llv.py play --host 10.0.0.69 examples/dao.gesichter
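The playback side can be pictured the same way: a minimal sketch, assuming each line of the recording is base64-decoded and sent as one UDP datagram, paced at the requested frame rate (again an illustration, not llv's implementation).

# Minimal sketch of playback: decode each line and send it as a UDP datagram,
# pacing the stream at the requested frames per second.
import base64
import socket
import time

def play(recording, host='10.0.0.69', port=11111, fps=60):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    with open(recording) as lines:
        for line in lines:
            sock.sendto(base64.b64decode(line.strip()), (host, port))
            time.sleep(1.0 / fps)
    sock.close()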

Inspecting or changing recordings

Recordings are stored as lines of base64-encoded frames. You can unpack a recording file to create a cleartext version, letting you inspect the frames as a JSON array. If you'd like to create your own frames by hand or by script, you can pack them for use with LLV.
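As a quick illustration of that storage format, a recording can also be peeked at by hand. This rough sketch only decodes each line and reports the raw frame size; the unpack command below is the supported way to get a readable JSON version.

# Rough sketch: decode each base64 line of a recording and report the raw
# frame sizes, without interpreting the frame contents.
import base64

with open('examples/dao.gesichter') as recording:
    for number, line in enumerate(recording, start=1):
        frame = base64.b64decode(line.strip())
        print(f'frame {number}: {len(frame)} bytes')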

Unpacking

python llv.py unpack examples/dao.gesichter dao.klare-gesichter

Packing

python llv.py pack dao.klare-gesichter dao.gesichter

Anatomy

Frame layout

The minimum and maximum packet sizes of a frame are defined in the engine code as:

//                                         PacketVersion                    FrameTime                     BlendShapeCount Blendshapes                                        SubjectName             DeviceID
const uint32 MAX_BLEND_SHAPE_PACKET_SIZE = sizeof(BLEND_SHAPE_PACKET_VER) + sizeof(FQualifiedFrameTime) + sizeof(uint8) + (sizeof(float) * (uint64)EARFaceBlendShape::MAX) + (sizeof(TCHAR) * 256) + (sizeof(TCHAR) * 256);
const uint32 MIN_BLEND_SHAPE_PACKET_SIZE = sizeof(BLEND_SHAPE_PACKET_VER) + sizeof(FQualifiedFrameTime) + sizeof(uint8) + (sizeof(float) * (uint64)EARFaceBlendShape::MAX) +  sizeof(TCHAR)        +  sizeof(TCHAR);

This results in a minimum frame size of 264 bytes and a maximum of 774 bytes.
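The arithmetic behind those two figures, with EARFaceBlendShape::MAX at 61 and the TCHAR strings counted at one byte per character (which is what the quoted sizes imply), works out as follows.

# Packet size arithmetic for the formula above (TCHAR counted as 1 byte,
# matching the 264/774 figures quoted here).
PACKET_VER   = 1        # sizeof(BLEND_SHAPE_PACKET_VER), a uint8
FRAME_TIME   = 16       # sizeof(FQualifiedFrameTime)
SHAPE_COUNT  = 1        # sizeof(uint8)
BLEND_SHAPES = 4 * 61   # sizeof(float) * EARFaceBlendShape::MAX

MIN_SIZE = PACKET_VER + FRAME_TIME + SHAPE_COUNT + BLEND_SHAPES + 1 + 1      # 264
MAX_SIZE = PACKET_VER + FRAME_TIME + SHAPE_COUNT + BLEND_SHAPES + 256 + 256  # 774
assert (MIN_SIZE, MAX_SIZE) == (264, 774)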

The layout is defined as:

  • PacketVersion -> 1 byte (uint8_t)
  • FrameTime -> 16 bytes (int32 + float + int32 + int32)
  • BlendShapeCount -> 1 byte (uint8_t)
  • List of blendshape values -> BlendShapeCount * 4 bytes (float)
  • Subject Name -> Name Length * 1 byte (char)
  • Device ID -> ID Length * 1 byte (char)

A maximum of 61 blendshapes is supported. See the Apple ARKit documentation for more information.
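Putting the layout together, a single frame could be decoded roughly as follows. This is a sketch under the assumptions that the fields appear in the order listed above and that the byte order is little-endian; how the trailing bytes split into subject name and device ID is not detailed here, so they are left as raw bytes.

# Rough sketch of decoding one frame according to the layout above.
# Byte order and the handling of the trailing name/ID bytes are assumptions.
import struct

def parse_frame(data: bytes) -> dict:
    offset = 0
    packet_version, = struct.unpack_from('<B', data, offset)
    offset += 1
    # FrameTime: frame number, sub frame, frame rate numerator and denominator.
    frame_number, sub_frame, fps_num, fps_den = struct.unpack_from('<ifii', data, offset)
    offset += 16
    blend_shape_count, = struct.unpack_from('<B', data, offset)
    offset += 1
    blend_shapes = struct.unpack_from('<%df' % blend_shape_count, data, offset)
    offset += 4 * blend_shape_count
    trailing = data[offset:]  # subject name and device ID
    return {
        'version': packet_version,
        'frame_time': (frame_number, sub_frame, fps_num, fps_den),
        'blend_shapes': list(blend_shapes),
        'subject_and_device': trailing,
    }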

llv's People

Contributors

blurryroots
