
https://www.microchip.com/

Gesture Recognition with SensiML

Deployed gesture recognizer

Repository Overview

This repository is a companion to the Gesture Recognition with SensiML tutorial on the Microchip Developer website. It contains the firmware to classify a few different motion gestures on a SAMD21 Machine Learning Kit with the Bosch BMI160 IMU (Mikroe IMU2 click board) or with the TDK ICM42688 IMU (Mikroe IMU14 click board).

The supported gestures (shown in the video above) are:

  • Figure Eight
  • Up-down
  • Wave
  • Wheel

In addition, there is an 'unknown' class for gesture-like movement that does not match any target gesture, and an 'idle' class for low-motion activity.

Continuous Gestures Dataset

The gestures dataset was collected by Microchip employees and consists of two test subjects performing the continuous gestures described in the section above with a SAMD21 BMI160 evaluation board. The dataset can be downloaded from the releases page; it includes a collection of 10-second samples in CSV format (columns ax, ay, az, gx, gy, gz), split into training and test folds. CSV files are named according to the following template:

<class>-<participant-id>-<extra-metadata>-<collection-date-yymmdd>-<sample-number>.csv

A DCLI descriptor file is also included for each fold for easy import into SensiML's Data Capture Lab.
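
For reference, below is a minimal Python sketch of how a fold directory might be walked and labeled. It assumes pandas is installed, that the first filename field is the gesture class per the template above (the naive split assumes the class field itself contains no hyphen, which may need adjusting), and the "train" directory name is only an example.

    # Hypothetical helper for iterating over a dataset fold. Assumes the filename
    # template <class>-<participant-id>-<extra-metadata>-<date>-<sample>.csv and
    # CSV columns ax, ay, az, gx, gy, gz as described above.
    from collections import Counter
    from pathlib import Path

    import pandas as pd


    def load_samples(fold_dir):
        """Yield (class_label, DataFrame) pairs for every CSV sample in a fold."""
        for csv_path in sorted(Path(fold_dir).glob("*.csv")):
            class_label = csv_path.stem.split("-")[0]  # first field is the gesture class
            yield class_label, pd.read_csv(csv_path)


    # Example: count samples per class in a (hypothetical) "train" fold directory
    print(Counter(label for label, _ in load_samples("train")))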

In addition to the target gestures, some additional gestures (triangle, forward wheel, the letter V, and others) were collected to make up the unknown gestures class, which is used to help improve and validate the model's ability to discriminate between classes.

Furthermore, the idle class data consists of scenarios where the device is fully at rest in different orientations, as well as low-motion scenarios such as fidgeting with the board (manipulating it randomly with the fingers) and pacing around the room while holding it.

Hardware Used

  • SAMD21 Machine Learning Evaluation Kit with Bosch BMI160 IMU (EV45Y33A)
  • SAMD21 Machine Learning Evaluation Kit with TDK ICM42688 IMU (EV18H79A)

Software Used

Related Documentation

Firmware Operation

The firmware can be thought of as running in one of five states as reflected by the onboard LEDs and described in the table below:

| State | LED Behavior | Description |
| --- | --- | --- |
| Error | Red (ERROR) LED lit | Fatal error (do you have the correct sensor plugged in?). |
| Buffer Overflow | Yellow (DATA) and Red (ERROR) LEDs lit for 5 seconds | Processing cannot keep up with real time; the data buffer has been reset. |
| Idle | Yellow (DATA) LED lit | Board is in the idle state. |
| Unknown Gesture | No LEDs lit | Gesture-like motion detected, but not a known gesture. |
| Recognized Gesture | Single LED flashing according to gesture class | One of the known gestures has been detected. |

When a gesture is recognized by the firmware, one of the onboard LEDs will begin flashing quickly according to the gesture class; this mapping is shown in the image below.

Gesture class to LED mapping

The firmware also prints the classification output for each inference over the UART port. To read the UART port, use a terminal emulator of your choice (e.g., MPLAB Data Visualizer's integrated terminal tool) with the following settings; a pyserial-based alternative is sketched after the list:

  • Baudrate 115200
  • Data bits 8
  • Stop bits 1
  • Parity None
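
As a host-side alternative to a terminal emulator, the sketch below shows one way to read the stream with pyserial using these settings. The port name "COM4" is an assumption; substitute whatever your OS assigns.

    # Minimal pyserial sketch using the settings listed above (115200 baud,
    # 8 data bits, 1 stop bit, no parity).
    import serial  # pip install pyserial

    with serial.Serial(port="COM4", baudrate=115200,
                       bytesize=serial.EIGHTBITS,
                       parity=serial.PARITY_NONE,
                       stopbits=serial.STOPBITS_ONE,
                       timeout=1) as uart:
        while True:
            line = uart.readline().decode(errors="replace").strip()
            if line:
                print(line)  # one classification result per inference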

A sample of the terminal output is shown in the figure below.

UART Terminal Output

Note that the firmware's class ID mapping is as follows (a host-side lookup table restating these IDs is sketched after the list):

  • Unknown - 0 (input outside of modeled behavior)
  • Figure Eight - 1
  • Idle - 2
  • Unknown Gesture - 3 (unknown gesture-like behavior)
  • Up-down - 4
  • Wave - 5
  • Wheel - 6
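
When post-processing the UART or gateway output on the host, a small lookup table mirroring this mapping can be handy. The sketch below simply restates the IDs above; it is not part of the firmware itself.

    # Host-side lookup table restating the firmware class ID mapping above.
    CLASS_NAMES = {
        0: "Unknown",          # input outside of modeled behavior
        1: "Figure Eight",
        2: "Idle",
        3: "Unknown Gesture",  # unknown gesture-like behavior
        4: "Up-down",
        5: "Wave",
        6: "Wheel",
    }

    def class_name(class_id):
        return CLASS_NAMES.get(class_id, "unrecognized id {}".format(class_id))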

The UART output can also be visualized with the SensiML Open Gateway application:

  1. Open a terminal and change to the directory where you've checked out this repository.
  2. Clone the open-gateway repository and install the dependencies:

    git clone https://github.com/sensiml/open-gateway
    pip install -r open-gateway/requirements.txt

  3. Change the baudrate (BAUD_RATE variable) in open-gateway/config.py to 115200
  4. Change to the open-gateway directory and run the open-gateway application, passing in the fan demo's model.json description file:

    cd open-gateway
    python app.py -m firmware/knowledgepack/libsensiml/model.json

  5. Connect to the SAMD21 board in the gateway application:
    • Select the Recognition device mode.
    • Select Serial connection type, and enter the UART address (e.g. COM4) in the Device ID field.
    • Click Connect To Device.
  6. Switch to the Test Mode tab and click Start Stream.

Performing Gestures

Gestures should be performed in a way that feels natural, using a thumb and index finger grip around the SAMD21 board as shown in the image below. The top of the board should be facing away from the user, with the USB connector oriented towards the ground.

Thumb and index finger grip
Thumb and index finger grip

The supported gestures are listed below (described from the user's point of view):

  • Figure Eight - Move the board in a figure eight pattern, starting the gesture from the top of the eight and going left (counterclockwise) at a slow to moderate speed
  • Up-down - Move board up and down continuously at a moderate speed
  • Wave - Wave the board side to side at a moderate speed as if you were greeting someone
  • Wheel - Move the board in a clockwise circle (or wheel) continuously, at a moderate speed

Also see the GIF at the top of this document for further reference.

Firmware Benchmark

Measured with the BMI160 sensor configuration, -O2 compiler optimizations, and a 48 MHz clock:

  • 49.5 kB Flash
  • 5.1 kB RAM
  • 9 ms inference time

Classifier Performance

Below is the confusion matrix for the test dataset. Note that the classes are highly imbalanced so accuracy is not a good indicator of overall performance.

Test set confusion matrix
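
If you want to reproduce this kind of evaluation on the test fold yourself, the sketch below shows one way to compute a confusion matrix and a class-balanced score with scikit-learn; the y_true/y_pred values are placeholders, not results from this project.

    # Sketch of a class-imbalance-aware evaluation (assumes scikit-learn is installed).
    # y_true and y_pred are placeholder class IDs, not actual results from this model.
    from sklearn.metrics import balanced_accuracy_score, confusion_matrix

    y_true = [1, 1, 2, 5, 0, 6]  # placeholder ground-truth labels
    y_pred = [1, 4, 2, 5, 0, 6]  # placeholder model predictions

    print(confusion_matrix(y_true, y_pred))
    # Balanced accuracy averages per-class recall, so it is less skewed by imbalance.
    print(balanced_accuracy_score(y_true, y_pred))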

Sensor Configuration

Binary builds of the data logging firmware used in the data collection for this project can be downloaded from the releases page; see the ml-samd21-iot-imu-data-logger repository to build data logging firmware with different sensor configurations.

Sensor configuration values used in the data collection are summarized in the table below. Note that only accelerometer data was used in training the classifier.

| IMU Sensor | Axes | Sampling Rate | Accelerometer Range | Gyroscope Range |
| --- | --- | --- | --- | --- |
| Bosch BMI160 | Ax, Ay, Az, Gx, Gy, Gz | 100 Hz | 16 G | 2000 DPS |
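
For scripting against the dataset, the values above can be restated as a small constants block. This is purely a convenience that mirrors the table; it is not a configuration file consumed by the firmware or the data logger.

    # Data-collection configuration restated from the table above (convenience only).
    SENSOR_CONFIG = {
        "imu": "Bosch BMI160",
        "axes": ["ax", "ay", "az", "gx", "gy", "gz"],
        "sample_rate_hz": 100,
        "accel_range_g": 16,
        "gyro_range_dps": 2000,
    }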
