
pyomyo's Introduction

PyoMyo

Python module for the Thalmic Labs Myo armband.

Cross-platform, multithreaded, and works without the Myo SDK.

pip install pyomyo

Documentation is in the Wiki; see Getting Started.

Playing breakout with sEMG

PyoMyo Documentation

Home
Getting started
Common Problems
Myo Placement

The big picture

Why should you care?
Basics of EMG Design

Links to other resources

Python Open-source Myo library

This library was made from a fork of the MIT licensed dzhu/myo-raw. Bug fixes from Alvipe/myo-raw were also added to stop crashes and add essential features.

The code was then updated to Python 3, multithreading support was added, and further bug fixes and features followed, including support for all three EMG modes the Myo can use.

Note that sEMG data, the same kind gathered by the Myo, is thought to be uniquely identifiable. Do not share this data without careful consideration of the future implications.

Also note that the Myo is outdated hardware. Over the last year I have noticed a steady rise in the cost of second-hand Myos. Both of my Myos were bought for under £100, and I do not recommend spending more than that to acquire one. Instead of buying one, consider joining the Discord to help create an open hardware alternative!

Included Example Code

The examples sub-folder contains some different ways of using the pyomyo library.

git clone https://github.com/PerlinWarp/pyomyo

plot_emgs_mat.py

Left to Right Wrist movements.

Starts the Myo in mode 0x01, which provides data that has already been preprocessed (bandpass filtered and rectified).
This data is then plotted in Matplotlib and is a good first step to see how the Myo works.
Sliding your finger under each sensor on the Myo will help identify which plot corresponds to which sensor.
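
For orientation, here is a minimal sketch of the same idea (not the shipped plot_emgs_mat.py): buffer preprocessed readings with an EMG handler, then plot one subplot per channel. It assumes the handler-based API used throughout pyomyo (connect, add_emg_handler, run, disconnect); the plotting details are illustrative.

import matplotlib.pyplot as plt
from pyomyo import Myo, emg_mode

buffer = []  # each entry is one 8-channel EMG reading

m = Myo(mode=emg_mode.PREPROCESSED)
m.connect()
m.add_emg_handler(lambda emg, movement: buffer.append(emg))

try:
    while True:
        m.run()  # pump Bluetooth packets; the handler fires per reading
except KeyboardInterrupt:
    m.disconnect()

fig, axes = plt.subplots(8, 1, sharex=True)
for ch, ax in enumerate(axes):
    ax.plot([sample[ch] for sample in buffer])
    ax.set_ylabel(f"ch {ch}")
plt.show()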

dino_jump.py

Chrome Dinosaur Game

An example showing how to use the live classifier built into pyomyo; see Getting Started for more info.

myo_multithreading_examp.py

Devs start here.
This file shows how to use the library and get Myo data in a separate thread.
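
A condensed sketch of the pattern used there: run the Myo driver in its own process and push readings through a multiprocessing queue. The worker name and LED colours are illustrative.

import multiprocessing
from pyomyo import Myo, emg_mode

def worker(q):
    # Runs in its own process: owns the Myo connection and pushes EMG into q.
    m = Myo(mode=emg_mode.RAW)
    m.connect()
    m.add_emg_handler(lambda emg, movement: q.put(emg))
    m.set_leds([0, 128, 0], [0, 128, 0])  # green LEDs to show we connected
    m.vibrate(1)
    while True:
        m.run()

if __name__ == "__main__":
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=worker, args=(q,), daemon=True)
    p.start()
    try:
        while True:
            emg = q.get()  # blocks until the worker produces a reading
            print(emg)
    except KeyboardInterrupt:
        p.terminate()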

Myo Modes Explained

To communicate with the Myo, I used dzhu's myo-raw, then added some functions from Alvipe to allow changing the Myo's LEDs.

emg_mode.PREPROCESSED (0x01)
By default, myo-raw sends 50 Hz data that has been rectified and filtered, using a hidden 0x01 mode.

emg_mode.FILTERED (0x02)
Alvipe added the ability to also get filtered non-rectified sEMG (thanks Alvipe).

emg_mode.RAW (0x03)
I then added the ability to get true raw, unfiltered data at 200 Hz. This data is unrectified and ranges from -128 to 127.

Sample data and a comparison between data captured in these modes can be found in MyoEMGPreprocessing.ipynb
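
As a rough illustration of the difference between the modes, RAW (0x03) data can be reduced to something resembling the PREPROCESSED (0x01) stream by rectifying, smoothing and downsampling. The filter below is a guess at the idea, not the exact filter Thalmic used; see the notebook for real comparisons.

import numpy as np
from scipy.signal import butter, filtfilt

def preprocess_raw(raw, fs=200):
    # raw: (n_samples, 8) array from emg_mode.RAW, values in [-128, 127]
    rectified = np.abs(raw.astype(float))
    b, a = butter(4, 20 / (fs / 2), btype="low")  # assumed 20 Hz smoothing cutoff
    smoothed = filtfilt(b, a, rectified, axis=0)
    return smoothed[::4]                          # 200 Hz -> 50 Hz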

The library

pyomyo.py

Prints sEMG readings at 200 Hz straight from the Myo's ADC using the raw EMG mode.
Each EMG reading is between -128 and 127; it is the most "raw" data the Myo can provide, but it is unlikely to be useful without extra processing. This file also implements the Myo driver, which sends serial commands to the dongle that are forwarded over Bluetooth to interact with the Myo.

Classifier.py

Implements a live classifier using the k-nearest neighbors algorithm.
Press a number from 0-9 to label incoming data as the class represented by the number.
Press e to delete all the data you have gathered.
Once two classes have been made, new data is automatically classified. Labelled data is stored as numpy arrays in the data\ directory.
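
For readers who want to experiment offline, here is an illustrative k-NN over labelled EMG data using scikit-learn. The file names and array shapes are assumptions and may not match the exact format Classifier.py writes into data\.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

X, y = [], []
for label in range(2):                            # e.g. two gesture classes
    samples = np.load(f"data/class_{label}.npy")  # hypothetical file name
    X.append(samples)
    y.append(np.full(len(samples), label))

clf = KNeighborsClassifier(n_neighbors=5)
clf.fit(np.vstack(X), np.concatenate(y))

new_reading = np.zeros((1, 8))                    # one 8-channel EMG sample
print(clf.predict(new_reading))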

pyomyo's People

Contributors

bit4qu4, perlinwarp


pyomyo's Issues

Add bleak backend to allow people without dongles to use pyomyo

The BLED112 dongle allows the Myo to work on any operating system because it acts as a COM port, forwarding serial commands sent to it on to the Myo over Bluetooth.

One reason this is worth doing is the poor state of cross-platform BLE support; however, bleak (Bluetooth Low Energy platform Agnostic Klient) may solve this and allow people without a dongle to use pyomyo.

This feature would also act as a stepping stone for makers of open source BLE EMGs to add support for their hardware to pyomyo.

Thankfully, Thalmic Labs released (most of) the specification here.
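
A rough sketch of what a bleak backend could look like: scan for the Myo, connect without a BLED112 dongle, and subscribe to an EMG characteristic. The UUID below is a placeholder; the real service and characteristic UUIDs are in the released specification.

import asyncio
from bleak import BleakScanner, BleakClient

EMG_CHAR_UUID = "00000000-0000-0000-0000-000000000000"  # placeholder UUID

def on_emg(sender, data: bytearray):
    print(list(data))  # raw bytes of one EMG notification

async def main():
    devices = await BleakScanner.discover()
    myo = next(d for d in devices if d.name and "Myo" in d.name)
    async with BleakClient(myo.address) as client:
        await client.start_notify(EMG_CHAR_UUID, on_emg)
        await asyncio.sleep(10)  # stream for ten seconds

asyncio.run(main())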

IMU and EMG data

Hi,
Is there a way to collect both IMU and EMG data at the same time?
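
A minimal sketch of one answer, assuming pyomyo's driver lets you register an IMU handler alongside the EMG handler (check pyomyo.py in your version for add_imu_handler):

from pyomyo import Myo, emg_mode

m = Myo(mode=emg_mode.PREPROCESSED)
m.connect()

m.add_emg_handler(lambda emg, movement: print("EMG:", emg))
m.add_imu_handler(lambda quat, acc, gyro: print("IMU:", quat, acc, gyro))

while True:
    m.run()  # both handlers fire as their packets arrive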

Serial Read leaves program hanging on Raspberry Pi

Hi,
I am running a project that uses the Myo armband to control several servos. I have noticed that sometimes the serial read on line 124 of pyomyo.py does not complete or time out, and therefore the program hangs.

Perhaps setting a timeout on the Serial may help avoid this?
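
For illustration, pyserial's read timeout (the timeout constructor argument) makes read() return an empty bytes object instead of blocking forever; the port name and baud rate below are just examples.

import serial

ser = serial.Serial("/dev/ttyACM0", 9600, timeout=1)  # example port and baud rate
data = ser.read()  # returns b"" after 1 second if nothing arrived
if not data:
    print("No byte received; retry or reconnect")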

Provide an implementation of common metrics to help with classification and regression

Feature engineering and preprocessing of EMG data will likely be needed for any significantly difficult project using pyomyo. If so, it would be nice to contribute these implementations back so that efficiency improvements and bug fixes can be shared.
It would be nice to include a metrics.py file containing the implementations and a plot_metrics.py file that gives some intuition on what each feature captures.

Some examples of metrics to include are Mean Absolute Value (MAV), Root Mean Square (RMS), Willison Amplitude (WAMP), Waveform Length (WL) and Zero Crossings (ZC). More info on the different metrics can be found in this paper.
My hope was to implement them and create a notebook showing their feature importances for different tasks and models, but I have not had the time so far.
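
A sketch of what such a metrics.py could contain, implemented with numpy over a single window of one channel; the thresholds are illustrative defaults.

import numpy as np

def mav(x):  # Mean Absolute Value
    return np.mean(np.abs(x))

def rms(x):  # Root Mean Square
    return np.sqrt(np.mean(np.square(x)))

def wl(x):   # Waveform Length
    return np.sum(np.abs(np.diff(x)))

def wamp(x, threshold=10):  # Willison Amplitude
    return np.sum(np.abs(np.diff(x)) > threshold)

def zc(x, threshold=0):     # Zero Crossings
    return np.sum((x[:-1] * x[1:] < 0) & (np.abs(np.diff(x)) > threshold))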

Stuck after run for over 3 minutes

Hi guys, has anyone encountered this: after running the self.run method for a few minutes, no data comes out any more. I have done some debugging and found that it is the line c = self.ser.read() in the recv_packet method of the BT class that gets stuck.
Any ideas? Does the device disconnect after a certain time?

def recv_packet(self):
    n = self.ser.inWaiting()  # bytes waiting in the input buffer (Windows fix)

    while True:
        c = self.ser.read()
        if not c:
            return None

        ret = self.proc_byte(ord(c))
        if ret:
            if ret.typ == 0x80:
                self.handle_event(ret)
                # Windows fix: flush the input buffer if it has backed up
                if n >= 5096:
                    print("Clearing", n)
                    self.ser.flushInput()
                # End of Windows fix
            return ret

Position Calibration tools to minimise cross session variance.

Sensor positioning is hard, and solving it is part of solving cross-session generalisation. Although there are fancy techniques that can be applied after the data is gathered, it would be worth having a discussion on the simple things. Here are some of my rough thoughts:

Calibration Methods

Once a placement is chosen, I would originally sharpie an outline of the whole Myo onto my forearm. It's important to mark the position of each sensor pod, not just one. The Myo can rotate around the arm, slide up and down it, or tilt.

One simple way I calibrate is by placing the Myo on the proximal part of my right forearm, waving my wrist to the right, and then making sure this movement peaks only one channel, e.g. channel 3.
If the movement peaks two channels, I know the relevant muscle sits in between two EMG sensors, so I rotate the Myo slightly to minimise crosstalk and maximise the signal picked up by channel 3 alone.

With this placement, a kNN classifier can be trained on the relevant gestures, e.g. flexion of each of the five fingers. Then, in the next session, after manual calibration is attempted, the placement can be fine-tuned until the kNN classifier works again.

Search for something better, or at least more scientific

Some tools could be made that tell the user how to rotate the Myo and show how similar the current readings are to previously gathered data.
Adding live PCA plotting to the live classifiers, with a printout of between/within group variance, may help quickly try out different positions for different gestures, as sketched below.
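
An illustrative sketch of the PCA idea: project labelled EMG feature windows to 2D and report a crude between/within group variance ratio for the current placement. All names here are hypothetical.

import numpy as np
from sklearn.decomposition import PCA

def placement_score(X, y):
    # X: (n_windows, n_features) EMG features, y: gesture labels per window
    y = np.asarray(y)
    pts = PCA(n_components=2).fit_transform(X)
    grand_mean = pts.mean(axis=0)
    between = within = 0.0
    for label in np.unique(y):
        group = pts[y == label]
        between += len(group) * np.sum((group.mean(axis=0) - grand_mean) ** 2)
        within += np.sum((group - group.mean(axis=0)) ** 2)
    return between / within  # higher = gestures better separated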
