
gabeweiss / google-sorting-demo


Source code (Python, Node.js, and Java) for a demo we built, which has been shown at a number of conferences, including IoT Solutions World Congress in Barcelona, Google Cloud Next 2019, and Google I/O 2019. Using the Coral Dev Board, we show incredibly fast machine learning on the edge with minimal power consumption.

License: Apache License 2.0

HTML 9.38% CSS 14.32% JavaScript 49.61% Python 25.96% Shell 0.73%
automl coral machine-learning demo

google-sorting-demo's Introduction

This repo contains the source code for a demo we created to show off machine learning on the edge using the Coral Dev Board with the Edge TPU.

Demo in all its glory

This is a video of it in action at the workshop where we were working on it just prior to Google Cloud Next 2019.

The source code is split into folders representing the different links in the architecture that piece the whole thing together. The demo uses a webcam, connected via USB-C to the Coral board, which runs a Python script that pulls frames from the camera and passes them through to our models. The models' output is then sent to a secondary machine (we had them all networked over a LAN with a router) running a Node.js script. That script does a few things:

  1. Business logic for interpreting the model output and deciding which bucket the gear is dropped into.
  2. Drives the Arduino which controls the mechanicals of the demo.
  3. Sends telemetry data to a Cloud Firestore instance which is then picked up by the Dashboard.
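The first step above (the bucket decision) can be sketched as a small pure function. This is a hypothetical illustration only: the label names and confidence threshold are assumptions, not the actual values used by the repo's Node.js business logic in `server/server.js`.

```python
# Hypothetical sketch of the bucket-decision logic.
# Labels and threshold are assumed, not taken from server/server.js.
CONFIDENCE_THRESHOLD = 0.6  # assumed cutoff for trusting a prediction

def pick_bucket(label, confidence):
    """Map a model prediction to a sorting bucket."""
    if confidence < CONFIDENCE_THRESHOLD:
        return "reject"      # low confidence: don't trust the call
    if label == "good_gear":
        return "good"
    return "defective"       # any detected defect, e.g. a missing tooth
```

The actual demo drives an Arduino off this decision; the sketch only shows the classification-to-bucket mapping.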

The Coral board also forwards the video stream from the webcam to a Python process running on the secondary machine. The streaming Python script on that machine also saves off an image buffer of the last "captured" gear, with a bounding box marking the detected missing gear; both are available to be shown in the Dashboard.
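The repo's streaming script presumably uses an imaging library for the overlay; as a dependency-free sketch of the idea, here is a minimal function that draws a rectangle outline onto a 2D frame buffer (a list of pixel rows). The coordinates and intensity value are illustrative assumptions.

```python
def draw_box(frame, x0, y0, x1, y1, value=255):
    """Draw a rectangle outline in-place on a 2D frame (list of rows).

    Coordinates are inclusive pixel indices; `value` is the outline
    intensity. Interior pixels are left untouched, so the underlying
    image still shows through inside the box.
    """
    for x in range(x0, x1 + 1):
        frame[y0][x] = value  # top edge
        frame[y1][x] = value  # bottom edge
    for y in range(y0, y1 + 1):
        frame[y][x0] = value  # left edge
        frame[y][x1] = value  # right edge
    return frame

# Tiny 8x6 "frame" with a box around an assumed detection region.
frame = [[0] * 8 for _ in range(6)]
draw_box(frame, 2, 1, 5, 4)
```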

Architecture diagram

android/Sorting_Demo: This is the folder for the Android application used at Google I/O 2019 to show off the AutoML model compiled down to a mobile device, demonstrating that the same model can run accurately on multiple device form factors.

base_training_images: Holding folder for any images we take and need to transfer between machines for training our model. Model accuracy depends heavily on lighting conditions and the like.

dashboard: Front-end UI code for displaying the dashboard which is shown on the monitor attached to the demo.

coral: Code that lives and runs on the Edge TPU development board.

server: Code that lives and runs on the Windows Lenovo box. This is the Node.js code containing all the business logic for what to do with the outputs from our AutoML model; it also runs the mechanicals of the demo itself by way of the Arduino.

imgs: Contains footage and images of the demo in action.

Running the demo

There is an order dependency when starting the demo. Each of these scripts must run in its own terminal window, as the processes keep running.

  1. Streaming server pass-through for dashboard feed
    1. server/stream_video.py
  2. Node server receiving telemetry from Coral board
    1. Set the GOOGLE_PROJECT_ID environment variable to your GCP project containing the Firestore instance (if you don't want any telemetry data, you can remove much of the code around that piece and skip the environment variable in this step)
    2. server/server.js
  3. Python Coral SDK script to do the actual inference on the Coral board
    1. coral/recognize.py (this has hardcoded model paths, which you can change; alternatively, use the script flags to specify your own models, as shown in coral/run_python.sh)
  4. Node server which handles serving up the Dashboard JavaScript page
    1. dashboard/server.js (this is only necessary if you want to be running the dashboard)
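The start order above can be captured in a small launcher sketch. This is hypothetical convenience code, not part of the repo: the script paths match the folder layout described earlier, but in the real demo steps 1, 2, and 4 run on the secondary machine while step 3 runs on the Coral board, so a single launcher like this only applies if everything runs on one host. The `runner` parameter is injectable so the ordering can be exercised without spawning real processes.

```python
import subprocess

# Hypothetical launcher capturing the documented start order.
# Environment setup (e.g. GOOGLE_PROJECT_ID for step 2) is omitted.
COMMANDS = [
    ["python", "server/stream_video.py"],  # 1. streaming pass-through
    ["node", "server/server.js"],          # 2. telemetry server
    ["python", "coral/recognize.py"],      # 3. inference on the Coral board
    ["node", "dashboard/server.js"],       # 4. dashboard (optional)
]

def launch_all(commands, runner=subprocess.Popen):
    """Start each process in the listed order; `runner` is injectable."""
    return [runner(cmd) for cmd in commands]
```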

google-sorting-demo's People

Contributors

dependabot[bot], dizcology, gabeweiss, hongalex, nnegrey

