
robot's Introduction

robot

This code runs a simple robot with a webserver on a Raspberry Pi with the Adafruit Motor HAT. I wrote this up for myself for fun and to help me remember how I set things up.

A high-level overview can be found in this article: https://www.oreilly.com/learning/how-to-build-a-robot-that-sees-with-100-and-tensorflow

Hardware

To get started, you should be able to make the robot work without the arm, the sonar, and the servo HAT.

Programs

  • robot.py runs motor commands from the command line
  • sonar.py tests sonar wired into GPIO ports
  • wheels.py tests simple DC motor wheels
  • arm.py tests a servo-controlled robot arm
  • autonomous.py implements a simple driving algorithm using the wheels and sonar (see the sketch after this list)
  • inception_server.py runs an image classifying microservice
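
As a rough illustration of the driving logic in autonomous.py (this is a sketch, not the repo's actual code), the idea is to keep moving forward and turn toward whichever side the sonar says has more room. The threshold and function below are made up for the example:

# autonomy_sketch.py - minimal sketch of obstacle-avoidance logic (illustrative only)

SAFE_DISTANCE_CM = 30  # assumed threshold; tune for your robot

def choose_action(left_cm, center_cm, right_cm):
    """Pick a drive command from three sonar distances (in cm)."""
    if center_cm < SAFE_DISTANCE_CM:
        # something is close ahead: turn toward the side with more room
        return "left" if left_cm > right_cm else "right"
    return "forward"

if __name__ == "__main__":
    print(choose_action(80, 120, 60))  # clear ahead -> forward
    print(choose_action(80, 20, 60))   # blocked ahead, more room on the left -> left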

Example Robots

Here are two robots I made that use this software:

[Photos of the two example robots]

Wiring The Robot

Sonar

If you want to use the default sonar configuration, wire it like this:

  • Left sonar: trigger on GPIO pin 23, echo on GPIO pin 24
  • Center sonar: trigger on GPIO pin 17, echo on GPIO pin 18
  • Right sonar: trigger on GPIO pin 22, echo on GPIO pin 27

You can modify the pins by making a robot.conf file (see the sketch below).
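
configure.py reads robot.conf at startup, and the file is not checked into the repo, so you create it yourself. Here is a minimal sketch, assuming the file is JSON; the key names below are illustrative only, so check configure.py for the keys it actually expects:

{
  "left":   {"trigger": 23, "echo": 24},
  "center": {"trigger": 17, "echo": 18},
  "right":  {"trigger": 22, "echo": 27}
}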

Wheels

You can easily change this, but this is what wheels.py expects:

  • M1 - Front Left
  • M2 - Back Left (optional - leave unwired for 2wd chassis)
  • M3 - Back Right (optional - leave unwired for 2wd chassis)
  • M4 - Front Right
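
wheels.py drives these motors through the Adafruit Motor HAT Python library. As a quick sanity check outside the repo's own scripts, a minimal sketch like the following (assuming the HAT sits at its default I2C address 0x60) spins the front-left motor for one second:

# wheel_sanity_sketch.py - spin M1 briefly using the legacy Adafruit Motor HAT library
import atexit
import time
from Adafruit_MotorHAT import Adafruit_MotorHAT

hat = Adafruit_MotorHAT(addr=0x60)  # default I2C address for the Motor HAT

def release_all():
    # stop every motor when the script exits so nothing keeps spinning
    for port in range(1, 5):
        hat.getMotor(port).run(Adafruit_MotorHAT.RELEASE)

atexit.register(release_all)

front_left = hat.getMotor(1)  # M1 in the wiring list above
front_left.setSpeed(150)      # speed range is 0-255
front_left.run(Adafruit_MotorHAT.FORWARD)
time.sleep(1)
front_left.run(Adafruit_MotorHAT.RELEASE)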

Installation

basic setup

There are a ton of articles on how to do basic setup of a Raspberry Pi; one good one is here: https://www.howtoforge.com/tutorial/howto-install-raspbian-on-raspberry-pi/

You will need to turn on I2C and, optionally, the camera:

raspi-config
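
If you would rather script this than use the menu, recent versions of raspi-config also have a non-interactive mode (0 means enable; the option names can vary between OS releases, so treat these as a starting point):

sudo raspi-config nonint do_i2c 0
sudo raspi-config nonint do_camera 0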

Next you will need to install i2c-tools and smbus:

sudo apt-get install i2c-tools python-smbus python3-smbus

Test that your HAT is attached and visible with:

i2cdetect -y 1

Install this code

sudo apt-get install git
git clone https://github.com/lukas/robot.git
cd robot

Install dependencies

pip install -r requirements.txt

At this point you should be able to drive your robot locally; try:

./robot.py forward
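
robot.py takes the movement as its first argument; forward is the one shown above, and the other directions follow the same pattern. The exact command names live in robot.py, so the ones below are assumptions to check against the script:

./robot.py backward
./robot.py left
./robot.py right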

server

To run a webserver in the background with a camera, you need to set up gunicorn and nginx.

nginx

Nginx is a lightweight, fast reverse proxy; we store the camera image in RAM and serve it up directly. This was the only way I was able to get any kind of decent fps from the Raspberry Pi camera. We also need to proxy to gunicorn so that the user can control the robot from a webpage.

copy the configuration file from nginx/nginx.conf to /etc/nginx/nginx.conf

sudo apt-get install nginx
sudo cp nginx/nginx.conf /etc/nginx/nginx.conf

restart nginx

sudo nginx -s reload
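
The repo's nginx/nginx.conf is the file to use; conceptually it only needs to do the two things described above. A rough sketch of those two pieces (the paths and the gunicorn port here are assumptions, not copied from the repo):

location = /cam.jpg {
    # serve the latest camera frame straight from RAM
    alias /dev/shm/mjpeg/cam.jpg;
}

location / {
    # everything else goes to the gunicorn web app
    proxy_pass http://127.0.0.1:8000;
}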

gunicorn

install gunicorn (for example, with pip install gunicorn)

copy the configuration file from services/web.service to /etc/systemd/system/web.service

sudo cp services/web.service /etc/systemd/system/web.service
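
services/web.service in the repo is the unit file to use. For orientation, a gunicorn systemd unit generally looks something like this sketch; the user, paths, bind address, and app module below are assumptions, not the repo's actual values:

[Unit]
Description=robot web app (gunicorn)
After=network.target

[Service]
User=pi
WorkingDirectory=/home/pi/robot
ExecStart=/usr/local/bin/gunicorn --bind 127.0.0.1:8000 app:app
Restart=always

[Install]
WantedBy=multi-user.target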

start gunicorn web app service

sudo systemctl daemon-reload
sudo systemctl enable web
sudo systemctl start web

Your webservice should be started now. You can try driving your robot with the buttons or the arrow keys.

camera

In order to stream from the camera you can use RPi_Cam_Web_Interface. It's documented at http://elinux.org/RPi-Cam-Web-Interface but you can also just run the following:

git clone https://github.com/silvanmelchior/RPi_Cam_Web_Interface.git
cd RPi_Cam_Web_Interface
chmod u+x *.sh
./install.sh

Now a stream of images from the camera should be constantly updating the file at /dev/shm/mjpeg. Nginx will serve up the image directly if you request localhost/cam.jpg.
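
To sanity-check the stream without a browser, a small script can poll cam.jpg and report whether the bytes are changing. This is just a sketch; it assumes nginx is serving the image at localhost/cam.jpg as described above, and it is written for Python 3 (use urllib2 on the Python 2 setup described here):

# cam_check_sketch.py - rough check that the camera image keeps refreshing
import time
import urllib.request

URL = "http://localhost/cam.jpg"

last = None
for _ in range(10):
    frame = urllib.request.urlopen(URL).read()
    changed = last is not None and frame != last
    print("got %d bytes, changed since last fetch: %s" % (len(frame), changed))
    last = frame
    time.sleep(0.5)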

tensorflow

There is a great project at https://github.com/samjabrahams/tensorflow-on-raspberry-pi that gives instructions on installing tensorflow on the Raspberry Pi. Recently it's gotten much easier; just do:

wget https://github.com/samjabrahams/tensorflow-on-raspberry-pi/releases/download/v0.11.0/tensorflow-0.11.0-cp27-none-linux_armv7l.whl
sudo pip install tensorflow-0.11.0-cp27-none-linux_armv7l.whl

Next, start a tensorflow service that loads up an Inception model and does object recognition with it:

sudo cp services/inception.service /etc/systemd/system/inception.service
sudo systemctl daemon-reload
sudo systemctl enable inception
sudo systemctl start inception
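
inception_server.py wraps this kind of classification in a microservice. For orientation only, here is a sketch of the classic Inception classify_image flow from that era of tensorflow (the graph path and image path are assumptions, and this is not the repo's exact code):

# classify_sketch.py - classic Inception classification flow (tensorflow 0.x/1.x style)
import tensorflow as tf

GRAPH_PATH = "classify_image_graph_def.pb"  # assumed location of the Inception graph

def classify(image_path):
    # load the frozen Inception graph into the default graph
    with tf.gfile.FastGFile(GRAPH_PATH, 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
        tf.import_graph_def(graph_def, name='')
    # read the raw JPEG bytes; the graph decodes them itself
    with tf.gfile.FastGFile(image_path, 'rb') as f:
        image_data = f.read()
    with tf.Session() as sess:
        softmax = sess.graph.get_tensor_by_name('softmax:0')
        predictions = sess.run(softmax, {'DecodeJpeg/contents:0': image_data})
    return predictions[0]

if __name__ == '__main__':
    scores = classify('/dev/shm/mjpeg/cam.jpg')  # assumed path to the latest camera frame
    print("top score: %f" % scores.max())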

robot's Issues

There is a problem with video streaming to the webserver.

I'm able to open cam.jpg independently in the browser (http://localhost/cam.jpg), but the webserver is not able to get the image (it returns an HTTP 404 error). Controlling the motors from the webserver works.
I have installed RPi_Cam_Web_Interface with the default port (left it empty) and with the nginx option (instead of apache).

motor hat blocks GPIO pins

Hi, I have a problem interfacing a sonar sensor with a Raspberry Pi 3 Model B: the Motor HAT blocks all the GPIO pins. What is the best way to connect the sonar sensor?

regards,
Ganesh

Following your tutorial

I spent $1,100 (I got a GTX 1070 with 8 GB) based on your tutorial here: https://www.linuxdevcenter.com/learning/build-a-super-fast-deep-learning-machine-for-under-1000

I'm not doing a robot, just a straight feed via nginx and gunicorn from a Pi.

I can't seem to get YOLO and Darknet working together no matter what I try, including your directions here. Darknet doesn't seem to like the image sequence the Pi is producing at http://192.168.1.xx/cam.jpg (which is constantly being overwritten) when I use this command:

./darknet detector test cfg/coco.data cfg/yolo.cfg yolo.weights http://192.168.1.xx/cam.jpg

Do you have any ideas?

fast inference?

Hi @lukas, thank you for writing the blog post; I'm using it to build a robot. I noticed in your YouTube video that it takes about 10 seconds to do inference. I didn't compile tensorflow by hand; I used the pip wheel from https://github.com/samjabrahams/tensorflow-on-raspberry-pi and ran your classify_image.py script, and it takes 50 seconds. I'm on a Raspberry Pi 3B. Did you do anything special to get Inception to run so fast?

Flite software package and tensorflow

Hi Lukas,
Could you let me know the details of how the Flite software is linked with tensorflow? I have followed your directions and at this point I can see what my camera is seeing through localhost/cam.jpg. However, I did not see directions in the article on how to install Flite so that it can speak the objects that tensorflow is outputting. If you can provide directions on the installation and how the two programs are linked, that would be great.
Also, thank you so much for sharing this work and the article.

Yours sincerely,
John Varkey

How to power and connect HC-SR04 sonar sensors

@lukas ,
I was trying to connect 3 HC-SR04 sensors the same way you did and am wondering how to wire them.
In your instructions you mentioned which GPIO ports we can use, but how did you power them: did you use the Raspberry Pi's on-board 5V for all 3 sensors, or did you power them with an external battery pack?

Any insight and information is much appreciated.

Regards
Nagen

"robot.conf" missing

Could you please upload the "robot.conf" file?

File "/home/pi/Downloads/robot/configure.py", line 5, in
with open('robot.conf') as data_file:
IOError: [Errno 2] No such file or directory: 'robot.conf'

Correct way of connecting the HC-SR04 sensors?

Hi,
I'm curious as to how people have connected the HC-SR04 sensors to their Raspberry Pis. I assume that you cannot connect them directly to the Pi's GPIOs through the Adafruit Motor HAT. So is a breadboard necessary? And are there any more details on how to go about this?

Thanks,
Patrick.
