
jetbot_ros

ROS nodes and Gazebo model for NVIDIA JetBot with Jetson Nano

System Configuration

It is assumed that the Nano has been set up with JetPack 4.2 and that CUDA, cuDNN, and TensorRT have been installed.

Note: the process below will likely exceed the disk capacity of the default 16GB filesystem, so a larger SD card should be used. If using the Etcher method with the JetPack-L4T image, the APP partition will automatically be resized to fill the SD card upon first booting the system. Otherwise, flash with L4T using the -S option (example given for a 64GB SD card):

$ sudo ./flash.sh -S 58GiB jetson-nano-sd mmcblk0p1

Install ROS Melodic

# enable all Ubuntu packages:
$ sudo apt-add-repository universe
$ sudo apt-add-repository multiverse
$ sudo apt-add-repository restricted

# add ROS repository to apt sources
$ sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
$ sudo apt-key adv --keyserver hkp://ha.pool.sks-keyservers.net:80 --recv-key 0xB01FA116

# install ROS Base
$ sudo apt-get update
$ sudo apt-get install ros-melodic-ros-base

# add ROS paths to environment
$ echo "source /opt/ros/melodic/setup.bash" >> ~/.bashrc

Close and restart the terminal.

Install Adafruit Libraries

These Python libraries from Adafruit support the TB6612/PCA9685 motor drivers and the SSD1306 debug OLED:

# pip should be installed
$ sudo apt-get install python-pip

# install Adafruit libraries
$ pip install Adafruit-MotorHAT
$ pip install Adafruit-SSD1306

Grant your user access to the i2c bus:

$ sudo usermod -aG i2c $USER

Reboot the system for the changes to take effect.
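
After rebooting, you can optionally verify the install by spinning a motor directly from Python. This is a minimal sketch, assuming the motor driver sits at its default I2C address 0x60 on bus 1 and a motor is wired to channel 1 (M1); adjust for your wiring:

#!/usr/bin/env python
# smoke test for the Adafruit MotorHAT library
# assumptions: default address 0x60, I2C bus 1, motor on channel 1
import time
from Adafruit_MotorHAT import Adafruit_MotorHAT

driver = Adafruit_MotorHAT(addr=0x60, i2c_bus=1)
motor = driver.getMotor(1)            # channel 1 (M1)

motor.setSpeed(100)                   # speed range is 0-255
motor.run(Adafruit_MotorHAT.FORWARD)  # spin forward briefly
time.sleep(1.0)
motor.run(Adafruit_MotorHAT.RELEASE)  # stop and release the motor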

Create catkin workspace

Create a ROS Catkin workspace to contain our ROS packages:

# create the catkin workspace
$ mkdir -p ~/workspace/catkin_ws/src
$ cd ~/workspace/catkin_ws
$ catkin_make

# add catkin_ws path to bashrc
$ echo "source ~/workspace/catkin_ws/devel/setup.bash" >> ~/.bashrc

Note: out of personal preference, my catkin_ws is created as a subdirectory under ~/workspace

Close and open a new terminal window. Verify that your catkin_ws is visible to ROS:

$ echo $ROS_PACKAGE_PATH 
/home/nvidia/workspace/catkin_ws/src:/opt/ros/melodic/share

Build jetson-inference

Clone and build the jetson-inference repo:

# git and cmake should be installed
sudo apt-get install git cmake

# clone the repo and submodules
cd ~/workspace
git clone -b onnx https://github.com/dusty-nv/jetson-inference
cd jetson-inference
git submodule update --init

# build from source
mkdir build
cd build
cmake ../
make

# install libraries
sudo make install

Build ros_deep_learning

Clone and build the ros_deep_learning repo:

# install dependencies
sudo apt-get install ros-melodic-vision-msgs ros-melodic-image-transport ros-melodic-image-publisher

# clone the repo
cd ~/workspace/catkin_ws/src
git clone https://github.com/dusty-nv/ros_deep_learning

# make ros_deep_learning
cd ../    # cd ~/workspace/catkin_ws
catkin_make

# confirm that the package can be found
$ rospack find ros_deep_learning
/home/nvidia/workspace/catkin_ws/src/ros_deep_learning

Build jetbot_ros

Clone and build the jetbot_ros repo:

# clone the repo
$ cd ~/workspace/catkin_ws/src
$ git clone https://github.com/dusty-nv/jetbot_ros

# build the package
$ cd ../    # cd ~/workspace/catkin_ws
$ catkin_make

# confirm that jetbot_ros package can be found
$ rospack find jetbot_ros
/home/nvidia/workspace/catkin_ws/src/jetbot_ros

Testing JetBot

Next, let's check that the different components of the robot are working under ROS.

First, open a new terminal and start roscore:

$ roscore

Running the Motors

Open a new terminal, and start the jetbot_motors node:

$ rosrun jetbot_ros jetbot_motors.py

The jetbot_motors node will listen on the following topics:

  • /jetbot_motors/cmd_dir - relative heading (degree [-180.0, 180.0], speed [-1.0, 1.0])
  • /jetbot_motors/cmd_raw - raw L/R motor commands (speed [-1.0, 1.0], speed [-1.0, 1.0])
  • /jetbot_motors/cmd_str - simple string commands (left/right/forward/backward/stop)

Note: as of 2/22/19, only the cmd_str method is implemented. The other topics are coming soon.

Test Motor Commands

Open a new terminal, and run some test commands:

$ rostopic pub /jetbot_motors/cmd_str std_msgs/String --once "forward"
$ rostopic pub /jetbot_motors/cmd_str std_msgs/String --once "backward"
$ rostopic pub /jetbot_motors/cmd_str std_msgs/String --once "left"
$ rostopic pub /jetbot_motors/cmd_str std_msgs/String --once "right"
$ rostopic pub /jetbot_motors/cmd_str std_msgs/String --once "stop"

(it is recommended to initially test with the JetBot up on blocks, so that its wheels are not touching the ground)
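
The same string commands can also be published from a Python node. Below is a minimal rospy sketch (the node name and timing are illustrative; the topic and message type are those listed above):

#!/usr/bin/env python
# drive forward for two seconds, then stop
import rospy
from std_msgs.msg import String

rospy.init_node('jetbot_motor_test')
pub = rospy.Publisher('/jetbot_motors/cmd_str', String, queue_size=1)
rospy.sleep(1.0)                # give the subscriber time to connect

pub.publish(String("forward"))
rospy.sleep(2.0)
pub.publish(String("stop"))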

Using the Debug OLED

If you have an SSD1306 debug OLED on your JetBot, you can run the jetbot_oled node to display system information and user-defined text:

$ rosrun jetbot_ros jetbot_oled.py

By default, jetbot_oled will refresh the display every second with the latest memory usage, disk space, and IP addresses.

The node will also listen on the /jetbot_oled/user_text topic to receive string messages from the user, which it will display:

$ rostopic pub /jetbot_oled/user_text std_msgs/String --once "HELLO!"
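
To keep a custom line on the display, a node can publish to this topic periodically. The sketch below (with a hypothetical node name oled_status) pushes the hostname and current ROS time every two seconds:

#!/usr/bin/env python
# publish a status string to the OLED every two seconds
import socket
import rospy
from std_msgs.msg import String

rospy.init_node('oled_status')
pub = rospy.Publisher('/jetbot_oled/user_text', String, queue_size=1)
rate = rospy.Rate(0.5)          # 0.5 Hz = once every two seconds

while not rospy.is_shutdown():
    pub.publish(String('%s  t=%.0f' % (socket.gethostname(), rospy.get_time())))
    rate.sleep()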

Using the Camera

To begin streaming the JetBot camera, start the jetbot_camera node:

$ rosrun jetbot_ros jetbot_camera

The video frames will be published to the jetbot_camera/raw topic as sensor_msgs::Image messages with BGR8 encoding.
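
A Python node can subscribe to the stream and convert each frame to an OpenCV image with cv_bridge (install the ros-melodic-cv-bridge package if it is not already present). This is a minimal sketch that just logs the shape of each frame:

#!/usr/bin/env python
# log the shape of each incoming camera frame (requires cv_bridge)
import rospy
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()

def on_frame(msg):
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')   # numpy array, H x W x 3
    rospy.loginfo('received frame with shape %s' % (frame.shape,))

rospy.init_node('camera_listener')
rospy.Subscriber('/jetbot_camera/raw', Image, on_frame, queue_size=1)
rospy.spin()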

JetBot Model for Gazebo Robotics Simulator

See the gazebo directory of the repo for instructions on loading the JetBot simulator model for Gazebo.
