HURBA: Advanced navigation tutorial IV.

Dependencies:

Project build instructions:

  1. Clone this repo inside the src folder of a catkin workspace: git clone https://github.com/hungarianrobot/Project-4-Advanced-Navigation
  2. Build workspace: catkin_make
  3. Source environment: source devel/setup.bash
  4. We'll use a simulated IMU and the robot_pose_ekf package for sensor fusion. robot_pose_ekf will provide the transformation between the odom frame and base_link, so we have to disable the default transform published by Gazebo! Edit /opt/ros/melodic/share/gazebo_ros/launch/empty_world.launch:
<!-- start gazebo server-->
  <node name="gazebo" pkg="gazebo_ros" type="gzserver" respawn="false" output="screen" 
    args="$(arg command_arg1) $(arg command_arg2) $(arg command_arg3) $(arg world_name)">    
  <remap from="tf" to="gazebo_tf"/> 
</node>

Note: we can use the roswtf tool at any time to debug tf issues! E.g. this is the output of roswtf when both Gazebo and robot_pose_ekf publish the base_footprint transform:

Found 2 error(s).

ERROR TF re-parenting contention:
 * reparenting of [base_footprint] to [odom_combined] by [/robot_pose_ekf]
 * reparenting of [base_footprint] to [odom] by [/gazebo]

ERROR TF multiple authority contention:
 * node [/robot_pose_ekf] publishing transform [base_footprint] with parent [odom_combined] already published by node [/gazebo]
 * node [/gazebo] publishing transform [base_footprint] with parent [odom] already published by node [/robot_pose_ekf]
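
After applying the remap, only robot_pose_ekf should publish this transform. Below is a minimal, hypothetical C++ check (not part of this project) that looks up the fused odom_combined -> base_footprint transform, assuming robot_pose_ekf's default frame names:

#include <ros/ros.h>
#include <tf/transform_listener.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "check_ekf_tf");
  ros::NodeHandle nh;

  tf::TransformListener listener;
  ros::Rate rate(1.0);
  while (nh.ok())
  {
    tf::StampedTransform transform;
    try
    {
      // robot_pose_ekf publishes odom_combined -> base_footprint by default
      listener.lookupTransform("odom_combined", "base_footprint", ros::Time(0), transform);
      ROS_INFO("odom_combined -> base_footprint: x=%.2f y=%.2f",
               transform.getOrigin().x(), transform.getOrigin().y());
    }
    catch (tf::TransformException& ex)
    {
      ROS_WARN("%s", ex.what());
    }
    rate.sleep();
  }
  return 0;
}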

Test the simulation

  1. Start the Gazebo simulation: roslaunch hurba_advanced_navigation bringup.launch
  2. Start the teleop package: roslaunch hurba_advanced_navigation teleop.launch
  3. Drive the omnidirectional robot inside the simulated environment.

(Figure: the omnidirectional robot in the simulated Gazebo environment)

Navigation launch files

There are 3 different navigation configurations with 3 different launch files in the project. The difference among these configurations is the plugin used for local and global planning.

navigation_basic.launch
navigation_advanced.launch
navigation_lattice.launch

The lattice global planner is an advanced planner for robots that handles non-circular footprints and nonholonomic constraints using motion primitives. Motion primitives are short, kinematically feasible motions which form the basis of the movements that the robot platform can perform. Search-based planners can generate paths from start to goal configurations by combining a series of these motion primitives. The result is a smooth, kinematically feasible path for the robot to follow. At the same time it's quite resource-intensive, so I don't suggest using it in an embedded environment. For more details, you can visit the following link: https://wiki.ros.org/Events/CoTeSys-ROS-School?action=AttachFile&do=get&target=robschooltutorial_oct10.pdf

Launch the navigation with any of the launch files above:

  1. Start the Gazebo simulation: roslaunch hurba_advanced_navigation bringup.launch
  2. Start the navigation package: roslaunch hurba_advanced_navigation navigation_XXX.launch
  3. Send a navigation goal and move_base will drive the robot to the desired location.

Waypoint navigation

To send waypoints to the navigation stack we'll use RViz's 2D Pose Estimate button. Normally, this button sends an initial position estimate to AMCL, so we remapped its functionality in the bringup.launch file in the following places:

<node name="amcl" pkg="amcl" type="amcl" output="screen">
    <remap from="initialpose" to="initialpose_amcl"/>
...
<node pkg="follow_waypoints" type="follow_waypoints" name="follow_waypoints" output="screen">
    <param name="goal_frame_id" value="map"/>
    <remap from="initialpose" to="waypoint" />
</node>
<node name="rviz" pkg="rviz" type="rviz" respawn="false" args="-d $(find hurba_advanced_navigation)/rviz/navigation.rviz">
        <remap from="initialpose" to="waypoint" />
</node>

After remapping these topics, new waypoints can be added with the 2D Pose Estimate button, and small blue arrows will show the waypoints in RViz:

(Figure: waypoints shown as small arrows in RViz)

When the waypoints are set, their execution can be started with the following command: rostopic pub /path_ready std_msgs/Empty -1
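
The same trigger can also be published from code. Below is a minimal, hypothetical C++ sketch (the node name start_waypoints is made up for illustration) that latches a single std_msgs/Empty message on /path_ready, roughly equivalent to the rostopic command above:

#include <ros/ros.h>
#include <std_msgs/Empty.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "start_waypoints");
  ros::NodeHandle nh;

  // Latch the message so follow_waypoints receives it even if it connects a bit later
  ros::Publisher pub = nh.advertise<std_msgs::Empty>("path_ready", 1, true);
  pub.publish(std_msgs::Empty());

  // Keep the node alive briefly so the latched message can be delivered
  ros::Duration(1.0).sleep();
  return 0;
}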

Setting goals from code

We can send waypoints to the navigation stack from code, too. To send a predefined list of waypoints we can simply launch the goals.launch file. This launch file starts 2 ROS nodes:

  • add_markers
  • nav_goals

Both nodes use the same array of waypoints:

float waypoints[4][3] = { 
                          {-4.25,  0.4,  3.14}, 
                          {-4.25, -4.5,  0.0},
                          { 6.5,  -4.5,  1.57},
                          { 6.5,   3.0, -1.57}  
                        };

The nav_goals node sends the waypoints to the navigation stack; after sending a goal, it waits for the result from the navigation stack before sending the next waypoint. The add_markers node creates blue cubes in RViz to indicate the waypoints.
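
For reference, here is a minimal C++ sketch of such a goal-sending loop using the standard move_base action interface. It reuses the {x, y, yaw} waypoint array above, but it is a simplified illustration, not the project's exact nav_goals.cpp:

#include <ros/ros.h>
#include <actionlib/client/simple_action_client.h>
#include <move_base_msgs/MoveBaseAction.h>
#include <tf/transform_datatypes.h>

typedef actionlib::SimpleActionClient<move_base_msgs::MoveBaseAction> MoveBaseClient;

// Same {x, y, yaw} waypoints as above
float waypoints[4][3] = { {-4.25,  0.4,  3.14},
                          {-4.25, -4.5,  0.0},
                          { 6.5,  -4.5,  1.57},
                          { 6.5,   3.0, -1.57} };

int main(int argc, char** argv)
{
  ros::init(argc, argv, "nav_goals_sketch");

  // Connect to the move_base action server
  MoveBaseClient ac("move_base", true);
  ac.waitForServer();

  for (int i = 0; i < 4; ++i)
  {
    move_base_msgs::MoveBaseGoal goal;
    goal.target_pose.header.frame_id = "map";
    goal.target_pose.header.stamp = ros::Time::now();
    goal.target_pose.pose.position.x = waypoints[i][0];
    goal.target_pose.pose.position.y = waypoints[i][1];
    // Convert the yaw angle to a quaternion
    goal.target_pose.pose.orientation = tf::createQuaternionMsgFromYaw(waypoints[i][2]);

    ROS_INFO("Sending waypoint %d", i);
    ac.sendGoal(goal);
    ac.waitForResult();  // block until move_base reports a result

    if (ac.getState() == actionlib::SimpleClientGoalState::SUCCEEDED)
      ROS_INFO("Waypoint %d reached", i);
    else
      ROS_WARN("move_base failed to reach waypoint %d", i);
  }
  return 0;
}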

(Figure: waypoint marker cubes in RViz)

TF tree and ROS graph of nodes:

TF tree of the project:

(Figure: TF tree)

rosgraph of teleoperation

(Figure: rosgraph of teleoperation)

rosgraph of navigation

(Figure: rosgraph of navigation)

Project structure:

 tree -L 3
.
├── documentation
│   ├── frames.png
│   ├── gazebo.png
│   ├── goals.png
│   ├── map.png
│   ├── robschooltutorial_oct10.pdf
│   ├── rosgraph_navigation.png
│   ├── rosgraph_teleop.png
│   └── waypoints.png
├── hurba_advanced_navigation
│   ├── CMakeLists.txt
│   ├── config
│   │   ├── base_local_planner_params.yaml
│   │   ├── costmap_common_params.yaml
│   │   ├── global_costmap_params.yaml
│   │   ├── global_planner_params.yaml
│   │   ├── lattice_global_planner_params.yaml
│   │   ├── local_costmap_params.yaml
│   │   ├── pr2.mprim
│   │   └── teb_local_planner_params.yaml
│   ├── launch
│   │   ├── bringup.launch
│   │   ├── goals.launch
│   │   ├── navigation_advanced.launch
│   │   ├── navigation_basic.launch
│   │   ├── navigation_lattice.launch
│   │   ├── robot_description.launch
│   │   ├── teleop.launch
│   │   └── world.launch
│   ├── maps
│   │   ├── map.pgm
│   │   └── map.yaml
│   ├── meshes
│   │   ├── chassis.dae
│   │   ├── chassis.SLDPRT
│   │   ├── chassis.STEP
│   │   ├── hokuyo.dae
│   │   ├── wheel.dae
│   │   ├── wheel.SLDPRT
│   │   └── wheel.STEP
│   ├── package.xml
│   ├── rviz
│   │   ├── basic_view.rviz
│   │   └── navigation.rviz
│   ├── src
│   │   ├── add_markers.cpp
│   │   └── nav_goals.cpp
│   ├── urdf
│   │   ├── hurba_mecanum.gazebo
│   │   └── hurba_mecanum.xacro
│   └── worlds
│       ├── basic_world.world
│       ├── building.model
│       ├── building.world
│       └── empty.world
└── README.md
