Simple mmWave Drone Control

Simulate mmWave-radar-based drone control in Gazebo with ROS2 and PX4

Derived from: https://github.com/nhma20/mmWave_ROS2_PX4_Gazebo

Prerequisites

Tested with:

  • Ubuntu 20.04.3 LTS
  • ROS2 Foxy
  • Gazebo 11.9.0
  • px4_ros_com (29th Nov snapshot)
  • PX4 Autopilot v1.12.3

Launch all

References:

  • https://docs.px4.io/master/en/ros/ros2_offboard_control.html
  • https://github.com/PX4/px4_ros_com/blob/master/src/examples/offboard/offboard_control.cpp

  1. If offboard_control.cpp or other files have been edited, re-run the install.sh script (add any new files to the script and to CMakeLists.txt):

    cd ~/mmWave_ROS2_PX4_Gazebo/
    ( chmod +x ./install.sh )
    ./install.sh

    If your PX4 and px4_ros_com_ros2 roots are at these default locations:

    ./install.sh ~/PX4-Autopilot/ ~/px4_ros_com_ros2/
    
  2. Launch PX4 SITL:

     cd ~/PX4-Autopilot/ 
     make px4_sitl_rtps gazebo_iris__hca_full_pylon_setup

    Without Gazebo GUI:

     HEADLESS=1 make px4_sitl_rtps gazebo_iris__hca_full_pylon_setup

    Without drone following:

     PX4_NO_FOLLOW_MODE=1 make px4_sitl_rtps gazebo_iris__hca_full_pylon_setup

    After PX4 SITL has fully launched, you might need to start the microRTPS client manually in the same terminal:

     micrortps_client start -t UDP

    The client will fail and return -1 if it is already running.

  3. Open QGroundControl

  4. In a new terminal start microRTPS agent and offboard control:

    source ~/px4_ros_com_ros2/install/setup.bash
    micrortps_agent start -t UDP & ros2 run px4_ros_com offboard_control 
  5. In another terminal, start the velocity vector advertiser, the lidar-to-mmWave converter, and the 3D-to-2D projection nodes:

    source ~/px4_ros_com_ros2/install/setup.bash
    ros2 launch ~/mmWave_ROS2_PX4_Gazebo/launch/simulate_pointcloud_control_launch.py 
  6. The simulated drone in Gazebo should arm and take off. You may need to restart the vel_ctrl_vec_pub and offboard_control nodes.

  7. Visualize simulated data in rviz2:

    rviz2 ~/mmWave_ROS2_PX4_Gazebo/3d_and_2d_pointcloud_rgb.rviz 

MISC

  1. Trajectory setpoint message: https://github.com/PX4/px4_msgs/blob/ros2/msg/TrajectorySetpoint.msg

  2. Disable the RC-loss failsafe action: pxh> param set NAV_RCL_ACT 0

    NAV_RCL_ACT: curr: 2 -> new: 0

  3. Local positioning? https://github.com/PX4/px4_msgs/blob/ros2/msg/VehicleLocalPositionSetpoint.msg

  4. Add any new ROS2 files to ~/px4_ros_com_ros2/src/px4_ros_com/CMakeLists.txt

  5. Check if drone armed? https://github.com/PX4/px4_msgs/blob/ros2/msg/ActuatorArmed.msg

  6. libignition-common3 error (after a software update?): copy an existing version of the library file and rename the copy to match the missing file.

  7. If Gazebo does not open, try running gazebo --verbose to troubleshoot. killall gzserver should kill any lingering Gazebo instances. Restart the PC if all else fails.

  8. Include both iris.sdf and iris.sdf.jinja?

  9. Laser scanner implemented with Gazebo and ROS2: https://github.com/chapulina/dolly

  10. Make a custom sensor plugin: http://gazebosim.org/tutorials?cat=guided_i&tut=guided_i5

  11. In ~/px4_ros_com_ros2/src/px4_ros_com/CMakeLists.txt add sensor_msgs under ament_target_dependencies

  12. After running ./build_ros2_workspace restart all affected executables (micrortps_agent, offboard_control, vel_vec_ctrl_pub). Gazebo PX4 SITL can be left running.

  13. iris.sdf (or other models) can be edited to include sensors, like 2D lidar.

  14. Display the simulated camera feed either with rviz2 or:

     source ~/px4_ros_com_ros2/install/setup.bash
     ros2 run image_tools showimage image:=/cable_camera/image_raw
  15. Add any new worlds/models to ~/PX4-Autopilot/platforms/posix/cmake/sitl_target.cmake (Oscar's worlds/models from https://gitlab.drones4energy.dk/obs/Drones4Energy_SDU_Only_code/-/tree/iROS2021/Tools/simulationAssets)
  16. List local packages and their messages with ros2 interface packages and, e.g., ros2 interface package px4_msgs
  17. Camera intrinsic parameters allow setting a custom perspective projection matrix (they cannot be used with WideAngleCamera, since that class stitches images from 6 different cameras to achieve a wide field of view). The focal lengths can be computed as focal_length_in_pixels = (image_width_in_pixels * 0.5) / tan(field_of_view_in_degrees * 0.5 * PI/180) (http://sdformat.org/spec?ver=1.7&elem=sensor#lens_intrinsics)
  18. Drone spawn coordinates are set in ~/PX4-Autopilot/Tools/sitl_run.sh ?
  19. *** No rule to make target '/opt/ros/foxy/lib/libfastrtps.so.2.0.2', needed by 'libpx4_msgs__rosidl_typesupport_fastrtps_cpp.so'. Stop. Fixed by renaming the closest libfastrtps.so.x.y.z to libfastrtps.so.2.0.2.
  20. Dependency errors with PX4, like ninja: error: '/usr/lib/x86_64-linux-gnu/libsdformat9.so.9.6.1', needed by 'libmav_msgs.so', missing and no known rule to make it, may be solved by a PX4 reinstall (remember that worlds, models, cmake files etc. must also be reinstalled into the new PX4).
  21. If the drone enters failsafe when starting offboard_control, param set COM_RCL_EXCEPT 4 in the PX4 console may solve this. Otherwise, try manually publishing a few setpoints to fmu/manual_control_setpoint/in and then start offboard mode.
  22. Showing videos in the readme: just drag and drop the image/video from your local PC into the GitHub readme in editable mode.
  23. If gradle is not working, you might have to downgrade Java (JDK) to 11: https://askubuntu.com/questions/1133216/downgrading-java-11-to-java-8
  24. Make sure yaw messages are received, otherwise the drone will be controlled in the global coordinate frame.
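The CMakeLists.txt edits mentioned above (new source files, sensor_msgs under ament_target_dependencies) might look roughly like this; the target name and source path below are illustrative examples, not the repo's actual layout:

```cmake
# Illustrative fragment for ~/px4_ros_com_ros2/src/px4_ros_com/CMakeLists.txt
find_package(sensor_msgs REQUIRED)

# Example target; substitute your own node's name and source path
add_executable(vel_ctrl_vec_pub src/examples/offboard/vel_ctrl_vec_pub.cpp)
ament_target_dependencies(vel_ctrl_vec_pub rclcpp px4_msgs sensor_msgs)
install(TARGETS vel_ctrl_vec_pub DESTINATION lib/${PROJECT_NAME})
```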
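The camera-intrinsics note above can be checked numerically. The image width (640 px) and field of view (80°) below are made-up example values, not taken from this simulation setup:

```python
import math

def focal_length_px(image_width_px: float, fov_deg: float) -> float:
    """focal_length_in_pixels = (image_width_in_pixels * 0.5) / tan(fov * 0.5),
    per the SDFormat lens-intrinsics formula quoted above."""
    return (image_width_px * 0.5) / math.tan(math.radians(fov_deg * 0.5))

# Example: 640 px wide image with an 80 degree horizontal FOV
print(round(focal_length_px(640, 80), 2))  # 381.36
```

The resulting value goes into the fx/fy intrinsics fields of the camera sensor's SDF.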

TODO

  1. 🟢 Install tools
  2. 🟢 Figure out how to control drone via offboard_control.cpp
  3. 🟢 Make ROS2 advertiser that generates control input for offboard_control.cpp for more advanced control
  4. 🟢 Figure out how to use simulated depth sensors
  5. 🟢 Implement depth data into ROS2 advertiser for even more advanced control
  6. 🟢 Control drone towards overhead cable


  1. 🟡 More tightly integrate with PX4 to optimize control based on e.g. drone state
    • get pose of drone to mitigate sideways motion when rotated around x or y.
    • use GPS positioning to counteract drift
  2. 🟢 Use drone-mounted simulated camera to get images of overhead cable
  3. 🟢 Visualize depth data in camera feed


  1. 🟢 Investigate occasional drone control loss
  2. 🟢 Make module that turns 2D lidar data into noisy pointcloud to prepare for mmWave integration
  3. 🟡 Tracking of points in pointcloud (Kalman?)
  4. 🟡 Implement cable detection AI to filter depth data and align drone yaw wrt. cable
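For the point-tracking item above, a minimal constant-velocity Kalman filter over a single 3D point could serve as a starting sketch. This is illustrative only, not part of this repo; it assumes numpy and made-up noise parameters:

```python
import numpy as np

class PointKalman:
    """Constant-velocity Kalman filter for one 3D point.
    State vector: [x, y, z, vx, vy, vz]."""

    def __init__(self, dt=0.1, process_var=1e-2, meas_var=1e-1):
        self.x = np.zeros(6)                      # state estimate
        self.P = np.eye(6)                        # state covariance
        self.F = np.eye(6)                        # transition: pos += vel * dt
        self.F[:3, 3:] = dt * np.eye(3)
        self.Q = process_var * np.eye(6)          # process noise (assumed)
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # measure position only
        self.R = meas_var * np.eye(3)             # measurement noise (assumed)

    def step(self, z):
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with 3D position measurement z
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]                         # filtered position
```

Feeding it noisy detections of the same physical point (e.g. from the simulated noisy pointcloud) yields a smoothed track; per-point data association across frames would still be needed on top.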


px4_ros2_launch_compressed_v2.mp4

Contributors

  • nhma20