
Comments (7)

manuelgitgomes commented on July 28, 2024

> ok, sounds good, but the link is not working ...

My bad, it is now edited to work!

from atlascar2.

manuelgitgomes commented on July 28, 2024

Hello @miguelriemoliveira!
While looking through the current simulation with @Sarxell, we found that it works by using position/velocity controllers for the various joints. One example can be seen here:

```yaml
left_steering_ctrlr:
  joint: left_steering_joint
  type: effort_controllers/JointPositionController
  pid: {p: 4.0, i: 0.0, d: 1.0}
```

These controllers publish state messages of type `control_msgs/JointControllerState`. An example can be seen here:

```yaml
---
header: 
  seq: 796
  stamp: 
    secs: 2514
    nsecs: 998000000
  frame_id: ''
set_point: 0.0
process_value: -2.8604798222531258e-05
process_value_dot: 0.0016250648709078689
error: 2.8604798222531258e-05
time_step: 0.001
command: -0.005754577306568365
p: 4.0
i: 0.0
d: 1.0
i_clamp: 0.0
antiwindup: False
```

Using `process_value`, we can estimate the position of that joint. From the positions of the steering and traction joints, we could implement a node that computes the vehicle kinematics and publishes an odometry message. This might, however, be complicated and time-consuming.
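As an illustration of the kinematics such a node would compute, here is a minimal sketch of one integration step of a kinematic bicycle model. The function name and the wheelbase value of 2.55 m are illustrative assumptions, not values taken from the atlascar2 code:

```python
import math

def update_odometry(x, y, yaw, v, steering_angle, wheelbase, dt):
    """One integration step of a kinematic bicycle model.

    x, y, yaw      -- current pose estimate in the odom frame
    v              -- longitudinal speed from the traction joint (m/s)
    steering_angle -- steering joint position (rad), e.g. the
                      process_value of the steering controller
    wheelbase      -- distance between the axles (m); value assumed here
    dt             -- time since the last update (s)
    """
    yaw_rate = v * math.tan(steering_angle) / wheelbase
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    yaw += yaw_rate * dt
    return x, y, yaw

# Driving straight at 1 m/s for one second advances x by about 1 m:
x, y, yaw = 0.0, 0.0, 0.0
for _ in range(100):
    x, y, yaw = update_odometry(x, y, yaw, v=1.0, steering_angle=0.0,
                                wheelbase=2.55, dt=0.01)
```

A real node would call this at each controller update, feeding it the two `process_value` readings, and then fill a `nav_msgs/Odometry` message from the result.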

Another option is to restructure the way the vehicle is controlled in Gazebo. There is a controller named ackermann_steering_controller, which is considerably more straightforward. An example can be seen here. An excerpt can be found here.

```yaml
# Wheel & Steer
steer_drive_controller:
  type                          : "steer_drive_controller/SteerDriveController"
  rear_wheel                    : 'rear_wheel_joint'
  front_steer                   : 'front_steer_joint'
  #publish_rate                  : 100

  pose_covariance_diagonal      : [0.00001, 0.00001, 1000000000000.0, 1000000000000.0, 1000000000000.0, 0.001]
  twist_covariance_diagonal     : [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
  wheel_separation_h_multiplier : 1.7 # calibration parameter for angular odometry
  #wheel_radius_multiplier       : 1.0 # calibration parameter for linear odometry
  steer_pos_multiplier          : 1.5
  cmd_vel_timeout               : 4
  base_frame_id                 : base_link

  enable_odom_tf: true
```

It uses only one controller and publishes the odom messages directly. The drawback, however, is that the xacro model of the robot needs to be redone; it should end up similar to this.
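For intuition on what such a controller does internally, here is a hedged sketch of the bicycle-model inverse kinematics that turn a `cmd_vel`-style command (linear.x, angular.z) into a wheel speed and steering angle. The function name and wheelbase are illustrative assumptions; the actual controller implementation may differ:

```python
import math

def twist_to_steer_drive(v, yaw_rate, wheelbase):
    """Convert a Twist-style command into the wheel speed and
    steering angle a steer-drive controller would apply, using a
    kinematic bicycle model. wheelbase (m) is an assumed parameter.
    """
    if abs(v) < 1e-6:
        # No forward motion requested: stop and keep the wheels straight
        return 0.0, 0.0
    steering_angle = math.atan(wheelbase * yaw_rate / v)
    return v, steering_angle

# Turning left at 0.2 rad/s while driving at 2 m/s:
speed, steer = twist_to_steer_drive(2.0, 0.2, wheelbase=2.55)
```

The controller then runs the forward version of the same model on the measured joint states to produce the odometry it publishes.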

I believe Sara will share another two options. Any guidance from the professor on which option we should choose would be greatly appreciated.

Thank you and have a nice weekend.


Sarxell commented on July 28, 2024

Hi @miguelriemoliveira ,
While investigating with @manuelgitgomes, I found two possible solutions, described in this link: https://answers.ros.org/question/222033/how-do-i-publish-gazebo-position-of-a-robot-model-on-odometry-topic/

The first is to create a .gazebo file and add the following plugin:

```xml
<plugin name="p3d_base_controller" filename="libgazebo_ros_p3d.so">
  <alwaysOn>true</alwaysOn>
  <updateRate>50.0</updateRate>
  <bodyName>base_link</bodyName>
  <topicName>odom</topicName>
  <gaussianNoise>0.01</gaussianNoise>
  <frameName>world</frameName>
  <xyzOffsets>0 0 0</xyzOffsets>
  <rpyOffsets>0 0 0</rpyOffsets>
</plugin>
```

This gives ground-truth odometry. This option requires us to create a .gazebo file, which is also needed in order to add the sensors to the vehicle.
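For intuition, the `<gaussianNoise>` element presumably adds zero-mean Gaussian noise with the given standard deviation to each published sample. A minimal pure-Python sketch of that assumed noise model:

```python
import random

random.seed(0)  # fixed seed so the experiment is repeatable

def noisy_measurement(true_value, stddev=0.01):
    """Additive zero-mean Gaussian noise, matching how the plugin's
    gaussianNoise parameter is assumed to perturb each sample."""
    return true_value + random.gauss(0.0, stddev)

# Averaging many noisy samples of a true x position of 30.0
# recovers the ground truth to well within the noise level:
samples = [noisy_measurement(30.0) for _ in range(10000)]
mean = sum(samples) / len(samples)
```

With a stddev of 0.01 m, individual readings wobble by about a centimetre, but the underlying signal is the exact model pose from Gazebo.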

The second is to use the `/gazebo/get_model_state` service provided by Gazebo. I already tried this method with the following code:

```python
#!/usr/bin/env python3
import rospy
from nav_msgs.msg import Odometry
from std_msgs.msg import Header
from gazebo_msgs.srv import GetModelState, GetModelStateRequest

rospy.init_node('odom_pub')

# queue_size must be an integer (it was 0.1 before, which is invalid)
odom_pub = rospy.Publisher('/atlascar2/odom', Odometry, queue_size=10)

rospy.wait_for_service('/gazebo/get_model_state')
get_model_srv = rospy.ServiceProxy('/gazebo/get_model_state', GetModelState)

odom = Odometry()
header = Header()
header.frame_id = '/odom'

model = GetModelStateRequest()
model.model_name = 'atlascar2'
# model.model_name = rospy.get_param('model', 'atlascar2')

r = rospy.Rate(2)

while not rospy.is_shutdown():
    # Ask Gazebo for the ground-truth pose and twist of the model
    result = get_model_srv(model)

    odom.pose.pose = result.pose
    odom.twist.twist = result.twist

    header.stamp = rospy.Time.now()
    odom.header = header

    odom_pub.publish(odom)

    r.sleep()
```

I added this to the bringup. Running `rostopic echo /atlascar2/odom` in the terminal gives the following values:
(Screenshots from 2022-04-02 showing the echoed odometry values.)

From these values I conclude that it must give the ground truth (in both pictures the car started at x=30.0, y=30.0), since the values published on the topic are very close to the ones presented in Gazebo.
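That comparison can be made explicit with a small helper; the function name and the 5 cm tolerance are illustrative assumptions:

```python
import math

def pose_matches_ground_truth(reported, expected, tol=0.05):
    """True if the (x, y) pose echoed on the odom topic lies within
    tol metres of the pose shown in Gazebo. tol is an assumed value."""
    dx = reported[0] - expected[0]
    dy = reported[1] - expected[1]
    return math.hypot(dx, dy) <= tol

# Reported topic values close to the known spawn pose pass the check:
ok = pose_matches_ground_truth((30.003, 29.998), (30.0, 30.0))
```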

With this, @manuelgitgomes and I have found four possible solutions, and we would appreciate some guidance on choosing the best one.

Thank you and have a nice weekend!


manuelgitgomes commented on July 28, 2024

Hello @miguelriemoliveira.
Using the first option mentioned, @Sarxell built the simulation as intended.
(Animation of the simulation.)
With this new controller, an odom frame is created and an odom message is published, like this:

```yaml
header: 
  seq: 2613
  stamp: 
    secs: 779
    nsecs: 922000000
  frame_id: "atlascar2/odom"
child_frame_id: "atlascar2/base_footprint"
pose: 
  pose: 
    position: 
      x: 5.795887926243192e-05
      y: -1.3413148689053361e-14
      z: 0.0
    orientation: 
      x: 0.0
      y: 0.0
      z: -1.232268675548781e-10
      w: 1.0
  covariance: [0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0]
twist: 
  twist: 
    linear: 
      x: 7.689264147403326e-06
      y: 0.0
      z: 0.0
    angular: 
      x: 0.0
      y: 0.0
      z: -1.648709397993257e-10
  covariance: [0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0]
```
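As an aside, the `covariance` fields above are flattened row-major 6×6 matrices over (x, y, z, roll, pitch, yaw), per the `nav_msgs/Odometry` message definition. A short sketch of how to read them:

```python
# The 36-element covariance array from the message above, reshaped
# into its row-major 6x6 form to read the per-axis variances.
flat = [0.001, 0.0, 0.0, 0.0, 0.0, 0.0,
        0.0, 0.001, 0.0, 0.0, 0.0, 0.0,
        0.0, 0.0, 1000000.0, 0.0, 0.0, 0.0,
        0.0, 0.0, 0.0, 1000000.0, 0.0, 0.0,
        0.0, 0.0, 0.0, 0.0, 1000000.0, 0.0,
        0.0, 0.0, 0.0, 0.0, 0.0, 1000.0]

matrix = [flat[r * 6:(r + 1) * 6] for r in range(6)]
diagonal = [matrix[i][i] for i in range(6)]
# x and y are trusted (variance 0.001); z, roll and pitch carry a huge
# variance (1e6) because a planar ground vehicle cannot observe them.
```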

The odometry is computed from the feedback of the simulated hardware and then published. This lets us observe the robot's movement in the odom frame, as shown in this video.
The next step is to add sensors. Which sensors should I add? A LiDAR and two cameras?


miguelriemoliveira commented on July 28, 2024

The video looks great. Congratulations @Sarxell and @manuelgitgomes .

Remind me again which is the first method?

How can I try? Do we have instructions on the readme?


manuelgitgomes commented on July 28, 2024

The first method is the built-in Ackermann controller from ros_controllers. This controller is named ackermann_steering_controller and publishes odom messages automatically.
The instructions on how to try it are in the simulation section of the readme!


miguelriemoliveira commented on July 28, 2024

> The first method is the built-in ackermann controller on ros controllers. This controller is named ackermann_steering_controller

ok, sounds good, but the link is not working ...

