lardemua / atlascar2
Core packages for the LAR DEMUA Atlas Project
Just a 2D lidar, as we used in PSR, configured with the LMS151 parameters.
In step 4 of the README:
Step 4: Plug the eduroam (UA) ethernet cable (the cable is outside the atlascar2) to the atlas computer (on the figure port).
This step is only possible when the car is stopped, so is it not necessary to plug anything in if the car is going to be driven?
@miguelriemoliveira
With the new configuration of AtlasCar2, a calibration is needed.
I also need to familiarize myself with the calibration method.
@bernardomig when we try to push something from our computers, you show up as the author! Can you solve this, buddy?
Hug,
First simulation results:
+------------+-----------------+-----------+------------------+
| Collection | top_left_camera | top_lidar | top_right_camera |
+------------+-----------------+-----------+------------------+
| 000 | 0.1937 | 0.0049 | 0.2443 |
| 001 | 0.1867 | 0.0048 | 0.2047 |
| 002 | 0.1830 | 0.0046 | 0.1886 |
| 003 | 0.1804 | 0.0053 | 0.2124 |
| 004 | 0.1636 | 0.0050 | 0.1600 |
| 005 | 0.2486 | 0.0050 | 0.1581 |
| 006 | 0.2326 | 0.0046 | 0.1627 |
| 007 | 0.4148 | 0.0046 | 0.4296 |
| 008 | 0.4355 | 0.0046 | 0.3074 |
| 009 | 0.2729 | 0.0048 | --- |
| 010 | 0.4206 | 0.0045 | --- |
| 011 | --- | 0.0051 | 0.3825 |
| 012 | 0.1768 | 0.0046 | 0.1442 |
| 013 | 0.1360 | 0.0048 | 0.1578 |
| 014 | 0.1846 | 0.0046 | 0.1794 |
| 015 | 0.2026 | 0.0047 | 0.1597 |
| 016 | 0.1971 | 0.0043 | 0.1565 |
| 018 | 0.3541 | 0.0045 | 0.3532 |
| 019 | 0.3739 | 0.0047 | 0.2982 |
| 020 | 0.2562 | 0.0053 | 0.2839 |
| 021 | 0.3813 | 0.0051 | 0.3338 |
| 022 | 0.3376 | 0.0068 | 0.3398 |
| 023 | 0.2949 | 0.0048 | 0.4139 |
| 024 | 0.1978 | 0.0046 | 0.2292 |
| 025 | 0.1515 | 0.0044 | 0.1600 |
| 026 | 0.2358 | 0.0047 | 0.1996 |
| 027 | 0.4020 | 0.0046 | 0.3835 |
| 028 | 0.3408 | 0.0063 | 0.3797 |
| 029 | 0.2387 | 0.0046 | 0.2831 |
| 030 | 0.2175 | 0.0049 | --- |
| 031 | 0.2749 | 0.0055 | 0.3365 |
| 032 | 0.4193 | 0.0045 | --- |
| 033 | --- | 0.0059 | 0.2543 |
| 034 | --- | 0.0051 | 0.2650 |
| 035 | 0.3110 | 0.0059 | 0.2568 |
| 036 | 0.4029 | 0.0048 | 0.2867 |
| 037 | 0.3256 | 0.0044 | 0.4132 |
| 038 | 0.3970 | 0.0050 | 0.4199 |
| 039 | 0.4199 | 0.0044 | 0.3728 |
| 040 | 0.3100 | 0.0048 | 0.3628 |
| 041 | 0.2439 | 0.0048 | 0.2869 |
| 042 | 0.3175 | 0.0046 | 0.2787 |
| 043 | 0.2088 | 0.0045 | 0.2980 |
| 044 | 0.2854 | 0.0043 | 0.2545 |
| 045 | 0.2989 | 0.0042 | 0.2389 |
| 046 | 0.2206 | 0.0040 | --- |
| 047 | 0.3158 | 0.0046 | 0.4558 |
| 048 | 0.2730 | 0.0044 | 0.4310 |
| 049 | 0.4202 | 0.0044 | --- |
| 050 | 0.3532 | 0.0044 | 0.3497 |
| 051 | 0.4291 | 0.0041 | 0.5014 |
| 052 | 0.4611 | 0.0041 | 0.6177 |
| 053 | 0.5453 | 0.0034 | --- |
| Averages | 0.2969 | 0.0048 | 0.2954 |
+------------+-----------------+-----------+------------------+
With -nig 0.1 0.1:
+------------+-----------------+-----------+------------------+
| Collection | top_left_camera | top_lidar | top_right_camera |
+------------+-----------------+-----------+------------------+
| 000 | 0.4363 | 0.0050 | 0.3862 |
| 001 | 0.4111 | 0.0048 | 0.3161 |
| 002 | 0.3127 | 0.0045 | 0.4280 |
| 003 | 0.9373 | 0.0054 | 1.1045 |
| 004 | 0.4321 | 0.0049 | 0.5989 |
| 005 | 0.3766 | 0.0049 | 0.3463 |
| 006 | 0.4978 | 0.0046 | 0.2823 |
| 007 | 0.5491 | 0.0047 | 0.5604 |
| 008 | 0.7537 | 0.0047 | 0.8065 |
| 009 | 0.8373 | 0.0049 | --- |
| 010 | 0.7970 | 0.0045 | --- |
| 011 | --- | 0.0052 | 0.4355 |
| 012 | 0.5426 | 0.0045 | 0.3056 |
| 013 | 0.3312 | 0.0048 | 0.2923 |
| 014 | 0.5595 | 0.0046 | 0.9921 |
| 015 | 0.3825 | 0.0047 | 0.4300 |
| 016 | 0.5029 | 0.0043 | 0.2730 |
| 018 | 0.6044 | 0.0046 | 0.4953 |
| 019 | 0.7116 | 0.0047 | 0.7428 |
| 020 | 0.9058 | 0.0053 | 0.9257 |
| 021 | 0.5208 | 0.0052 | 0.4836 |
| 022 | 0.4189 | 0.0067 | 0.6346 |
| 023 | 0.4747 | 0.0047 | 0.7903 |
| 024 | 0.4222 | 0.0046 | 0.7169 |
| 025 | 0.3550 | 0.0044 | 0.4545 |
| 026 | 0.4773 | 0.0048 | 0.3247 |
| 027 | 0.5258 | 0.0047 | 0.5350 |
| 028 | 0.4568 | 0.0062 | 0.6265 |
| 029 | 0.5782 | 0.0047 | 0.8493 |
| 030 | 1.0759 | 0.0050 | --- |
| 031 | 0.6776 | 0.0055 | 0.8193 |
| 032 | 0.7411 | 0.0044 | --- |
| 033 | --- | 0.0059 | 0.7030 |
| 034 | --- | 0.0050 | 0.9026 |
| 035 | 0.7889 | 0.0057 | 0.9841 |
| 036 | 0.7709 | 0.0048 | 0.7470 |
| 037 | 0.4891 | 0.0045 | 0.5267 |
| 038 | 0.5403 | 0.0049 | 0.5371 |
| 039 | 0.5375 | 0.0045 | 0.7909 |
| 040 | 0.6777 | 0.0048 | 1.0525 |
| 041 | 0.5495 | 0.0047 | 0.9792 |
| 042 | 0.6601 | 0.0046 | 1.0213 |
| 043 | 0.3932 | 0.0046 | 0.4955 |
| 044 | 0.5816 | 0.0043 | 0.3772 |
| 045 | 0.9315 | 0.0042 | 0.9141 |
| 046 | 1.2848 | 0.0041 | --- |
| 047 | 0.8885 | 0.0048 | 1.1998 |
| 048 | 0.7325 | 0.0045 | 1.0426 |
| 049 | 0.5398 | 0.0044 | --- |
| 050 | 0.4731 | 0.0044 | 0.5302 |
| 051 | 0.6480 | 0.0041 | 0.5870 |
| 052 | 0.5454 | 0.0041 | 0.9513 |
| 053 | 0.5795 | 0.0033 | --- |
| Averages | 0.6044 | 0.0048 | 0.6587 |
+------------+-----------------+-----------+------------------+
And here's a video!
Any idea what is wrong? @bernardomig or @tmralmeida, do you have this problem?
Hi @miguelriemoliveira ,
While testing the two odometries I came across a problem. Since the robot drifts in Gazebo, the odometry from the Ackermann controller also drifts, which over time diverges significantly from the odometry of the Python code.
The values in question:
Python odometry
pose:
pose:
position:
x: 33.52832900205622
y: 0.0
z: 0.0
orientation:
x: 0.0
y: 0.0
z: 0.0
w: 1.0
...
twist:
twist:
linear:
x: 0.0
y: 0.0
z: 0.0
angular:
x: 0.0
y: 0.0
z: 0.0
ackermann controller odometry
pose:
pose:
position:
x: 33.518611540783525
y: -0.01387301223711008
z: 0.0
orientation:
x: 0.0
y: 0.0
z: -0.00020072386327466468
w: 0.9999999798549651
...
twist:
twist:
linear:
x: -1.9479035983982484e-05
y: 0.0
z: 0.0
angular:
x: 0.0
y: 0.0
z: 4.910143044064103e-12
What should I do?
Options I am considering:
- Improve the URDF model with inertia and mass values, which is what some answers tell me to check:
https://answers.gazebosim.org//question/12952/robot-base-slowly-drifting/
ROBOTIS-GIT/turtlebot3_simulations#149
- Should I add drift to the Python code? It does not seem very feasible, but I know the real car also drifts.
- Should I leave it as is? With this, the values become imprecise in the long term...
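For reference, the positional disagreement between the two readings above can be quantified directly. A small sketch using the positions reported in this comment:

```python
import math

def odometry_drift(p1, p2):
    """Euclidean distance between two (x, y) positions, in metres."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

# Positions reported above
python_odom = (33.52832900205622, 0.0)
ackermann_odom = (33.518611540783525, -0.01387301223711008)

drift = odometry_drift(python_odom, ackermann_odom)
print(drift)  # roughly 0.017 m of drift after ~33 m travelled
```

So the disagreement is on the order of centimetres over this run, but it is unbounded in time.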
In order to test the Python node that will compute the odometry, we need to receive the speed and steering angle of the vehicle from ackermann_steering_controller.cpp.
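Once speed and steering angle are available, the Python odometry node can integrate them with a bicycle kinematic model. A minimal sketch, where the wheelbase value is a hypothetical placeholder (the real value must come from the car's specifications/URDF):

```python
import math

# Hypothetical wheelbase in metres; the real value must come from the car's URDF.
WHEELBASE = 2.55

def integrate_odometry(x, y, yaw, v, steering_angle, dt):
    """One Euler step of the bicycle kinematic model."""
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    yaw += v * math.tan(steering_angle) / WHEELBASE * dt
    return x, y, yaw

# Sanity check: driving straight at 1 m/s for 10 s should advance x by about 10 m
x, y, yaw = 0.0, 0.0, 0.0
for _ in range(1000):
    x, y, yaw = integrate_odometry(x, y, yaw, 1.0, 0.0, 0.01)
```

This is exactly the kind of dead-reckoning that accumulates error, which is why the comparison against the controller's odometry matters.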
The x axis should point to the front of the AtlasCar, and the y axis should point to the left (driver's view), so that the z axis points up. Change the atlascar xacro file. @miguelriemoliveira
Hi @tmralmeida ,
sorry but I am going to change the names from
top_camera_right -> top_right_camera
and the same for the left camera.
Hi @vitoruapt and @miguelriemoliveira ,
This is a mystery! @manuelgitgomes and I are using the AtlasCar2 with the Arduino, counting the pulses, but when we move back and forward the counter only counts in one direction, positive or negative. The problem is that the code was already working this morning! We are very confused!
The pulses seen on the oscilloscope before entering the Arduino are correct, as can be seen here:
Here are the Arduino pulses (the X and Y axes can't be changed, sorry).
Forward values:
Independently of the movement, the orange value is always high when the blue one is rising, making it indistinguishable whether we are going forward or backward.
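With a quadrature encoder, direction is recovered from which channel leads the other: at a rising edge of channel A, the level of channel B tells the direction (so if B always has the same level at A's edges, direction information is lost, which matches the symptom above). A minimal 1x decoding sketch in Python; the +1/-1 sign convention is an assumption that depends on the wiring:

```python
def quadrature_step(prev_a, a, b):
    """Return +1, -1 or 0 for one sample of quadrature channels A and B.

    On a rising edge of A, the level of B gives the direction; which level
    means "forward" depends on the wiring, so the signs here are assumptions.
    """
    if prev_a == 0 and a == 1:   # rising edge on channel A
        return 1 if b == 0 else -1
    return 0                     # no rising edge on A: no count this sample

# A leads B (one direction): at A's rising edge, B is still low  -> +1
# B leads A (other direction): at A's rising edge, B is already high -> -1
```

If the Arduino sketch only reads one channel, or reads B after it has already toggled, both directions will count the same way.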
Hello @vitoruapt and @miguelriemoliveira,
Difficulties and progress in testing the LMS151 2D LIDAR.
Hey,
In order to get better collections for calibration purposes, I'm trying to record the rosbag while changing the pattern's position and orientation in Rviz. For that matter, I see that the interactive_pattern node gets the pattern pose relative to the 'world' frame and sets the feedback pose from Rviz in the same frame. I checked the TF tree and the frame in Gazebo is 'odom':
I tried changing 'world' to 'odom' in the 3rd line of the code snippet, but it failed:
rospy.wait_for_service('/gazebo/get_model_state')
get_model_state_service = rospy.ServiceProxy('/gazebo/get_model_state', GetModelState)
pose_gazebo = get_model_state_service(model_name, 'world')  # second argument is the reference frame
status_message:
"GetModelState: reference relative_entity_name not found, did you forget to scope the body by model name?"
Since there is no world frame in Rviz either, I set the interactive marker.header.frame_id = 'odom', which works: when moving it in Rviz it also moves in Gazebo but, as expected, the poses don't match:
Any idea on how to solve this?
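Until the offset between Gazebo's 'world' frame and the Rviz 'odom' frame is compensated, the two poses will disagree by exactly that transform. A minimal 2D sketch of the frame composition involved, with an entirely hypothetical offset (the real one should come from TF):

```python
import math

def compose_2d(t_parent_child, t_child_point):
    """Compose a 2D pose (x, y, yaw) with a pose expressed in the child frame."""
    x, y, yaw = t_parent_child
    px, py, pyaw = t_child_point
    return (x + px * math.cos(yaw) - py * math.sin(yaw),
            y + px * math.sin(yaw) + py * math.cos(yaw),
            yaw + pyaw)

# Hypothetical pose of Gazebo's 'world' frame expressed in the 'odom' frame
t_odom_world = (1.0, 0.5, 0.0)
# Pattern pose as returned by /gazebo/get_model_state in the 'world' frame
t_world_pattern = (3.0, 2.0, 0.0)

# Pattern pose re-expressed in 'odom', usable as the marker feedback pose
t_odom_pattern = compose_2d(t_odom_world, t_world_pattern)
```

In practice the offset should be looked up from the TF tree rather than hard-coded.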
FYI @vitoruapt .
This is to discuss the possibility of installing the odometer in the atlascar2. @manuelgitgomes and @Sarxell you said the screws are not compatible, right?
How difficult would it be to make a new interface?
To acquire the vehicle speed and steering angle, we need to connect to the CAN-USB adapter and extract the required CAN messages. For that, a Python script is needed to acquire those messages and publish them as odometry messages.
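The decoding step can be sketched as below. The frame layout, field widths and scale factors here are entirely hypothetical placeholders; the real CAN IDs, byte order and scalings must come from the vehicle's CAN documentation:

```python
import struct

# Hypothetical scale factors (placeholders, not the real vehicle values)
SPEED_SCALE = 0.01      # assumed m/s per bit
STEERING_SCALE = 0.001  # assumed rad per bit

def decode_state_frame(data):
    """Decode speed and steering angle from a CAN payload (sketch).

    Assumes two little-endian signed 16-bit fields at the start of the frame.
    """
    raw_speed, raw_angle = struct.unpack('<hh', data[:4])
    return raw_speed * SPEED_SCALE, raw_angle * STEERING_SCALE

# Example with a synthetic payload:
speed, steering = decode_state_frame(struct.pack('<hh', 1250, -150))
# speed ~ 12.5 m/s and steering ~ -0.15 rad under the assumed scaling
```

In the ROS node, this function would run inside the subscriber callback for the raw CAN topic, and the results would be published as odometry.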
The boot is very slow
systemd-analyze
gives the following output:
Startup finished in 2.079s (kernel) + 5min 17.273s (userspace) = 5min 19.353s graphical.target reached after 5min 17.236s in userspace
Userspace is taking too long to start; I will try to fix this and document the steps I take.
Hi @manuelgitgomes and @Sarxell ,
In
https://github.com/lardemua/atlascar2#simulation
there are instructions for installing and for running. These should be in separate sections.
Hello @vitoruapt and @miguelriemoliveira,
Difficulties and progress in testing the camera.
Hey,
I'm launching collect_data from the atlascar2_calibration package this way:
roslaunch atlascar2_calibration collect_data.launch output_folder:=$ATOM_DATASETS overwrite:=true bag_rate:=0.5 bag_start:=10
Everything is working except for the labeling:
Also, the interactive markers are not showing, which might be related. Any idea where the problem might be?
Since the CAN values are too imprecise, we can use an incremental encoder to give us better results.
Hello @vitoruapt and @miguelriemoliveira,
Difficulties and progress in testing the LD-MRS LIDAR.
The idea is to first search for an existing Gazebo sensor model of the SICK LD-MRS; if we can't find one, we can develop one following this tutorial:
http://gazebosim.org/tutorials?cat=guided_i&tut=guided_i1
This should be low priority.
Any volunteers, @bernardomig @tmralmeida @danifpdra ?
can you specify this in the README please?
The sensor driver packages should be added to the atlas2-sensors folder, as submodules, namely:
Hello @danifpdra
I have this error
RuntimeError: Multiple packages found with the same name "free_space_detection":
- free_space_detection
- road_detection/free_space_detection
Multiple packages found with the same name "lms1xx":
- LMS1xx
- road_detection/laser/RCPRG_laser_drivers/lms1xx
Multiple packages found with the same name "sick_ldmrs":
- road_detection/sick_ldmrs
- sick_ldmrs
It seems your "road_detection" repository contains packages which are already there. Can you remove those packages? Or should I remove your road_detection from the catkin_ws of the atlascar2?
In order to work with odometry in simulation, odom messages need to exist.
This package should contain all the urdf/xacro files to describe the car.
See this for inspiration.
I'm having trouble connecting the Novatel SPAN-IGM-A1.
The antenna part is already done, but regarding the serial port connection the datasheet shows the following configuration:
And in the bringup we have the following code:
<launch>
<!-- Novatel node for position-related commands -->
<node name="novatel_position" pkg="nodelet" type="nodelet" args="standalone novatel_gps_driver/novatel_gps_nodelet">
<rosparam>
connection_type: serial
device: /dev/ttyUSB0
publish_novatel_positions: true
publish_novatelgnss_positions: true
publish_imu_messages: false
publish_nmea_messages: true
publish_default_messages: true
publish_diagnostics: true
publish_novatel_velocity: true
frame_id: /gps
</rosparam>
</node>
<!-- Novatel node for inertial unit commands -->
<node name="novatel_imu" pkg="nodelet" type="nodelet" args="standalone novatel_gps_driver/novatel_gps_nodelet">
<rosparam>
connection_type: serial
device: /dev/ttyUSB6
publish_novatel_positions: false
publish_novatelgnss_positions: false
publish_imu_messages: true
publish_nmea_messages: false
publish_default_messages: false
publish_diagnostics: true
frame_id: /gps
</rosparam>
</node>
....
</launch>
I'm a little confused because there are a lot of cables and I can't clearly understand what I need to connect. Here there are two serial devices, but in the diagram there is only one. Does the professor know somebody who can help me with this?
Bringup should contain launch files required to start the system.
See this for inspiration.
Following #32 and lardemua/atom#425, we found that the topic names of the atlascar2 in the simulation and in live mode (with real sensors) are not consistent.
This should be altered so that the change from simulation to live is seamless to other nodes.
We could also try to add the 2D lidar sensors, and even the four-layer SICK LD-MRS sensor, to the simulation so that we have a complete simulation of the atlascar2, even though for now we won't calibrate those sensors. For 2D lidars Gazebo already has models of the sensors (#33), but perhaps for the LD-MRS we have to create our own (#34).
You can change the topic names of the live mode or the simulation. I liked that all the topics in the simulation were under the atlascar2 namespace. Perhaps you can articulate with @Sarxell because she did the live mode.
Also, the velodyne could perhaps be called something like top_lidar3d to be more consistent with the naming of the other sensors.
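If renaming in the drivers is not practical, one way to expose the live topics under the simulation's namespace is a remap in the bringup launch files. A sketch, where the package, node type and source topic name are placeholders (the real ones must be checked in the bringup launch files):

```xml
<!-- Hypothetical example: pkg, type and the "from" topic are placeholders -->
<node name="top_right_camera" pkg="some_camera_driver" type="camera_node">
  <!-- expose the driver topic under the namespace used by the simulation -->
  <remap from="image_raw" to="/atlascar2/top_right_camera/image_raw"/>
</node>
```

This keeps the drivers untouched while making the change from simulation to live seamless to other nodes.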
In order to combat inaccurate odometry values derived from inaccurate odometry parameters, recording the CAN messages inside a bag file is mandatory. For this, the ros_canopen package was used. The commands needed were:
sudo ip link set can0 up type can bitrate 500000
rosrun socketcan_bridge socketcan_to_topic_node
the first to set up the CAN interface, and the second to publish the CAN messages to ROS. The topic published is /received_messages.
The script used to process the data is can_ros_msgs_to_ackermann.py, in which the CAN messages are subscribed and the Ackermann messages are published.