racecar's Issues
Understanding ackermann_cmd_mux
Is there any documentation for the ackermann_cmd_mux package? When I fire up the Gazebo simulation I can see a complex graph (rqt_graph). I understand that this package reads inputs from different sources (joystick, safety controller, and autonomous algorithms like the wall follower) and decides which one to use based on the joystick inputs and the priorities. The priorities for the topics are in the following order:
- /vesc/ackermann_cmd_mux/input/teleop (highest priority)
- /vesc/ackermann_cmd_mux/input/safety
- /vesc/ackermann_cmd_mux/input/navigation (lowest priority)
My understanding is based on this presentation - Racecar Basics
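The priority behavior described above can be sketched in plain Python. This is a toy model, not the actual ackermann_cmd_mux implementation; the 0.2 s timeout and the tuple layout are illustrative assumptions:

```python
def select_command(inputs, now, timeout=0.2):
    """Toy model of an ackermann_cmd_mux: return the message from the
    highest-priority input that published within `timeout` seconds.
    `inputs` is a list of (priority, last_stamp, msg) tuples; a higher
    priority number wins. Returns None if every input is stale."""
    live = [i for i in inputs if now - i[1] <= timeout]
    return max(live, key=lambda i: i[0])[2] if live else None
```

In this model, a lower-priority navigation input only drives the car while the higher-priority teleop input is silent (or its timeout has expired), which matches the observed joystick-wins behavior.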
My question
If I have to use an autonomous algorithm alongside the joystick, to which topic should the autonomous algorithm publish control commands? Currently the joystick publishes to /vesc/low_level/ackermann_cmd_mux/input/teleop. I have tried /vesc/ackermann_cmd_mux/input/navigation and /vesc/high_level/ackermann_cmd_mux/input/nav_0, but neither worked.
Background
I am currently testing a wall-following algorithm in the F1/10 simulator. My node reads the laser scans, computes PID control values to make the car drive parallel to the wall, and publishes an AckermannDriveStamped message on the topic /vesc/ackermann_cmd_mux/input/teleop. It works fine.
Now I want to be able to use the joystick, in case I have to course-correct while my wall-following algorithm is running.
So I understand that my wall-follower node cannot use the /vesc/ackermann_cmd_mux/input/teleop topic anymore, since that would give the algorithm's commands the highest priority. I changed the wall follower's topic to /vesc/ackermann_cmd_mux/input/navigation, fired up the joy node, and observed that I could only control the car with the joystick. When I switch over to autonomous operation (by holding down the right bumper, RB), the car does not respond.
Then I changed the topic to /vesc/high_level/ackermann_cmd_mux/input/nav_0 (based on sample solutions posted by the MIT Racecar course TAs). Still nothing happens.
How to navigate with Racecar
I want to do real-time navigation with the racecar. When I run the navigation packages, they publish cmd_vel data of type Twist. How can I drive the motors from this data?
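One common answer (not code from this repo) is to convert the Twist command into an Ackermann command with the kinematic bicycle model and publish that to the mux input. A minimal sketch of just the conversion math, assuming a 0.25 m wheelbase:

```python
import math

def twist_to_ackermann(linear_x, angular_z, wheelbase=0.25):
    """Convert a Twist-style command (v, omega) into an Ackermann
    (speed, steering_angle) pair using the kinematic bicycle model:
    steering = atan(wheelbase * omega / v).
    A pure rotation (v ~ 0) cannot be realized by Ackermann steering,
    so it maps to a stop."""
    if abs(linear_x) < 1e-6:
        return 0.0, 0.0
    steering = math.atan(wheelbase * angular_z / linear_x)
    return linear_x, steering
```

The resulting pair would then be wrapped in an AckermannDriveStamped message and published to the appropriate mux input topic.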
Is this project still maintained?
No commits in over a year. Is the MIT team still maintaining this project?
How is the IMU direction defined?
Based on the IMU installation, how are the axes of the IMU defined? I suspect the IMU's x axis is not aligned with the car's forward direction. When I get roll/pitch/yaw, which number tells me whether the car is going uphill or downhill?
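For reference, here is the standard ZYX Euler extraction from a unit quaternion. Whether pitch answers the uphill question depends on the mounting assumption stated in the comment, which is exactly what the issue is asking about:

```python
import math

def quat_to_rpy(x, y, z, w):
    """Standard ZYX (yaw-pitch-roll) extraction from a unit quaternion.
    Assuming REP-103 axes (x forward, y left, z up) and an IMU mounted
    with x along the car's forward direction, a positive pitch tips the
    nose down, so pitch < 0 while driving uphill."""
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return roll, pitch, yaw
```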
enertion focbox
This device is very hard to find now. Can any other VESC replace it?
VESC configuration
Hi,
Looking at the racecar/config/racecar-v2/vesc.yaml file, I noticed the following lines:
# car wheelbase is about 25cm
wheelbase: .25
which I believe is incorrect: the wheelbase is the distance between the centers of the front and rear wheels, which is about 32 cm for a Traxxas 1/10 Rally Car. The official specs page confirms this.
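The value likely matters for more than documentation: if the wheelbase feeds a bicycle-model odometry calculation, the estimated yaw rate scales inversely with it. A quick check (the speed and steering values are hypothetical):

```python
import math

def yaw_rate(speed, steering_angle, wheelbase):
    """Kinematic bicycle model: omega = v * tan(delta) / L."""
    return speed * math.tan(steering_angle) / wheelbase

# hypothetical command: 2 m/s with 0.2 rad of steering
w_config = yaw_rate(2.0, 0.2, 0.25)    # configured wheelbase
w_actual = yaw_rate(2.0, 0.2, 0.32)    # measured wheelbase
# the 0.25 m setting overestimates the yaw rate by 0.32/0.25 = 28%
```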
How to train the car on new course
Hi, is there any documentation on how to collect training samples and train the car for a new course?
Need to center the steering
I built a Racecar J and could use teleop to drive it. I found, however, that my car always turns slightly right, even when I try to drive it straight. I think the steering is not centered. Is there anything I can tune with the BLDC tool, or do I have to adjust the car's mechanical parts?
How to calibrate the car to meet the specification
I am wondering if there is any specification for the sensors, car odometry, etc. that we can use to evaluate the status of the car and calibrate it. I built an MIT Racecar, but when I ran gmapping in my garage, which has simple features, the SLAM algorithm could not generate a good map. In some cases it could not match the lidar points of the wall after the car turned, even after bundle adjustment or loop closure (whichever it uses). The same happened with Google Cartographer. So I believe it must be something with the car.
how to collect training video?
Can the car be controlled by the original 2-channel radio system? If not, how do you drive the car to collect training video for a new course?
IMU message orientation to world
I have a question about the IMU message from the MIT Racecar package. Is the orientation quaternion in the message relative to the world frame or to the car's base_link frame?
The VESC throttle controls should probably be normalized to [-1, 1]
Currently, increasing the top speed of the robot requires changing vesc.yaml, joy.yaml, and any other nodes that create throttle commands. Normalizing the speed controls would fix this.
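A sketch of what the normalization could look like. The function name and the max_erpm value are made up for illustration; the real conversion parameters live in vesc.yaml:

```python
def throttle_to_erpm(cmd, max_erpm=20000.0):
    """Map a normalized throttle command in [-1, 1] to a VESC ERPM
    setpoint. Out-of-range inputs are clamped. With this scheme,
    raising the top speed means editing max_erpm in one place
    instead of touching every node that publishes throttle."""
    cmd = max(-1.0, min(1.0, cmd))
    return cmd * max_erpm
```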
Move steering angle clipping further upstream
According to the VESC config, the radian angles max out around ±0.264. Currently, clipping downstream in the VESC makes Gazebo sad. The clipping should move upstream in the node architecture.
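The upstream clamp itself is trivial; the open question is where in the node graph it belongs. A minimal sketch, with the limit taken from the ±0.264 rad figure above:

```python
def clamp_steering(angle, limit=0.264):
    """Clamp a steering command to the servo's physical range before
    it is forwarded to the VESC driver or the Gazebo plugin, so
    downstream consumers never see an unrealizable angle."""
    return max(-limit, min(limit, angle))
```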
Acceleration ramping math needs to be checked/tuned
Because stopping right now is really slow, probably because the RPMs are hitting some max. Alternatively, apply different gains for acceleration and deceleration.
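The second idea amounts to an asymmetric slew-rate limiter, sketched below. The gain values are placeholders, and this simplified version ignores the sign conventions of reversing:

```python
def ramp_speed(current, target, dt, accel=2.0, decel=6.0):
    """One control step of an asymmetric speed ramp: the setpoint may
    rise by at most accel*dt per step but fall by up to decel*dt,
    so braking responds faster than accelerating."""
    if target > current:
        return min(target, current + accel * dt)
    return max(target, current - decel * dt)
```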
How to compute the ackermann control
I have built an MIT RACECAR and am using it for a fixed-track race. I have my own particle filter, which can localize the car. Once I know where I am, I can use pure pursuit to find the next target on the track. However, I do not know how to compute the Ackermann control parameters. For example, I am currently at location (x1, y1) and my near-term target is (x2, y2). How can I compute the speed and steering angle from these two locations?
Since the MIT RACECAR has been raced in the tunnel, there must be code like this somewhere. Could anyone point me to the algorithm?
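I can't point to the MIT code, but the standard pure-pursuit steering law is easy to sketch. A minimal version, assuming a 0.25 m wheelbase and that the car's heading is known from the particle filter:

```python
import math

def pure_pursuit_steering(x1, y1, heading, x2, y2, wheelbase=0.25):
    """Pure pursuit: express the target in the car frame, then apply
    delta = atan(2 * L * sin(alpha) / d), where alpha is the bearing
    to the target relative to the heading and d is the lookahead
    distance to the target."""
    dx, dy = x2 - x1, y2 - y1
    d = math.hypot(dx, dy)
    alpha = math.atan2(dy, dx) - heading
    return math.atan2(2.0 * wheelbase * math.sin(alpha), d)
```

Speed is then a separate choice, e.g. a constant race speed scaled down as the magnitude of the steering angle grows.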
My MIT-Racecar can not start with LiPO battery
I used to use NiMH and everything worked fine. To get more power, however, I replaced the NiMH battery with a 4S LiPo. After that, I found that my car had difficulty starting. Please see the video here: https://www.youtube.com/watch?v=D1XGa-vANRY I am not sure how to fix it, or why I did not have the same problem with the NiMH battery. Has anyone used a LiPo battery and experienced the same problem?
Car can not turn
I have a very strange problem I hope you can help me with. My MIT Racecar can go forward and backward, but it cannot turn. Neither teleop nor ROS commands can make it turn. When I try to turn the front wheels by hand, they feel very stiff but turnable. I am wondering what's going on and how I can debug it. Is the servo bad? Is there any way to check with the BLDC tool?
IMU code error
I think there is a bug in the IMU node.
I used RViz to visualize the /imu/data message. When I rotate the car's orientation by 90 degrees, the axis (or yaw angle) in RViz rotates 360 degrees.
Has anyone seen the same?
Why do you use lidar, stereo, rgb-d cameras?
Why do you use lidar, stereo, and RGB-D cameras? I'm curious for what reason you use all of them.
run_camera launch parameter is never used?
Where did the ZED camera node from racecar-v1 go? Assuming this repo is maintained at all, does RACECAR still use a camera? It looks like both the ZED node and the Structure.io camera are missing from this codebase.
ROS node for Structure.io camera?
The hardware BOM includes a structure.io active stereo camera, but the actual rosinstall doesn't include any reference to a ROS node for controlling or publishing data from that camera. Is it no longer used on RACECAR?