zhefan-xu / cerlab-uav-autonomy
[CMU] A Versatile and Modular Framework Designed for Autonomous Unmanned Aerial Vehicles (UAVs) (C++/ROS/PX4)
License: MIT License
/usr/bin/ld: /home/new_drone_ws/devel/lib/libonboard_detector.so: undefined reference to `cv::Mat::Mat()'
collect2: error: ld returned 1 exit status
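A common cause of this error is that the library target is never linked against OpenCV. As a sketch (assuming a standard catkin/CMake setup; the target name `onboard_detector` is inferred from the library named in the error), adding OpenCV to the link line usually resolves it:

```cmake
# CMakeLists.txt of the package that builds libonboard_detector.so
find_package(OpenCV REQUIRED)
include_directories(${OpenCV_INCLUDE_DIRS})

# Link the OpenCV libraries into the target that fails to link.
target_link_libraries(onboard_detector ${OpenCV_LIBS})
```

If multiple OpenCV installations coexist (e.g. a source build for CUDA plus the ROS one), also check that `find_package` picks up the same version the rest of the workspace was compiled against.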
Why is it that when I run the PX4 model simulation with moving-obstacle navigation, RViz shows colored obstacles clustered closely around the TF frame, but there are no obstacles in Gazebo? This really confuses me.
Hello, this is a great project and I have been studying it recently. If I want to connect a LiDAR such as the Mid360 to the navigation stack, which configuration parameters need to change, besides setting sensor_input_mode to 1 in mapping_param.yaml? Thanks.
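For reference, the change mentioned in the question would look like this in mapping_param.yaml (a minimal sketch; only sensor_input_mode is taken from the question, and the meaning of the value is as stated there):

```yaml
# mapping_param.yaml (fragment)
# Set to 1 to feed the mapping module from a LiDAR point cloud
# (e.g. a Mid360) instead of the default depth-camera input.
sensor_input_mode: 1
```

Beyond this switch, the sensor's topic names, extrinsics, and range limits in the same file would typically also need to match the new LiDAR.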
Hi,
For real flight, I set desired_velocity to 5 and desired_acceleration to 1.5, but the actual velocity is less than 1 m/s. Modifying these parameters works as expected in simulation. Can you give me some advice?
Thank you.
Hi, thanks for your interesting work.
I am now following your work and conducting real-world experiments based on PX4, using VINS-Fusion for localization. I have some problems I would like to ask you about.
To be on the safe side, I entered offboard mode a different way during real flights: the source code keeps trying to switch into offboard mode, so I could not intervene manually if the UAV went out of control. I therefore commented out the following part of flightBase.cpp:
if (this->mavrosState_.mode != "OFFBOARD" && (ros::Time::now() - lastRequest > ros::Duration(5.0))){
    if (this->setModeClient_.call(offboardMode) && offboardMode.response.mode_sent){
        cout << "[AutoFlight]: Offboard mode enabled." << endl;
    }
    lastRequest = ros::Time::now();
} else {
    if (!this->mavrosState_.armed && (ros::Time::now() - lastRequest > ros::Duration(5.0))){
        if (this->armClient_.call(armCmd) && armCmd.response.success){
            cout << "[AutoFlight]: Vehicle armed." << endl;
        }
        lastRequest = ros::Time::now();
    }
}
And in dynamicNavigation.cpp:
this->takeoff();
However, after the UAV took off, I switched to offboard mode and set the drone's waypoint through RViz, but the program did not seem to start planning: the terminal never printed [AutoFlight]: Replan for new goal position. The whole program seems to be stuck in an endless loop in the following code:
while (ros::ok() and not this->isReach(ps)){
    currTime = ros::Time::now();
    double t = (currTime - startTime).toSec();
    if (t >= endTime){
        psT = ps;
    }
    else{
        double currYawTgt = yawCurr + (double) direction * t/endTime * yawDiffAbs;
        geometry_msgs::Quaternion quatT = AutoFlight::quaternion_from_rpy(0, 0, currYawTgt);
        psT.pose.orientation = quatT;
    }
    // this->updateTarget(psT);
    target.position.x = psT.pose.position.x;
    target.position.y = psT.pose.position.y;
    target.position.z = psT.pose.position.z;
    target.yaw = AutoFlight::rpy_from_quaternion(psT.pose.orientation);
    this->updateTargetWithState(target);
    // cout << "here" << endl;
    ros::spinOnce();
    r.sleep();
}
I have completed the simulation experiments based on PX4, and I changed the parameters in the configuration file for my real depth camera. Compared with the simulation setup, where do you think I might have gone wrong?
Another minor issue: when I start RViz, it sometimes seems impossible to visualize the created dynamic map, even though the /dynamic_map/inflated_voxel_map topic keeps receiving messages.
Thank you for your precious time!
Hi, thanks very much for your work! It's really impressive.
I want to run the project with PX4 in simulation, and I followed the instructions in README.md, but I ran into two problems.
The first is that when I run the command roslaunch remote_control dynamic_navigation_rviz.launch, the RobotModel item in RViz shows an error. I tried to fix it by adding the following lines to px4_start.launch, though I am not sure whether this is correct:
<param name="robot_description" command="cat '$(find uav_simulator)/urdf/quadcopter.urdf'" />
<node pkg="tf" type="static_transform_publisher" name="base_link_to_map" args="0.0 0.0 0 0.0 0.0 0.0 /base_link /map 40" />
The second is that nothing happens after I run the command roslaunch autonomous_flight dynamic_navigation.launch and operate in RViz. In other words, the navigation simulation cannot run successfully.
Could you help me? Thanks.
Hello! Thanks for this great work!
I have read the paper "Onboard dynamic-object detection and tracking for autonomous robot navigation with RGB-D camera".
I am confused about the section "D. Data Association and Tracking".
The original article says: "Instead of directly using the previous obstacle's feature, we apply the linear propagation to get the predicted obstacle's position and replace the previous obstacle's position with the predicted position in the feature vector."
We want to match the obstacles in the previous frame with the obstacles in the current frame. Why not directly use the features of the previous frame? Could you please explain it?
Best
yzy
I'm trying to build VINS-Fusion with GPU support using OpenCV 4.6.0, but I'm encountering some difficulties. Specifically, I'm unsure about the correct configuration and build steps required to enable GPU acceleration with OpenCV 4.6.0.
I've already followed the installation instructions for VINS-Fusion and have successfully built it without GPU support. However, I'm not sure how to incorporate OpenCV 4.6.0 with CUDA support into the build process to enable GPU acceleration.
Could someone provide guidance or pointers on how to correctly configure and build VINS-Fusion with GPU support using OpenCV 4.6.0?
Environment:
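As a rough sketch of the question above (not an official VINS-Fusion procedure), a CUDA-enabled OpenCV 4.6.0 is typically configured like this; the directory layout and the CUDA_ARCH_BIN value are assumptions for illustration:

```shell
# Assumed layout: opencv-4.6.0/ and opencv_contrib-4.6.0/ side by side.
# Set CUDA_ARCH_BIN to your GPU's compute capability (7.5 is only an example).
cd opencv-4.6.0 && mkdir -p build && cd build
cmake .. \
  -D CMAKE_BUILD_TYPE=Release \
  -D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib-4.6.0/modules \
  -D WITH_CUDA=ON \
  -D CUDA_ARCH_BIN=7.5 \
  -D WITH_CUBLAS=ON
make -j"$(nproc)"
sudo make install
```

The workspace then has to be rebuilt so that its `find_package(OpenCV ...)` resolves to this CUDA build rather than the stock ROS OpenCV.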
Hi, @Zhefan-Xu, thanks for your outstanding work.
I encountered another problem during real flight: the voxel update for dynamic obstacles seems too slow, causing the UAV to pause easily in front of them, and the trajectory does not seem to be re-planned. Could you give me some advice?
Hello,
I am using the dynamic corridor, but I notice that the walker only registers collisions at its bottom part. How can I check for collision?
How can I make the drone fly upward over an obstacle when it encounters one?
Can the obstacles' movement be set to a non-uniform speed? Roughly how should I modify this, and would it be very difficult?