mynt-eye-vins-fusion-samples's Introduction

MYNT-EYE-VINS-FUSION

Prerequisites

1. Ubuntu and ROS and Hardware

Ubuntu 64-bit 16.04 or 18.04. ROS Kinetic or Melodic. ROS Installation

To compile with docker, we recommend more than 16 GB of RAM, or ensure that RAM plus virtual memory (swap) together exceed 16 GB.
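A quick way to check this requirement on Linux is to sum physical RAM and swap from /proc/meminfo (this helper script is not part of the original instructions; the 16 GB threshold mirrors the recommendation above):

```shell
#!/bin/sh
# Sum MemTotal and SwapTotal (both reported in kB) from /proc/meminfo.
total_kb=$(awk '/^(MemTotal|SwapTotal):/ {sum += $2} END {print sum}' /proc/meminfo)
if [ "$total_kb" -ge $((16 * 1024 * 1024)) ]; then
    echo "OK: RAM + swap = ${total_kb} kB"
else
    echo "WARNING: RAM + swap is under 16 GB; the docker build may fail"
fi
```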

2. Install docker

sudo apt-get update
sudo apt-get install \
    apt-transport-https \
    ca-certificates \
    curl \
    gnupg-agent \
    software-properties-common
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
sudo add-apt-repository \
   "deb [arch=amd64] https://download.docker.com/linux/ubuntu \
   $(lsb_release -cs) \
   stable"
sudo apt-get update
sudo apt-get install docker-ce docker-ce-cli containerd.io    

Then add your account to the docker group with sudo usermod -aG docker $YOUR_USER_NAME. Relaunch the terminal, or log out and log back in, if you get a Permission denied error.
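To confirm the group change took effect after re-logging in, you can check your group membership before trying docker (a small convenience check, not part of the original instructions):

```shell
#!/bin/sh
# Print whether the current user is already in the docker group.
if id -nG "$(id -un)" | grep -qw docker; then
    echo "docker group: yes"
else
    echo "docker group: no (run: sudo usermod -aG docker \$USER, then re-login)"
fi
```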

3. Install MYNTEYE SDK

Skip this step if the SDK is already installed. S-SDK Installation, D-SDK Installation

4. Build docker image

cd path/to/this_repo/docker

make build

Run vins-fusion with mynteye-s

Stereo+imu fusion

cd path/to/MYNT-EYE-S-SDK
source wrappers/ros/devel/setup.bash
roslaunch mynt_eye_ros_wrapper vins_fusion.launch

Open another terminal

cd path/to/this_repo/docker
./run.sh mynteye-s/mynt_stereo_imu_config.yaml

Run vins-fusion with mynteye-d

Mono+imu fusion

cd path/to/MYNT-EYE-D-SDK
source wrappers/ros/devel/setup.bash
roslaunch mynteye_wrapper_d vins_fusion.launch  stream_mode:=1 # stereo camera with 640x480

Open another terminal

cd path/to/this_repo/docker
./run.sh mynteye-d/mynt_mono_config.yaml

Stereo fusion

cd path/to/MYNT-EYE-D-SDK
source wrappers/ros/devel/setup.bash
roslaunch mynteye_wrapper_d vins_fusion.launch  stream_mode:=1 # stereo camera with 640x480

Open another terminal

cd path/to/this_repo/docker
./run.sh mynteye-d/mynt_stereo_config.yaml

Stereo+imu fusion

cd path/to/MYNT-EYE-D-SDK
source wrappers/ros/devel/setup.bash
roslaunch mynteye_wrapper_d vins_fusion.launch  stream_mode:=1 # stereo camera with 640x480

Open another terminal

cd path/to/this_repo/docker
./run.sh mynteye-d/mynt_stereo_imu_config.yaml

Run vins-fusion with mynteye-s2100

Stereo fusion

cd path/to/MYNT-EYE-S-SDK
source wrappers/ros/devel/setup.bash
roslaunch mynt_eye_ros_wrapper vins_fusion.launch

Open another terminal

cd path/to/this_repo/docker
./run.sh mynteye-s2100/mynt_stereo_config.yaml

Stereo+imu fusion

cd path/to/MYNT-EYE-S-SDK
source wrappers/ros/devel/setup.bash
roslaunch mynt_eye_ros_wrapper vins_fusion.launch

Open another terminal

cd path/to/this_repo/docker
./run.sh mynteye-s2100/mynt_stereo_imu_config.yaml

Run vins-fusion with mynteye-s2110

Stereo fusion

cd path/to/MYNT-EYE-S-SDK
source wrappers/ros/devel/setup.bash
roslaunch mynt_eye_ros_wrapper vins_fusion.launch

Open another terminal

cd path/to/this_repo/docker
./run.sh mynteye-s2110/mynt_stereo_config.yaml

Stereo+imu fusion

cd path/to/MYNT-EYE-S-SDK
source wrappers/ros/devel/setup.bash
roslaunch mynt_eye_ros_wrapper vins_fusion.launch

Open another terminal

cd path/to/this_repo/docker
./run.sh mynteye-s2110/mynt_stereo_imu_config.yaml

Tips

1. If you want to use our SLAM device case with loop fusion, run run.sh like this:

./run.sh -l mynteye-s/mynt_stereo_imu_config.yaml


2. When you have executed the above steps correctly, you will find 2 files generated in the target config dir:

device_params_left.yaml  (the left camera calib info)
device_params_right.yaml  (the right camera calib info)

If you want to calibrate the IMU TF yourself, be careful. If you get an unsatisfactory result, check the calibration data in these files.

3. If you want to use another config.yml in this project, pay attention to the following parameters in the config.yml:

use_mynteye_adapter: 1  (1: use the mynteye calib adapter automatically; 0/null: do not use the mynteye calib adapter)
mynteye_imu_srv: "d/s1/s2" (d: use the D-series IMU extrinsic ROS service; s1: use the S1 IMU extrinsic ROS service; s2: use the S2 IMU extrinsic ROS service; ""/null: do not use a mynteye IMU extrinsic ROS service)
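For example, a config.yml for an S1-series device might set these two adapter parameters as follows (the surrounding keys are omitted; the values shown are illustrative, not a recommendation):

```yaml
use_mynteye_adapter: 1   # pull calibration from the device automatically
mynteye_imu_srv: "s1"    # query IMU extrinsics via the S1 ROS service
```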

VINS-Fusion

An optimization-based multi-sensor state estimator

VINS-Fusion is an optimization-based multi-sensor state estimator, which achieves accurate self-localization for autonomous applications (drones, cars, and AR/VR). VINS-Fusion is an extension of VINS-Mono, which supports multiple visual-inertial sensor types (mono camera + IMU, stereo cameras + IMU, even stereo cameras only). We also show a toy example of fusing VINS with GPS. Features:

  • multiple sensors support (stereo cameras / mono camera+IMU / stereo cameras+IMU)
  • online spatial calibration (transformation between camera and IMU)
  • online temporal calibration (time offset between camera and IMU)
  • visual loop closure

We are the top-ranked open-source stereo algorithm on the KITTI Odometry Benchmark (12 Jan 2019).

Authors: Tong Qin, Shaozu Cao, Jie Pan, Peiliang Li, and Shaojie Shen from the Aerial Robotics Group, HKUST

Videos:

VINS

Related Papers: (the papers do not exactly match the code)

  • Online Temporal Calibration for Monocular Visual-Inertial Systems, Tong Qin, Shaojie Shen, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS, 2018), best student paper award pdf

  • VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator, Tong Qin, Peiliang Li, Shaojie Shen, IEEE Transactions on Robotics pdf

If you use VINS-Fusion for your academic research, please cite our related papers. bib

1. Prerequisites

1.1 Ubuntu and ROS

Ubuntu 64-bit 16.04 or 18.04. ROS Kinetic or Melodic. ROS Installation

1.2. Ceres Solver

Follow Ceres Installation.

2. Build VINS-Fusion

Clone the repository and catkin_make:

    cd ~/catkin_ws/src
    git clone https://github.com/HKUST-Aerial-Robotics/VINS-Fusion.git
    cd ../
    catkin_make
    source ~/catkin_ws/devel/setup.bash

(if this step fails, try another computer with a clean system, or reinstall Ubuntu and ROS)

3. EuRoC Example

Download the EuRoC MAV Dataset to YOUR_DATASET_FOLDER. Taking MH_01 as an example, you can run VINS-Fusion with three sensor types (monocular camera + IMU, stereo cameras + IMU, and stereo cameras only). Open four terminals; run vins odometry, visual loop closure (optional), and rviz, and play the bag file, respectively. The green path is VIO odometry; the red path is odometry under visual loop closure.

3.1 Monocular camera + IMU

    roslaunch vins vins_rviz.launch
    rosrun vins vins_node ~/catkin_ws/src/VINS-Fusion/config/euroc/euroc_mono_imu_config.yaml 
    (optional) rosrun loop_fusion loop_fusion_node ~/catkin_ws/src/VINS-Fusion/config/euroc/euroc_mono_imu_config.yaml 
    rosbag play YOUR_DATASET_FOLDER/MH_01_easy.bag

3.2 Stereo cameras + IMU

    roslaunch vins vins_rviz.launch
    rosrun vins vins_node ~/catkin_ws/src/VINS-Fusion/config/euroc/euroc_stereo_imu_config.yaml 
    (optional) rosrun loop_fusion loop_fusion_node ~/catkin_ws/src/VINS-Fusion/config/euroc/euroc_stereo_imu_config.yaml 
    rosbag play YOUR_DATASET_FOLDER/MH_01_easy.bag

3.3 Stereo cameras

    roslaunch vins vins_rviz.launch
    rosrun vins vins_node ~/catkin_ws/src/VINS-Fusion/config/euroc/euroc_stereo_config.yaml 
    (optional) rosrun loop_fusion loop_fusion_node ~/catkin_ws/src/VINS-Fusion/config/euroc/euroc_stereo_config.yaml 
    rosbag play YOUR_DATASET_FOLDER/MH_01_easy.bag

4. KITTI Example

4.1 KITTI Odometry (Stereo)

Download the KITTI Odometry dataset to YOUR_DATASET_FOLDER. Taking sequence 00 as an example, open two terminals and run vins and rviz respectively. (We evaluated odometry on the KITTI benchmark without the loop closure function.)

    roslaunch vins vins_rviz.launch
    (optional) rosrun loop_fusion loop_fusion_node ~/catkin_ws/src/VINS-Fusion/config/kitti_odom/kitti_config00-02.yaml
    rosrun vins kitti_odom_test ~/catkin_ws/src/VINS-Fusion/config/kitti_odom/kitti_config00-02.yaml YOUR_DATASET_FOLDER/sequences/00/ 

4.2 KITTI GPS Fusion (Stereo + GPS)

Download the KITTI raw dataset to YOUR_DATASET_FOLDER. Taking 2011_10_03_drive_0027_synced as an example, open three terminals and run vins, global fusion, and rviz respectively. The green path is VIO odometry; the blue path is odometry under GPS global fusion.

    roslaunch vins vins_rviz.launch
    rosrun vins kitti_gps_test ~/catkin_ws/src/VINS-Fusion/config/kitti_raw/kitti_10_03_config.yaml YOUR_DATASET_FOLDER/2011_10_03_drive_0027_sync/ 
    rosrun global_fusion global_fusion_node

5. VINS-Fusion on car demonstration

Download the car bag to YOUR_DATASET_FOLDER. Open four terminals; run vins odometry, visual loop closure (optional), and rviz, and play the bag file, respectively. The green path is VIO odometry; the red path is odometry under visual loop closure.

    roslaunch vins vins_rviz.launch
    rosrun vins vins_node ~/catkin_ws/src/VINS-Fusion/config/vi_car/vi_car.yaml 
    (optional) rosrun loop_fusion loop_fusion_node ~/catkin_ws/src/VINS-Fusion/config/vi_car/vi_car.yaml 
    rosbag play YOUR_DATASET_FOLDER/car.bag

6. Run with your devices

VIO is not only a software algorithm; it also relies heavily on hardware quality. For beginners, we recommend running VIO with professional equipment that has global-shutter cameras and hardware synchronization.

6.1 Configuration file

Write a config file for your device. You can take the config files of EuRoC and KITTI as examples.
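As a sketch, a minimal stereo+IMU config follows the same layout as the EuRoC config files in this repo. The key names below appear in the EuRoC configs; the topic names and calibration file names are placeholders you must replace with your own:

```yaml
imu: 1                           # use an IMU
num_of_cam: 2                    # stereo
imu_topic: "/your_imu/data_raw"  # placeholder topic names
image0_topic: "/your_cam/left/image_raw"
image1_topic: "/your_cam/right/image_raw"
cam0_calib: "cam0.yaml"          # per-camera intrinsics, see section 6.2
cam1_calib: "cam1.yaml"
image_width: 752
image_height: 480
estimate_extrinsic: 1            # optimize camera-IMU extrinsics around an initial guess
estimate_td: 1                   # online temporal calibration
td: 0.0                          # initial camera-IMU time offset
```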

6.2 Camera calibration

VINS-Fusion supports several camera models (pinhole, mei, equidistant). You can use the camera_models package to calibrate your cameras. We put some example data under /camera_models/calibrationdata to show how to calibrate.

cd ~/catkin_ws/src/VINS-Fusion/camera_models/camera_calib_example/
rosrun camera_models Calibrations -w 12 -h 8 -s 80 -i calibrationdata --camera-model pinhole

7. Docker Support

To further facilitate the building process, we add docker support to our code. A docker environment is like a sandbox, which makes our code environment-independent. To run with docker, first make sure ROS and docker are installed on your machine. Then add your account to the docker group by sudo usermod -aG docker $YOUR_USER_NAME (relaunch the terminal, or log out and log back in, if you get a Permission denied error), and type:

cd ~/catkin_ws/src/VINS-Fusion/docker
make build

Note that the docker build may take a while depending on your network and machine. After VINS-Fusion is successfully built, you can run the vins estimator with the script run.sh. run.sh takes several flags and arguments: -k means KITTI, -l means loop fusion, and -g means global fusion. You can get usage details with ./run.sh -h. Here are some examples:

# EuRoC monocular camera + IMU
./run.sh ~/catkin_ws/src/VINS-Fusion/config/euroc/euroc_mono_imu_config.yaml

# EuRoC stereo cameras + IMU with loop fusion
./run.sh -l ~/catkin_ws/src/VINS-Fusion/config/euroc/euroc_stereo_imu_config.yaml

# KITTI Odometry (Stereo)
./run.sh -k ~/catkin_ws/src/VINS-Fusion/config/kitti_odom/kitti_config00-02.yaml YOUR_DATASET_FOLDER/sequences/00/

# KITTI Odometry (Stereo) with loop fusion
./run.sh -kl ~/catkin_ws/src/VINS-Fusion/config/kitti_odom/kitti_config00-02.yaml YOUR_DATASET_FOLDER/sequences/00/

#  KITTI GPS Fusion (Stereo + GPS)
./run.sh -kg ~/catkin_ws/src/VINS-Fusion/config/kitti_raw/kitti_10_03_config.yaml YOUR_DATASET_FOLDER/2011_10_03_drive_0027_sync/

In the EuRoC cases, you need to open another terminal and play your bag file. If you modify the code, simply re-run ./run.sh with the proper arguments after your changes.

8. Acknowledgements

We use Ceres Solver for non-linear optimization, DBoW2 for loop detection, a generic camera model, and GeographicLib.

9. License

The source code is released under GPLv3 license.

We are still working on improving the code reliability. For any technical issues, please contact Tong Qin <qintonguavATgmail.com>.

For commercial inquiries, please contact Shaojie Shen <eeshaojieATust.hk>.

mynt-eye-vins-fusion-samples's People

Contributors

harjeb, peiliangli, pjrambo, qintonguav, shaozu, tinyslik


mynt-eye-vins-fusion-samples's Issues

data stuck after launch success

Hi, I followed the instructions at https://zhuanlan.zhihu.com/p/62988961 and succeeded in launching the wrapper and VINS-Fusion. But after that, it gets stuck at the data input of vins-fusion. Any help would be appreciated!
crazydrone@MSI:~$ roslaunch mynt_eye_ros_wrapper vins_fusion.launch
... logging to /home/crazydrone/.ros/log/ed16ae0c-6b40-11e9-bc57-9cb6d068231f/roslaunch-MSI-23242.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://MSI:39823/

SUMMARY

PARAMETERS

  • /mynteye/depth/image_raw/disable_pub_plugins: ['image_transport...
  • /mynteye/disparity/image_norm/disable_pub_plugins: ['image_transport...
  • /mynteye/disparity/image_raw/disable_pub_plugins: ['image_transport...
  • /mynteye/left/image_mono/disable_pub_plugins: ['image_transport...
  • /mynteye/left/image_raw/disable_pub_plugins: ['image_transport...
  • /mynteye/left_rect/image_rect/disable_pub_plugins: ['image_transport...
  • /mynteye/mynteye_wrapper_node/base_frame_id: mynteye_link
  • /mynteye/mynteye_wrapper_node/depth_frame_id: mynteye_depth_frame
  • /mynteye/mynteye_wrapper_node/depth_topic: depth/image_raw
  • /mynteye/mynteye_wrapper_node/disparity_computing_method: 1
  • /mynteye/mynteye_wrapper_node/disparity_frame_id: mynteye_disparity...
  • /mynteye/mynteye_wrapper_node/disparity_norm_frame_id: mynteye_disparity...
  • /mynteye/mynteye_wrapper_node/disparity_norm_topic: disparity/image_norm
  • /mynteye/mynteye_wrapper_node/disparity_topic: disparity/image_raw
  • /mynteye/mynteye_wrapper_node/enable_depth: False
  • /mynteye/mynteye_wrapper_node/enable_disparity: False
  • /mynteye/mynteye_wrapper_node/enable_disparity_norm: False
  • /mynteye/mynteye_wrapper_node/enable_left_rect: False
  • /mynteye/mynteye_wrapper_node/enable_points: False
  • /mynteye/mynteye_wrapper_node/enable_right_rect: False
  • /mynteye/mynteye_wrapper_node/gravity: 9.8
  • /mynteye/mynteye_wrapper_node/imu_topic: imu/data_raw
  • /mynteye/mynteye_wrapper_node/left_frame_id: mynteye_left_frame
  • /mynteye/mynteye_wrapper_node/left_mono_topic: left/image_mono
  • /mynteye/mynteye_wrapper_node/left_rect_frame_id: mynteye_left_rect...
  • /mynteye/mynteye_wrapper_node/left_rect_topic: left_rect/image_rect
  • /mynteye/mynteye_wrapper_node/left_topic: left/image_raw
  • /mynteye/mynteye_wrapper_node/mesh_file: S1030-0315.obj
  • /mynteye/mynteye_wrapper_node/points_frame_id: mynteye_points_frame
  • /mynteye/mynteye_wrapper_node/points_topic: points/data_raw
  • /mynteye/mynteye_wrapper_node/right_frame_id: mynteye_right_frame
  • /mynteye/mynteye_wrapper_node/right_mono_topic: right/image_mono
  • /mynteye/mynteye_wrapper_node/right_rect_frame_id: mynteye_right_rec...
  • /mynteye/mynteye_wrapper_node/right_rect_topic: right_rect/image_...
  • /mynteye/mynteye_wrapper_node/right_topic: right/image_raw
  • /mynteye/mynteye_wrapper_node/sstandard/gain: -1
  • /mynteye/mynteye_wrapper_node/standard/accel_range: -1
  • /mynteye/mynteye_wrapper_node/standard/brightness: -1
  • /mynteye/mynteye_wrapper_node/standard/contrast: -1
  • /mynteye/mynteye_wrapper_node/standard/desired_brightness: -1
  • /mynteye/mynteye_wrapper_node/standard/exposure_mode: -1
  • /mynteye/mynteye_wrapper_node/standard/frame_rate: 20
  • /mynteye/mynteye_wrapper_node/standard/gyro_range: -1
  • /mynteye/mynteye_wrapper_node/standard/hdr_mode: -1
  • /mynteye/mynteye_wrapper_node/standard/imu_frequency: -1
  • /mynteye/mynteye_wrapper_node/standard/ir_control: 80
  • /mynteye/mynteye_wrapper_node/standard/max_exposure_time: -1
  • /mynteye/mynteye_wrapper_node/standard/max_gain: -1
  • /mynteye/mynteye_wrapper_node/standard/request_index: 0
  • /mynteye/mynteye_wrapper_node/standard2/accel_low_filter: -1
  • /mynteye/mynteye_wrapper_node/standard2/accel_range: -1
  • /mynteye/mynteye_wrapper_node/standard2/brightness: -1
  • /mynteye/mynteye_wrapper_node/standard2/desired_brightness: -1
  • /mynteye/mynteye_wrapper_node/standard2/exposure_mode: -1
  • /mynteye/mynteye_wrapper_node/standard2/gyro_low_filter: -1
  • /mynteye/mynteye_wrapper_node/standard2/gyro_range: -1
  • /mynteye/mynteye_wrapper_node/standard2/imu_process_mode: 2
  • /mynteye/mynteye_wrapper_node/standard2/ir_control: 80
  • /mynteye/mynteye_wrapper_node/standard2/max_exposure_time: -1
  • /mynteye/mynteye_wrapper_node/standard2/max_gain: -1
  • /mynteye/mynteye_wrapper_node/standard2/min_exposure_time: -1
  • /mynteye/mynteye_wrapper_node/standard2/request_index: 1
  • /mynteye/mynteye_wrapper_node/standard210a/accel_low_filter: -1
  • /mynteye/mynteye_wrapper_node/standard210a/accel_range: -1
  • /mynteye/mynteye_wrapper_node/standard210a/brightness: -1
  • /mynteye/mynteye_wrapper_node/standard210a/desired_brightness: -1
  • /mynteye/mynteye_wrapper_node/standard210a/exposure_mode: -1
  • /mynteye/mynteye_wrapper_node/standard210a/gyro_low_filter: -1
  • /mynteye/mynteye_wrapper_node/standard210a/gyro_range: -1
  • /mynteye/mynteye_wrapper_node/standard210a/iic_address_setting: -1
  • /mynteye/mynteye_wrapper_node/standard210a/max_exposure_time: -1
  • /mynteye/mynteye_wrapper_node/standard210a/max_gain: -1
  • /mynteye/mynteye_wrapper_node/standard210a/min_exposure_time: -1
  • /mynteye/mynteye_wrapper_node/temperature_frame_id: mynteye_temperatu...
  • /mynteye/mynteye_wrapper_node/temperature_topic: temperature/data_raw
  • /mynteye/right/image_mono/disable_pub_plugins: ['image_transport...
  • /mynteye/right/image_raw/disable_pub_plugins: ['image_transport...
  • /mynteye/right_rect/image_rect/disable_pub_plugins: ['image_transport...
  • /rosdistro: kinetic
  • /rosversion: 1.12.13

NODES
/mynteye/
mynteye_wrapper_node (mynt_eye_ros_wrapper/mynteye_wrapper_node)

auto-starting new master
process[master]: started with pid [23252]
ROS_MASTER_URI=http://localhost:11311

setting /run_id to ed16ae0c-6b40-11e9-bc57-9cb6d068231f
process[rosout-1]: started with pid [23265]
started core service [/rosout]
process[mynteye/mynteye_wrapper_node-2]: started with pid [23282]
[ INFO] [1556626222.412926559]: Initializing nodelet with 4 worker threads.
[ INFO] [1556626222.427231856]: Detecting MYNT EYE devices
W/uvc-v4l2.cc:246 xu_control_query failed error 5, Input/output error
W/uvc-v4l2.cc:554 xu_control_range query min failed
W/uvc-v4l2.cc:246 xu_control_query failed error 5, Input/output error
W/uvc-v4l2.cc:560 xu_control_range query max failed
W/uvc-v4l2.cc:246 xu_control_query failed error 5, Input/output error
W/uvc-v4l2.cc:566 xu_control_range query def failed
W/channels.cc:750 Get XuControlInfo of Option::IIC_ADDRESS_SETTING failed
[ INFO] [1556626223.497846458]: MYNT EYE devices:
[ INFO] [1556626223.497931274]: index: 0, name: MYNT-EYE-S2100, serial number: 031613380009072B
[ INFO] [1556626223.499098023]: Only one MYNT EYE device, select index: 0
I/synthetic.cc:59 camera calib model: kannala_brandt
[ INFO] [1556626223.817030384]: Option::BRIGHTNESS: -1
[ INFO] [1556626223.818381460]: Option::EXPOSURE_MODE: 0
[ INFO] [1556626223.819583433]: Option::MAX_GAIN: 8
[ INFO] [1556626223.820628347]: Option::MAX_EXPOSURE_TIME: 333
[ INFO] [1556626223.821741327]: Option::MIN_EXPOSURE_TIME: 0
[ INFO] [1556626223.822939602]: Option::DESIRED_BRIGHTNESS: 122
[ INFO] [1556626223.824004712]: Option::IR_CONTROL: 0
[ INFO] [1556626223.825045402]: Option::ACCELEROMETER_RANGE: 24
[ INFO] [1556626223.826153560]: Option::GYROSCOPE_RANGE: 1000
[ INFO] [1556626223.827222839]: Option::ACCELEROMETER_LOW_PASS_FILTER: 2
[ INFO] [1556626223.828621199]: Option::GYROSCOPE_LOW_PASS_FILTER: 64
[ INFO] [1556626223.849745580]: Advertized on topic left/image_raw
[ INFO] [1556626223.866225571]: Advertized on topic right/image_raw
[ INFO] [1556626223.883849202]: Advertized on topic left_rect/image_rect
[ INFO] [1556626223.901313665]: Advertized on topic right_rect/image_rect
[ INFO] [1556626223.918063064]: Advertized on topic disparity/image_raw
[ INFO] [1556626223.934494461]: Advertized on topic disparity/image_norm
[ INFO] [1556626223.951393494]: Advertized on topic depth/image_raw
[ INFO] [1556626223.952109280]: Advertized on topic points/data_raw
[ INFO] [1556626223.969010742]: Advertized on topic left/image_mono
[ INFO] [1556626223.985551614]: Advertized on topic right/image_mono
[ INFO] [1556626224.010323271]: Advertized on topic left_rect_mono
[ INFO] [1556626224.032982618]: Advertized on topic right_rect_mono
[ INFO] [1556626224.034536810]: Advertized on topic imu/data_raw
[ INFO] [1556626224.035248955]: Advertized on topic temperature/data_raw
[ INFO] [1556626224.042265724]: Advertized service get_info

~/catkin_workspace/3dmap_ws$ roslaunch vins mynteye-s-stereo.launch
... logging to /home/crazydrone/.ros/log/ed16ae0c-6b40-11e9-bc57-9cb6d068231f/roslaunch-MSI-27793.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://MSI:34073/

SUMMARY

PARAMETERS

  • /rosdistro: kinetic
  • /rosversion: 1.12.13
  • /vins_estimator/config_file: /home/crazydrone/...

NODES
/
rvizvisualisation (rviz/rviz)
vins_estimator (vins/vins_s_node)

ROS_MASTER_URI=http://localhost:11311

process[vins_estimator-1]: started with pid [27810]
[ INFO] [1556628275.247913427]: init begins
[ INFO] [1556628275.254431250]: load config_file: /home/crazydrone/catkin_workspace/3dmap_ws/src/MYNT-EYE-VINS-FUSION-Samples/vins_estimator/../config/mynteye-s/mynt_stereo_config.yaml

intrinsics:
{
"calib_model": "kannala_brandt",
"left": {
"width": 640,
"height": 400,
"coeffs": [ 0.5195596380168128, 0.37192647923901784, -0.8385899580200927, 0.3749881330049656, 201.49684831112688, 201.51504069699257, 316.19443185449023, 209.16229019313494 ]
},
"right": {
"width": 640,
"height": 400,
"coeffs": [ 0.5170296576317934, 0.364793349069549, -0.8233055446005737, 0.3670331677440168, 201.66390840610484, 201.64727724978655, 311.9091339675582, 196.30104268289054 ]
}
}
extrinsics:
{
"rotation": [ 0.9999888470957896, 0.0008352119425218027, 0.004648451897617464, -0.0008288927904443533, 0.9999987300748857, -0.0013611705836868596, -0.004649582860358961, 0.0013573023344170634, 0.9999882694859964 ],
"translation": [ -79.92830091219254, 0.18873111693487854, 0.5542568910008752 ]
}
USE_IMU: 0
result path /home/zhangs/vins_fusion//vio.csv
[ WARN] [1556628275.259322304]: Optimize extrinsic param around initial guess!
camera number 2
[ INFO] [1556628275.358865627]: Synchronized sensors, fix time offset: 0
[ INFO] [1556628275.358894545]: ROW: 480 COL: 752
no imu, fix extrinsic param; no time offset calibration
exitrinsic cam 0
-0.0030181 -0.999995 -0.00105709
0.999985 -0.00301316 -0.00464304
0.00463983 -0.00107109 0.999989
0.000915411 -0.0457025 0.0174575
exitrinsic cam 1
-0.00119192 -0.999968 -0.00786637
0.999932 -0.00128321 0.0116108
-0.0116205 -0.007852 0.999902
0.000162065 0.0761914 0.014611
set g 0 0 9.8
[ INFO] [1556628275.359041991]: reading paramerter of camera /home/crazydrone/catkin_workspace/3dmap_ws/src/MYNT-EYE-VINS-FUSION-Samples/vins_estimator/../config/mynteye-s/device_params_left.yaml
[ INFO] [1556628275.359214348]: reading paramerter of camera /home/crazydrone/catkin_workspace/3dmap_ws/src/MYNT-EYE-VINS-FUSION-Samples/vins_estimator/../config/mynteye-s/device_params_right.yaml
MULTIPLE_THREAD is 1
[ WARN] [1556628275.359321706]: waiting for image and imu...
process[rvizvisualisation-2]: started with pid [27868]
feature tracking not enough, please slowly move you device!

sudo usermod -a -G docker $USER

First, the formatting of this command in the readme.md should be fixed, like this:
sudo usermod -a -G docker $USER

But sometimes you also need to run another command:
sudo chmod 666 /var/run/docker.sock
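If docker commands still fail with Permission denied, the following quick check (a diagnostic sketch, not part of the issue thread) shows whether the group fix or the socket-permission workaround applies:

```shell
#!/bin/sh
# Show the docker socket's permissions and the current user's groups,
# which indicates whether usermod or the chmod workaround is needed.
ls -l /var/run/docker.sock 2>/dev/null || echo "docker socket not found"
id -nG
```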
