
mynt-eye-okvis-sample's Introduction

MYNTEYE OKVIS

First, determine whether your device is a MYNT EYE-S or MYNT EYE-D model, then follow the corresponding installation process: Install with MYNT-EYE-S-SDK or Install with MYNT-EYE-D-SDK.

Install with MYNT-EYE-S-SDK

  1. Download and install the MYNT-EYE-S-SDK.
  2. Install dependencies and build MYNT-EYE-OKVIS-Sample following the procedure of the original OKVIS here.
  3. Update the camera parameters here.
  4. Run OKVIS using the MYNT EYE camera.

Install MYNTEYE OKVIS

First install the dependencies required by the original OKVIS, and then run the following commands:

git clone -b mynteye https://github.com/slightech/MYNT-EYE-OKVIS-Sample.git
cd MYNT-EYE-OKVIS-Sample/
mkdir build && cd build
cmake ..
make

Get camera calibration parameters

Through the GetIntrinsics() and GetExtrinsics() functions of the MYNT-EYE-S-SDK API, you can get the camera calibration parameters of the currently open device. Follow these steps:

cd MYNT-EYE-S-SDK
./samples/_output/bin/tutorials/get_img_params

After running the commands above, the pinhole distortion parameters and camera intrinsics are obtained; update them here in the following format. Note that only the first four values of coeffs should be filled into distortion_coefficients.

distortion_coefficients: [coeffs]
focal_length: [fx, fy]
principal_point: [cx, cy]
distortion_type: radialtangential
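As a quick illustration of the "first four coefficients" rule above, here is a small Python sketch; the coefficient values are made up for illustration and are not from a real device:

```python
# Hypothetical distortion coefficients as printed by get_img_params,
# in (k1, k2, p1, p2, k3) order; the values here are illustrative only.
coeffs = [-0.29, 0.08, 0.0002, -0.0001, 0.0]

# OKVIS's radialtangential model expects only k1, k2, p1, p2,
# so keep just the first four entries.
distortion_coefficients = coeffs[:4]

print("distortion_coefficients:", distortion_coefficients)
```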

Run MYNTEYE OKVIS

Go to the MYNT-EYE-OKVIS-Sample/bin folder and run the application okvis_app_mynteye_s:

cd MYNT-EYE-OKVIS-Sample/bin
./okvis_app_mynteye_s ../config/config_mynteye_s.yaml


Install with MYNT-EYE-D-SDK

  1. Download and install the MYNT-EYE-D-SDK.
  2. Install dependencies and build MYNT-EYE-OKVIS-Sample following the procedure of the original OKVIS.
  3. Update the camera parameters here.
  4. Run OKVIS using the MYNT EYE depth camera.

Install MYNTEYE OKVIS

First install the dependencies required by the original OKVIS, and then run the following commands:

git clone -b mynteye https://github.com/slightech/MYNT-EYE-OKVIS-Sample.git
cd MYNT-EYE-OKVIS-Sample/
mkdir build && cd build
cmake ..
make

Get calibration parameters

Through the MYNT-EYE-D-SDK API, you can get the camera and IMU calibration parameters of the currently open device. Follow these steps:

cd MYNT-EYE-D-SDK
./samples/_output/bin/get_img_params
./samples/_output/bin/get_imu_params

After running the commands above, the pinhole distortion parameters and camera intrinsics are obtained; update them here in the following format. Note that only the first four values of coeffs should be filled into distortion_coefficients.

distortion_coefficients: [coeffs]
focal_length: [fx, fy]
principal_point: [cx, cy]
distortion_type: radialtangential

Run MYNTEYE OKVIS_ROS

Run camera mynteye_wrapper_d

cd MYNT-EYE-D-SDK
source wrappers/ros/devel/setup.bash
roslaunch mynteye_wrapper_d mynteye.launch

To run MYNT-EYE-OKVIS-Sample, open another terminal and follow these steps:

cd MYNT-EYE-OKVIS-Sample/build
source devel/setup.bash
roslaunch okvis_ros mynteye_d.launch

Then use rviz to display the result:

cd ~/catkin_okvis/src/MYNT-EYE-OKVIS-Sample/config
rosrun rviz rviz -d rviz.rviz

HEALTH WARNING: calibration

If you would like to run the software/library on your own hardware setup, be aware that good results (or results at all) may only be obtained with appropriate calibration of the

  • camera intrinsics,
  • camera extrinsics (poses relative to the IMU),
  • knowledge about the IMU noise parameters,
  • and ACCURATE TIME SYNCHRONISATION OF ALL SENSORS.

To perform a calibration yourself, we recommend the following:

Get Kalibr by following the installation instructions at https://github.com/ethz-asl/kalibr/wiki/installation. If you decide to build from source and you run ROS Indigo, check out pull request 3:

  git fetch origin pull/3/head:request3
  git checkout request3

Follow https://github.com/ethz-asl/kalibr/wiki/multiple-camera-calibration to calibrate the intrinsic and extrinsic parameters of the cameras. If you receive an error message that the tool was unable to make an initial guess on the focal length, make sure that your recorded dataset contains frames that have the whole calibration target in view.

Follow https://github.com/ethz-asl/kalibr/wiki/camera-imu-calibration to get estimates for the spatial parameters of the cameras with respect to the IMU.

README {#mainpage}

Welcome to OKVIS: Open Keyframe-based Visual-Inertial SLAM.

This is the authors' implementation of [1] and [3], with more results in [2].

[1] Stefan Leutenegger, Simon Lynen, Michael Bosse, Roland Siegwart and Paul Timothy Furgale. Keyframe-based visual–inertial odometry using nonlinear optimization. The International Journal of Robotics Research, 2015.

[2] Stefan Leutenegger. Unmanned Solar Airplanes: Design and Algorithms for Efficient and Robust Autonomous Operation. Doctoral dissertation, 2014.

[3] Stefan Leutenegger, Paul Timothy Furgale, Vincent Rabaud, Margarita Chli, Kurt Konolige, Roland Siegwart. Keyframe-Based Visual-Inertial SLAM using Nonlinear Optimization. In Proceedings of Robotics: Science and Systems, 2013.

Note that the codebase that you are provided here is free of charge and without any warranty. This is bleeding edge research software.

Also note that the quaternion standard has been adapted to match Eigen/ROS, thus some related mathematical description in [1,2,3] will not match the implementation here.

If you publish work that relates to this software, please cite at least [1].

How do I get set up?

This is a catkin package that wraps the pure CMake project.

You will need to install the following dependencies,

  • ROS (currently supported: hydro, indigo and jade). Read the instructions in http://wiki.ros.org/indigo/Installation/Ubuntu. You will need the additional package pcl-ros as (assuming indigo)

      sudo apt-get install ros-indigo-pcl-ros
    
  • google-glog + gflags,

      sudo apt-get install libgoogle-glog-dev
    
  • The following should get installed through ROS anyway:

      sudo apt-get install libatlas-base-dev libeigen3-dev libsuitesparse-dev
      sudo apt-get install libopencv-dev libboost-dev libboost-filesystem-dev
    
  • Optional: use the package with the Skybotix VI sensor. Note that this requires a system install, not just a ROS package install. Also note that Skybotix OSX support is experimental (check out the feature/osx branch).

      git clone https://github.com/ethz-asl/libvisensor.git
      cd libvisensor
      ./install_libvisensor.sh
    

then download and expand the archive into your catkin workspace:

wget https://www.doc.ic.ac.uk/~sleutene/software/okvis_ros-1.1.3.zip
unzip okvis_ros-1.1.3.zip && rm okvis_ros-1.1.3.zip

Or, clone the repository from github into your catkin workspace:

git clone --recursive [email protected]:ethz-asl/okvis_ros.git

or

git clone --recursive https://github.com/ethz-asl/okvis_ros.git

Building the project

From the catkin workspace root, type

catkin_make

You will find a demo application in okvis_apps. It can process datasets in the ASL/ETH format.

In order to run a minimal working example, follow the steps below:

  1. Download a dataset of your choice from http://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets. Assuming you downloaded MH_01_easy/. You will find a corresponding calibration / estimator configuration in the okvis/config folder.

  2. Run the app as

     ./okvis_apps path/to/okvis_ros/okvis/config/config_fpga_p2_euroc.yaml path/to/MH_01_easy/
    

You can also run a dataset processing ros node that will publish topics that can be visualized with rviz

rosrun okvis_ros okvis_node_synchronous path/to/okvis_ros/okvis/config/config_fpga_p2_euroc.yaml path/to/MH_01_easy/

Use the rviz.rviz configuration in the okvis_ros/config/ directory to get the pose / landmark display.

If you want to run the live application connecting to a sensor, use the okvis_node application (modify the launch file launch/okvis_node.launch).

Outputs and frames

In terms of coordinate frames and notation,

  • W denotes the OKVIS World frame (z up),
  • C_i denotes the i-th camera frame,
  • S denotes the IMU sensor frame,
  • B denotes a (user-specified) body frame.

The output of the okvis library is the pose T_WS as a position r_WS and quaternion q_WS, followed by the velocity in World frame v_W and gyro biases (b_g) as well as accelerometer biases (b_a). See the example application to get an idea on how to use the estimator and its outputs (callbacks returning states).
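To illustrate how the published quaternion q_WS relates to a rotation matrix, here is a small dependency-free Python sketch. It assumes the Eigen/ROS Hamilton convention with (x, y, z, w) component ordering mentioned earlier in this README; the helper function is illustrative and not part of the OKVIS API:

```python
# Sketch: rotation matrix of a unit quaternion, assuming the
# Eigen/ROS Hamilton convention with (x, y, z, w) ordering.
# This helper is illustrative, not part of the OKVIS API.

def quat_to_rot(x, y, z, w):
    """Return the 3x3 rotation matrix of a unit quaternion."""
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ]

# Sanity check: the identity quaternion gives the identity rotation.
R = quat_to_rot(0.0, 0.0, 0.0, 1.0)
print(R)
```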

The okvis_node ROS application will publish a configurable state -- see just below.

Configuration files

The okvis/config folder contains example configuration files. Please read the documentation of the individual parameters in the yaml file carefully. You have various options to trade off accuracy and computational expense, as well as to enable online calibration. You also have various options concerning what will get published -- in particular, whether or not landmarks should be published (it may be important to turn this off for on-board operation). Moreover, you can specify how the body frame is specified (T_BS) or define a custom World frame. In other words, the final pose published will be T_Wc_B = T_Wc_W * T_WS * T_BS^(-1). You have the option to express the velocity as well as the rotation rates in either B, S, or Wc.
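To make the pose composition T_Wc_B = T_Wc_W * T_WS * T_BS^(-1) concrete, here is a dependency-free Python sketch using plain 4x4 homogeneous matrices. The helper functions and the sample T_WS values are purely illustrative, not part of the OKVIS API:

```python
# Sketch of the final published pose composition
# T_Wc_B = T_Wc_W * T_WS * T_BS^(-1), with 4x4 nested lists.

def matmul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def inverse_rigid(t):
    """Invert a rigid transform [R | p; 0 0 0 1]: inverse is [R^T | -R^T p]."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]                 # R^T
    p = [-sum(r[i][k] * t[k][3] for k in range(3)) for i in range(3)]   # -R^T p
    return [r[0] + [p[0]], r[1] + [p[1]], r[2] + [p[2]],
            [0.0, 0.0, 0.0, 1.0]]

I4 = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]

# Illustrative pose: with identity T_Wc_W and T_BS, the published
# pose T_Wc_B reduces to T_WS.
T_WS = [[1.0, 0.0, 0.0, 0.5],
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 1.2],
        [0.0, 0.0, 0.0, 1.0]]
T_Wc_B = matmul(matmul(I4, T_WS), inverse_rigid(I4))
print(T_Wc_B == T_WS)  # prints True
```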

HEALTH WARNING: calibration

If you would like to run the software/library on your own hardware setup, be aware that good results (or results at all) may only be obtained with appropriate calibration of the

  • camera intrinsics,
  • camera extrinsics (poses relative to the IMU),
  • knowledge about the IMU noise parameters,
  • and ACCURATE TIME SYNCHRONISATION OF ALL SENSORS.

To perform a calibration yourself, we recommend the Kalibr procedure described in the HEALTH WARNING section above.

Contribution guidelines

Support

The developers will be happy to assist you or to consider bug reports / feature requests. But questions that can be answered by reading this document will be ignored. Please contact [email protected].

mynt-eye-okvis-sample's People

Contributors

sleutenegger


mynt-eye-okvis-sample's Issues

make error

Scanning dependencies of target okvis_frontend
[ 76%] Building CXX object okvis_frontend/CMakeFiles/okvis_frontend.dir/src/stereo_triangulation.cpp.o
[ 77%] Building CXX object okvis_frontend/CMakeFiles/okvis_frontend.dir/src/Frontend.cpp.o
[ 78%] Building CXX object okvis_frontend/CMakeFiles/okvis_frontend.dir/src/VioKeyframeWindowMatchingAlgorithm.cpp.o
[ 80%] Building CXX object okvis_frontend/CMakeFiles/okvis_frontend.dir/src/ProbabilisticStereoTriangulator.cpp.o
[ 81%] Building CXX object okvis_frontend/CMakeFiles/okvis_frontend.dir/src/FrameNoncentralAbsoluteAdapter.cpp.o
In file included from /usr/include/eigen3/Eigen/LU:29:0,
from /usr/include/eigen3/Eigen/Dense:2,
from /home/rosfun/code/MYNT-EYE-OKVIS-Sample/build/include/ceres/internal/numeric_diff.h:39,
from /home/rosfun/code/MYNT-EYE-OKVIS-Sample/build/include/ceres/dynamic_numeric_diff_cost_function.h:70,
from /home/rosfun/code/MYNT-EYE-OKVIS-Sample/build/include/ceres/ceres.h:47,
from /home/rosfun/code/MYNT-EYE-OKVIS-Sample/okvis_ceres/include/okvis/Estimator.hpp:48,
from /home/rosfun/code/MYNT-EYE-OKVIS-Sample/okvis_frontend/include/okvis/VioKeyframeWindowMatchingAlgorithm.hpp:48,
from /home/rosfun/code/MYNT-EYE-OKVIS-Sample/okvis_frontend/src/VioKeyframeWindowMatchingAlgorithm.cpp:41:
/usr/include/eigen3/Eigen/src/LU/PartialPivLU.h: In instantiation of ‘class Eigen::MatrixBase<Eigen::Block<const Eigen::Map<Eigen::Matrix<double, 3, 1> >, 3, 1, true> >’:
/usr/include/eigen3/Eigen/src/Core/MapBase.h:27:34: required from ‘class Eigen::MapBase<Eigen::Block<const Eigen::Map<Eigen::Matrix<double, 3, 1> >, 3, 1, true>, 0>’
/usr/include/eigen3/Eigen/src/Core/Block.h:333:7: required from ‘class Eigen::internal::BlockImpl_dense<const Eigen::Map<Eigen::Matrix<double, 3, 1> >, 3, 1, true, true>’
/usr/include/eigen3/Eigen/src/Core/Block.h:155:7: required from ‘class Eigen::BlockImpl<const Eigen::Map<Eigen::Matrix<double, 3, 1> >, 3, 1, true, Eigen::Dense>’
/usr/include/eigen3/Eigen/src/Core/Block.h:104:81: required from ‘class Eigen::Block<const Eigen::Map<Eigen::Matrix<double, 3, 1> >, 3, 1, true>’
/usr/include/eigen3/Eigen/src/Core/ProductEvaluators.h:516:76: required from ‘const CoeffReturnType Eigen::internal::product_evaluator<Eigen::Product<Lhs, Rhs, 1>, ProductTag, Eigen::DenseShape, Eigen::DenseShape>::coeff(Eigen::Index) const [with Lhs = Eigen::Transpose<const Eigen::Matrix<double, 3, 3> >; Rhs = Eigen::Map<Eigen::Matrix<double, 3, 1> >; int ProductTag = 3; typename Eigen::internal::traits<typename Eigen::Product<Lhs, Rhs, 1>::Rhs>::Scalar = double; typename Eigen::internal::traits<typename Eigen::Product<Lhs, Rhs, 1>::Lhs>::Scalar = double; Eigen::internal::product_evaluator<Eigen::Product<Lhs, Rhs, 1>, ProductTag, Eigen::DenseShape, Eigen::DenseShape>::CoeffReturnType = double; Eigen::Index = long int]’
/usr/include/eigen3/Eigen/src/Core/AssignEvaluator.h:577:5: [ skipping 13 instantiation contexts, use -ftemplate-backtrace-limit=0 to disable ]
/usr/include/eigen3/Eigen/src/Core/AssignEvaluator.h:790:31: required from ‘static void Eigen::internal::Assignment<DstXprType, SrcXprType, Functor, Eigen::internal::Dense2Dense, Scalar>::run(DstXprType&, const SrcXprType&, const Functor&) [with DstXprType = Eigen::Matrix<double, 3, 1>; SrcXprType = Eigen::CwiseUnaryOp<Eigen::internal::scalar_opposite_op, const Eigen::Product<Eigen::Transpose<const Eigen::Matrix<double, 3, 3> >, Eigen::Map<Eigen::Matrix<double, 3, 1> >, 0> >; Functor = Eigen::internal::assign_op; Scalar = double]’
/usr/include/eigen3/Eigen/src/Core/AssignEvaluator.h:747:49: required from ‘void Eigen::internal::call_assignment_no_alias(Dst&, const Src&, const Func&) [with Dst = Eigen::Matrix<double, 3, 1>; Src = Eigen::CwiseUnaryOp<Eigen::internal::scalar_opposite_op, const Eigen::Product<Eigen::Transpose<const Eigen::Matrix<double, 3, 3> >, Eigen::Map<Eigen::Matrix<double, 3, 1> >, 0> >; Func = Eigen::internal::assign_op]’
/usr/include/eigen3/Eigen/src/Core/PlainObjectBase.h:700:41: required from ‘Derived& Eigen::PlainObjectBase::_set_noalias(const Eigen::DenseBase&) [with OtherDerived = Eigen::CwiseUnaryOp<Eigen::internal::scalar_opposite_op, const Eigen::Product<Eigen::Transpose<const Eigen::Matrix<double, 3, 3> >, Eigen::Map<Eigen::Matrix<double, 3, 1> >, 0> >; Derived = Eigen::Matrix<double, 3, 1>]’
/usr/include/eigen3/Eigen/src/Core/PlainObjectBase.h:510:19: required from ‘Eigen::PlainObjectBase::PlainObjectBase(const Eigen::DenseBase&) [with OtherDerived = Eigen::CwiseUnaryOp<Eigen::internal::scalar_opposite_op, const Eigen::Product<Eigen::Transpose<const Eigen::Matrix<double, 3, 3> >, Eigen::Map<Eigen::Matrix<double, 3, 1> >, 0> >; Derived = Eigen::Matrix<double, 3, 1>]’
/usr/include/eigen3/Eigen/src/Core/Matrix.h:379:29: required from ‘Eigen::Matrix<_Scalar, _Rows, _Cols, _Options, _MaxRows, _MaxCols>::Matrix(const Eigen::EigenBase&) [with OtherDerived = Eigen::CwiseUnaryOp<Eigen::internal::scalar_opposite_op, const Eigen::Product<Eigen::Transpose<const Eigen::Matrix<double, 3, 3> >, Eigen::Map<Eigen::Matrix<double, 3, 1> >, 0> >; _Scalar = double; int _Rows = 3; int _Cols = 1; int _Options = 0; int _MaxRows = 3; int _MaxCols = 1]’
/home/rosfun/code/MYNT-EYE-OKVIS-Sample/okvis_kinematics/include/okvis/kinematics/implementation/Transformation.hpp:172:61: required from here
/usr/include/eigen3/Eigen/src/LU/PartialPivLU.h:538:1: internal compiler error: Segmentation fault
MatrixBase::partialPivLu() const
^
Please submit a full bug report,
with preprocessed source if appropriate.
See file:///usr/share/doc/gcc-5/README.Bugs for instructions.
okvis_frontend/CMakeFiles/okvis_frontend.dir/build.make:86: recipe for target 'okvis_frontend/CMakeFiles/okvis_frontend.dir/src/VioKeyframeWindowMatchingAlgorithm.cpp.o' failed
make[2]: *** [okvis_frontend/CMakeFiles/okvis_frontend.dir/src/VioKeyframeWindowMatchingAlgorithm.cpp.o] Error 1
make[2]: *** Waiting for unfinished jobs....
CMakeFiles/Makefile2:855: recipe for target 'okvis_frontend/CMakeFiles/okvis_frontend.dir/all' failed
make[1]: *** [okvis_frontend/CMakeFiles/okvis_frontend.dir/all] Error 2
Makefile:127: recipe for target 'all' failed
make: *** [all] Error 2

error: ‘function’ is not a member of ‘std’;

I met the following problem:
/home/slam/Documents/MYNT-EYE-OKVIS-Sample/build/okvis/brisk/src/brisk_external/src/brisk-feature-detector.cc:61:8: error: ‘function’ is not a member of ‘std’;
std::function<bool(const cv::KeyPoint& key_pt)> masking =

I hope to receive your solution to this issue.

error when run okvis_app

I make this project successfully, but when I run the executable file, I got the problem below:
./okvis_app_mynteye_s: error while loading shared libraries: libmynteye.so.2: cannot open shared object file: No such file or directory
I checked /usr/local/lib and I'm sure the file libmynteye.so.2 is there. I'm confused and asking for help.

okvis is diverging !!

I ran OKVIS on the EuRoC dataset and it was working properly, but once I tried it with my MYNTEYE camera it diverged. I followed the MYNT-EYE-OKVIS-Sample steps.

Thanks in advance
IMG_20190510_160503

How do I install the MYNT-EYE OKVIS module package with OKVIS? I still don't know the steps and method after reading the README.

Excuse me, I'm using OKVIS for VIO SLAM research, and I have not yet understood the following parts of the README {#mainpage} document.

####### 1 #######

The install package is MYNT-EYE-OKVIS-Sample, so why does the README use "okvis_mynt_eye_sample" and "okvis_app_mynteye_sample"?
What is the meaning of "Before you run okvis_mynt_eye_sample in OKVIS", and the following "Run the okvis_app_mynteye_sample", when the unzipped install package is named "MYNT-EYE-OKVIS-Sample"?

######## 2 ########

"4 Run the okvis_app_mynteye_sample,
$ ./okvis_app_mynteye_sample path/to/okvis/config/config_mynteye.yaml 1"

I haven't found okvis_app_mynteye_sample in the downloaded, unzipped package "MYNT-EYE-OKVIS-Sample".
Also, I haven't found path/to/okvis/config/config_mynteye.yaml; I only found /MYNT-EYE-OKVIS-Sample/config/config_mynteye.yaml. What is wrong?

######## 3 ########

How do I install the MYNT-EYE OKVIS module package with OKVIS? I still don't know the steps and method after reading the README.

Please provide detailed instructions. Thank you.

Stops after the first several frames

Hi all,
I have compiled MYNT-EYE-OKVIS. When I run it, after processing several frames, the MYNT-EYE-ROS-Wrapper stops publishing camera data and outputs "select timeout: /dev/video1".

Has anyone run into this situation before, and do you know how to solve it?

Thanks!

mynteye/glog_init.h: No such file or directory

When building, I came across this problem:
[ 98%] Building CXX object CMakeFiles/okvis_app_mynteye_sdk2.dir/okvis_apps/src/okvis_app_mynteye.cpp.o
/home/lyh/MYNT-EYE-OKVIS-Sample/okvis_apps/src/okvis_app_mynteye.cpp:68:31: fatal error: mynteye/glog_init.h: No such file or directory
compilation terminated.

Running on ARM platform

I've managed to compile the code on ARM and to make it run for a few frames but after that I get a segmentation error.
The change I've made is in okvis_frontend/src/Frontend.cpp:
I've replaced (line 822):

    std::shared_ptr<cv::FeatureDetector>(  //#ifdef __ARM_NEON__
        new cv::GridAdaptedFeatureDetector(
            new cv::FastFeatureDetector(briskDetectionThreshold_),
            briskDetectionMaximumKeypoints_, 7, 4)));  // from config file, except the 7x4...

with:

    std::shared_ptr<cv::FeatureDetector>(  // #ifdef __ARM_NEON__
        new brisk::BriskFeatureDetector(34, 2)));
based on the OKVIS repo test code here

I'm running the program with this:
./okvis_app_mynteye_s ../config/config_mynteye_s.yaml

Before I made the change above, the program wouldn't compile because this repo uses OpenCV 3.0, which has a different definition of the FeatureDetector class than OpenCV 2.4 (see the original OKVIS repo issue here).

brisk::BriskFeatureDetector seems to be compatible with OpenCV 3... at least until I run into the segmentation problem.

I'm ready to put in the work to make this run on ARM. Do you have any suggestions on:

  1. how to force cmake to use OpenCV 2.4 to build the okvis_frontend module while using OpenCV3 for the other modules?
  2. using another FeatureDetector that would be compatible with OpenCV 3 to solve the segmentation problem?
  3. using the regular OKVIS repo and then repiping it so it can consume MYNTEYE frames?

Calibrate stereo camera error

cc@cc-desktop:~/kalibr_ws$ kalibr_bagcreater --folder ~/catkin_ws/src/MYNT-EYE-OKVIS-Sample/build/cameraimu/ --output-bag awsome.bag

Traceback (most recent call last):
File "/home/cc/kalibr_ws/devel/bin/kalibr_bagcreater", line 15, in
exec(compile(fh.read(), python_script, 'exec'), context)
File "/home/cc/kalibr_ws/src/kalibr-master/aslam_offline_calibration/kalibr/python/kalibr_bagcreater", line 121, in
imumsg, timestamp = createImuMessge(row[0], row[1:4], row[4:7])
File "/home/cc/kalibr_ws/src/kalibr-master/aslam_offline_calibration/kalibr/python/kalibr_bagcreater", line 95, in createImuMessge
rosimu.linear_acceleration.y = float(alpha[1])
IndexError: list index out of range
cc@cc-desktop:~/kalibr_ws$ kalibr_bagcreater --folder /catkin_ws/src/MYNT-EYE-OKVIS-Sample/build/cameraimu/ --output-bag awsome.bag
importing libraries
Traceback (most recent call last):
File "/home/cc/kalibr_ws/devel/bin/kalibr_bagcreater", line 15, in
exec(compile(fh.read(), python_script, 'exec'), context)
File "/home/cc/kalibr_ws/src/kalibr-master/aslam_offline_calibration/kalibr/python/kalibr_bagcreater", line 121, in
imumsg, timestamp = createImuMessge(row[0], row[1:4], row[4:7])
File "/home/cc/kalibr_ws/src/kalibr-master/aslam_offline_calibration/kalibr/python/kalibr_bagcreater", line 95, in createImuMessge
rosimu.linear_acceleration.y = float(alpha[1])
IndexError: list index out of range
