superquadric-grasp-demo

This demo performs object modeling and grasping based on superquadrics [1] with the iCub robot.

Overview

Demo description

This module implements the wrapper code for the superquadric modeling and grasping pipeline described in [1]. The wrapper communicates with existing modules developed in robotology, and its structure can be summarized as follows:

  1. The robot checks whether a box, sphere, or cylinder is in its field of view by querying the objectPropertyCollector, and acquires the 2D bounding box of the object. If multiple objects are in front of the robot, one of them is randomly chosen for the demo. Note: the objects are classified as box, sphere, or cylinder because these primitive shapes are used to improve the object modeling.
  2. The lbpExtract module then provides multiple 2D blobs of the selected object. The demo code sends these blobs to the Structure From Motion module to obtain the corresponding 3D point clouds.
  3. The 3D point clouds are sent to the superquadric-model module, which computes several superquadrics modeling the object.
  4. A median filter combines all the reconstructed superquadrics into a single model, which is more stable and robust (a minimal sketch of this step follows the list).
  5. The demo code sends the estimated superquadric to the superquadric-grasp module, which computes suitable grasping poses for the right and the left hand.
  6. The best hand for grasping the object is selected according to suitable criteria.
  7. Finally, the superquadric-grasp module is asked to execute the grasp.
  8. Once the object is grasped, it is lifted.
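
As an illustration of step 4, here is a minimal sketch of a component-wise median filter over superquadric estimates. It is not the repository's actual code: it assumes each superquadric is encoded as a fixed-length parameter vector (e.g., the 11 values used in [1]: three semi-axes, two shape exponents, three position and three orientation values).

// Minimal sketch (illustrative, not the repository's code): combine
// several estimated superquadrics into one model via a per-parameter
// median, which is less sensitive to outlier estimates than a mean.
#include <algorithm>
#include <vector>

std::vector<double> medianSuperquadric(const std::vector<std::vector<double>> &estimates)
{
    const size_t nParams = estimates.front().size();
    std::vector<double> model(nParams);
    for (size_t j = 0; j < nParams; ++j)
    {
        // collect the j-th parameter of every estimate
        std::vector<double> column;
        column.reserve(estimates.size());
        for (const auto &e : estimates)
            column.push_back(e[j]);
        // nth_element places the median value in the middle position
        std::nth_element(column.begin(), column.begin() + column.size() / 2, column.end());
        model[j] = column[column.size() / 2];
    }
    return model;
}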

A video of the running pipeline is available in the repository.

More information is available in these slides.

Use case

How to compile

On Linux systems, the code can be compiled as follows:

git clone https://github.com/robotology/superquadric-grasp-demo.git
cd superquadric-grasp-demo
mkdir build; cd build
ccmake ..
make install

Note that ccmake opens an interactive configuration interface; plain cmake .. can be used instead.


Dependencies

The superquadric-grasp-demo is wrapper code that communicates with and coordinates several modules for the execution of the demo, including superquadric-model, superquadric-grasp, lbpExtract, and the Structure From Motion module.


How to run the code

This demo has been designed to be executed automatically on the iCub robot. If you are interested in an interactive mode for launching the grasping algorithm, please visit the superquadric-grasp-example repository.

In order to automatically run the superquadric-grasp-demo, please:

  1. Launch the yarprobotinterface.
  2. Launch the cameras.
  3. Launch the basic modules: iKinGazeCtrl and iKinCartesianSolver, for both the right and the left arm. For a safe approach during grasping, we recommend also launching wholeBodyDynamics and gravityCompensator: the demo uses the estimate of the forces measured at the end-effector to stop the movement in case of collision (see the sketch after this list).
  4. Launch the skinManager and skinManagerGui and connect them. Set the binarization filter off, and set the compensation gain and the contact compensation gain to their minimum values. If you do not want to use the skin, we also provide a FingersPositionControl.
  5. Launch and connect all the modules required by the demo, which are collected in the iCub_Grasp_Demo xml.
  6. The rfsmGui will open. Press run on the GUI to start the state machine executing the demo. More information about the Superquadric_Grasp_Demo state machine is provided here.
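
As a sketch of the collision check mentioned in step 3, the snippet below reads the end-effector wrench estimated by wholeBodyDynamics and stops when the force norm exceeds a threshold. The port name and the threshold value are hypothetical; this is not the demo's actual code.

// Minimal sketch (hypothetical port name and threshold): stop the
// movement if the force estimated at the end-effector exceeds a
// safety threshold, as the demo does via wholeBodyDynamics.
#include <cmath>
#include <yarp/os/Bottle.h>
#include <yarp/os/BufferedPort.h>
#include <yarp/os/Network.h>

int main()
{
    yarp::os::Network yarp;
    yarp::os::BufferedPort<yarp::os::Bottle> wrenchPort;
    wrenchPort.open("/demo/wrench:i");
    // Port name illustrative; check your setup for the actual name.
    yarp::os::Network::connect("/wholeBodyDynamics/right_arm/endEffectorWrench:o",
                               "/demo/wrench:i");

    const double threshold = 5.0;   // [N], illustrative value
    while (true)
    {
        yarp::os::Bottle *wrench = wrenchPort.read();   // fx fy fz tx ty tz
        if (wrench == nullptr)
            break;
        const double fx = wrench->get(0).asFloat64();
        const double fy = wrench->get(1).asFloat64();
        const double fz = wrench->get(2).asFloat64();
        if (std::sqrt(fx * fx + fy * fy + fz * fz) > threshold)
        {
            // here the demo would stop the Cartesian movement
            break;
        }
    }
    return 0;
}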


Setting up the demo

Before running the demo, it is recommended to correctly set up the modules involved.


How to customize the demo

The superquadric-grasp-demo can be customized by changing the configuration parameters of the superquadric-model and superquadric-grasp modules in the corresponding configuration files (config-classes.ini and config.ini, respectively).

Some useful options for the superquadric-grasp module are the following (a parsing sketch follows the list):

  • lift_object: available values: on (default) / off. If on, the robot tests whether the computed pose is good enough by lifting the object.
  • grasp: available values: on (default) / off. If on, the robot performs the grasp using tactile feedback. If off, the robot just reaches the desired pose without closing the hand.
  • visual_servoing: available values: on / off (default). If on, the robot reaches the pose using a markerless visual-servoing algorithm and an accurate hand pose estimation (more information is available here). Currently, visual servoing is available only for the right hand. The superquadric-grasp repository provides more information on how the visual-servoing approach is used for finely reaching the final pose.
  • which_hand: available values: right, left, or both (default). This variable selects the hand for which the grasping pose is computed. If both is selected, a pose is computed for each hand and the best hand for grasping the object is selected automatically. If only one hand is chosen for the pose computation, it is consequently also used for grasping the object.
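
The sketch below shows how options like these can be read from a YARP .ini file. The key names and defaults are taken from the list above; the parsing code itself is illustrative, not the module's actual implementation.

// Minimal sketch: read the superquadric-grasp options listed above
// from a YARP configuration file. Defaults match the README.
#include <cstdio>
#include <string>
#include <yarp/os/Property.h>
#include <yarp/os/Value.h>

int main()
{
    yarp::os::Property config;
    config.fromConfigFile("config.ini");   // e.g. a line like: which_hand both

    const std::string liftObject =
        config.check("lift_object", yarp::os::Value("on")).asString();
    const std::string grasp =
        config.check("grasp", yarp::os::Value("on")).asString();
    const std::string visualServoing =
        config.check("visual_servoing", yarp::os::Value("off")).asString();
    const std::string whichHand =
        config.check("which_hand", yarp::os::Value("both")).asString();

    std::printf("lift_object=%s grasp=%s visual_servoing=%s which_hand=%s\n",
                liftObject.c_str(), grasp.c_str(),
                visualServoing.c_str(), whichHand.c_str());
    return 0;
}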

Note: when visual_servoing is on, the pipeline is slightly different:

Once the grasping pose is computed, the robot reaches an intermediate pose in open loop (S4). Then, a visual particle filter estimates the current robot hand pose (S5), and this information is used by a visual-servoing controller to reach the desired pose and grasp the object (S6).

Documentation

The online documentation of this module is available here.


References

[1] G. Vezzani, U. Pattacini, and L. Natale, "A grasping approach based on superquadric models," IEEE-RAS International Conference on Robotics and Automation (ICRA), 2017, pp. 1579-1586.
