
Force-based DS modulation


This ROS package contains the software implementation used for the experimental evaluations of the work: A Dynamical System-Based Approach to Motion and Force Generation for Robotic Manipulators in Contact Tasks

The authors are: Walid Amanhoud, Mahdi Khoramshahi, and Aude Billard.

Refer to http://lasa.epfl.ch/publications/publications.php for more info.

Videos of the experimental evaluations can be watched here: https://youtu.be/y6B5_FAWGN0.

The experiments are performed with KUKA LWR IV+ robots.

Installation

First, make sure that your SSH keys are correctly set up; they are needed to install the dependencies.

Go to the src directory of your catkin workspace and clone this package:

git clone https://github.com/walidAmanhoud/force_based_ds_modulation.git

Stay in the src directory and run the script install_dependencies.sh:

source force_based_ds_modulation/install_dependencies.sh

This script should install the package dependencies and compile everything.

Dependencies

The main dependencies are the following ones:

  • ROS: Robot Operating System (Indigo distribution)
  • CMake: Build system
  • Eigen: A library for linear algebra
  • libsvm: A library for Support Vector Machines (SVM)
  • SVMGrad: A ROS-package to evaluate SVM decision function and its derivatives
  • kuka-lwr-ros: A ROS-package to control the KUKA LWR IV+
  • mocap_optitrack: A ROS-package to work with the NaturalPoint Optitrack motion capture system
  • net-ft-sensor: A ROS-package to work with the ATI 6-axis force torque sensors
  • sg_differentiation: A ROS-package implementing Savitzky-Golay smoothing and differentiation
  • utils: A ROS-package implementing custom functions mainly for robot motion generation

Have a look at the dependencies.rosinstall file in the package directory to get the addresses of the package dependencies.
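For reference, a rosinstall entry typically looks like the sketch below. This particular entry is illustrative only (local-name, uri, and version shown are assumptions based on the kuka-lwr-ros repository referenced later in this README); the actual entries are in the file itself:

```yaml
# Illustrative entry only; see dependencies.rosinstall for the real list
- git:
    local-name: kuka-lwr-ros
    uri: https://github.com/epfl-lasa/kuka-lwr-ros.git
    version: master
```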

File hierarchy

The file system is divided into several subfolders:

  • cfg: contains .cfg files used by dynamic reconfigure
  • config: contains .yaml files used by launch files
  • data_grasping: Contains the log files generated by the ObjectGrasping class
  • data_polishing: Contains the log files generated by the SurfacePolishing class
  • data_surface: Contains the log and model files generated by the SurfaceLearning class
  • data_workspace: Contains the workspace model files generated for the KUKA LWR IV+. Please refer to https://github.com/sinamr66/SCA_data_construction to understand how these files can be generated
  • cmake: CMake FindXXX scripts
  • include: contains class header files
  • launch: contains .launch files
  • src: contains class implementations and source files to instantiate them:
    • SurfaceLearning: A class used to learn the model of a surface
    • SurfacePolishing: A class used to perform a "circular" polishing on a surface with a KUKA LWR IV+ robot
    • Workspace: A class used to evaluate the reachable workspace of the KUKA LWR IV+ robot
    • ObjectGrasping: A class used to reach, grasp and manipulate an object with two KUKA LWR IV+ robots
    • TwoRobotsTransform: A class publishing the TF transform between the two KUKA LWR IV+ robots (used for the object grasping task)

The data_grasping, data_polishing, and data_surface directories are generated automatically during the installation.

Setup the robot

Open a terminal and start a ROS Master

roscore

Open two other terminals and launch the files related to the KUKA robot. Follow the instructions in https://github.com/epfl-lasa/kuka-lwr-ros to properly set up the communication with the robot:

roslaunch lwr_simple_example real.launch
roslaunch lwr_fri lwr_fri_console.launch

You can change the gains of the DS-impedance controller using rqt_reconfigure. Open a new terminal and execute:

rosrun rqt_reconfigure rqt_reconfigure

Once everything is set up correctly, you can start using this package.

Polishing task

Put markers around the base of the robot and on three corners of your surface as described below to define the surface reference frame:

1---------------2
|    SURFACE    |
3---------------

Start the optitrack tracking by executing in a new terminal:

roslaunch force_based_ds_modulation optitrack_surface_polishing.launch

Similarly, you need to set up the ATI 6-axis force/torque sensor to stream the measured data:

roslaunch netft_rdt_driver ft_sensor.launch

You might need to learn the model of your surface if it is non-flat. This can be achieved using the surface_learning node. There are three available modes:

  • Collecting Data: Collects datapoints on the surface. The user should bring the robot into contact with the surface and sweep it while applying a bit of force
    rosrun force_based_ds_modulation surface_learning 'filename' -m 'c'
    • filename : Specify the filename used to generate the model file
    • -m c : Select the collecting data mode
  • Learning Model: It learns the surface model using the collected data and generates the svmgrad model file
    rosrun force_based_ds_modulation surface_learning 'filename' -m 'l' -c 'C' -s 's' -e 'e' -g 'y/n' -u 'y/n'
    • filename : Specify the filename used to generate the model file
    • -m l : Select the learning mode
    • -c C : Specify the C parameter value of SVR
    • -s s : Specify the sigma value of the gaussian kernel used with SVR
    • -e e : Specify the epsilon tube width
    • -g y/n : Generate training data (set to y the first time; afterwards you can set it to n if you just want to try other SVR parameters while keeping the same training data)
    • -u y/n : The training data are built by generating random datapoints above and below the surface and estimating their normal distance to the surface from the collected datapoints. If set to y, this option also adds the datapoints collected on the surface to the training data, with their estimated normal distance set to 0.
  • Testing Model: Tests the model by aligning the robot's end-effector with the estimated surface normal. It is advised to set the DS-impedance controller gains to 0 so that the user can freely move the robot around
    rosrun force_based_ds_modulation surface_learning 'filename' -m 't'
    • filename : Specify the filename used to generate the model file
    • -m t : Select the testing model mode

To use the learned model in the polishing task, copy the generated model file as follows:

cp force_based_ds_modulation/data_surface/'filename'_svmgrad_model.txt  force_based_ds_modulation/data_surface/learned_surface_svmgrad_model.txt

Where filename should match the filename you used to learn the surface model. Note also that the command above assumes that you are in the src directory of your catkin workspace, so adapt it accordingly!

You are now ready to execute the surface_polishing node:

rosrun force_based_ds_modulation surface_polishing 'filename' -s 'p/n' -v 'v' -f 'f'
  • filename : Specify the filename used to log the experimental data
  • -s p/n : Select the surface type (p for planar or n for non-flat)
  • -v v : Specify the target velocity of the nominal DS in m/s
  • -f f : Specify the target force in contact in N

WARNING: the commands above should be executed without the apostrophes!
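For example, a run on a planar surface logging to a hypothetical file polishing_run1, with a target velocity of 0.1 m/s and a target contact force of 10 N (all values are illustrative):

```shell
rosrun force_based_ds_modulation surface_polishing polishing_run1 -s p -v 0.1 -f 10
```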

Reaching, grasping and manipulation task

You first need to set up the two robots. Two computers are required, one for each robot. Start the ROS Master on the first computer and set the ROS_MASTER_URI of the second computer to the IP address of the first one. This lets the two machines communicate through the ROS network.
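As a sketch, the ROS network variables can be set on the second computer as follows. The IP addresses are placeholders; setting ROS_IP as well is common practice so that other machines can reach this one:

```shell
# Run on the SECOND computer. Replace the addresses with your machines' actual IPs.
export ROS_MASTER_URI=http://192.168.0.1:11311  # first computer, where roscore runs
export ROS_IP=192.168.0.2                       # this (second) computer's own address
```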

Then proceed as described previously (section: Setup the robot) to set up the communication for each robot. For the second robot, however, change the branch of the kuka-lwr-ros package to dev_controllers_two_robots and recompile everything before launching the files. This gives the second robot a different namespace while keeping the same variable and topic names.

Similarly to the polishing task, you need to put markers around the base of each robot and on the object to grasp. Four markers are required for the object. Assuming that the object is a box, the markers can be put on the top face:

2------------3
|   OBJECT   |
1------------4

Start the optitrack tracking by executing in a new terminal:

roslaunch force_based_ds_modulation optitrack_object_grasping.launch

Set up the ATI 6-axis force/torque sensors attached to the robots by running the command below:

roslaunch netft_rdt_driver ft_2_sensors.launch

When the optitrack tracking is running, you can use the two_robots_transform node to publish the transformation between the robots' bases. This is used for visualizing the two robots in RViz:

rosrun force_based_ds_modulation two_robots_transform -m 's/r'
  • -m s/r : Select the mode (s for simulation or r for real experiments)

Once everything is ready, you can perform the reaching, grasping and manipulation task by executing:

rosrun force_based_ds_modulation object_grasping 'filename' -m 'rg/rgm' -f 'f'
  • filename : Specify the filename used to log the experimental data
  • -m rg/rgm : Select the mode (rg for reaching and grasping, or rgm for reaching, grasping and moving the object to a predefined position, which can be offset afterwards using rqt_reconfigure)
  • -f f : Specify the target force that both robots should apply in contact with the object's surface in N
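For example, a reach-and-grasp run logging to a hypothetical file grasping_run1 with a target force of 10 N (all values are illustrative):

```shell
rosrun force_based_ds_modulation object_grasping grasping_run1 -m rg -f 10
```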
