
A Mobile Robot Hand-Arm Teleoperation System by Vision and IMU

Author's email: [email protected]

This package presents a multimodal mobile teleoperation system that consists of a novel vision-based hand pose regression network (Transteleop) and an IMU-based arm tracking method.

If you use our released code, please cite our paper, A Mobile Robot Hand-Arm Teleoperation System by Vision and IMU.

Prerequisites

OS

  • ROS Kinetic and Ubuntu 16.04
  • CUDA 10
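
A quick sanity check of the expected environment; the commands below are standard Ubuntu/ROS/CUDA tools, not part of this repo:

    lsb_release -a     # expect Ubuntu 16.04
    rosversion -d      # expect kinetic
    nvcc --version     # expect CUDA 10.x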

Setup

  1. Install Anaconda and PyTorch (a hedged install sketch follows these steps).
  2. Install Python packages in a new environment:
    conda upgrade --all
    conda create -n teleop python=[PYTHON_VERSION] numpy ipython matplotlib mayavi yaml lxml seaborn pyyaml 
    conda activate teleop
    pip install rospkg numpy-stl tensorboardx pyquaternion pyassimp==4.1.3 visdom dominate
    Note: PYTHON_VERSION can be 3.7 if you do not need to use this package with ROS; otherwise use 2.7 (or the system Python at /usr/bin/python), since ROS Kinetic only supports Python 2.
  3. Clone this repo:
    git clone https://github.com/Smilels/multimodal-translation-teleop.git
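
For step 1, a minimal sketch of a conda-based PyTorch install matching CUDA 10; the exact version pin is an assumption, so use whichever CUDA 10 build fits your driver:

    # Assumed: a CUDA 10.0 build of PyTorch from the official pytorch channel.
    conda install pytorch torchvision cudatoolkit=10.0 -c pytorch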

Dataset Generation

Model Training/Testing

  • If you want to train the network yourself instead of using a pre-trained model, follow the steps below.

  • Prepare your dataset and revise the corresponding options in scripts/ae_joint_train.sh (see the consolidated sketch after this list).

  • To run an experiment for 200 epochs:

    cd Transteleop
    bash scripts/ae_joint_train.sh
    

    If you want to train the pix2pix baseline, use scripts/train_pix2pix.sh.

  • To view training results and loss plots, run

    python -m visdom.server

    and click the URL http://localhost:8097.

  • To view the accuracy results, launch TensorBoard for monitoring:

    tensorboard --logdir ./assets/log --port 8080
  • To test the model, run bash ./scripts/test_aejoint.sh.

  • The test results will be saved to an HTML file at ./results/YOUR_MODEL_NAME/test_latest/index.html.
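
Putting the steps above together, a minimal end-to-end sketch; it assumes you have already edited the options inside scripts/ae_joint_train.sh, and running the monitors in the background is a convenience choice, not a requirement:

    cd Transteleop
    python -m visdom.server &                         # loss plots at http://localhost:8097
    tensorboard --logdir ./assets/log --port 8080 &   # accuracy curves
    bash scripts/ae_joint_train.sh                    # train; options live inside this script
    bash ./scripts/test_aejoint.sh                    # test; writes ./results/.../index.html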

RealSense SR300 Demo on a Simulated Hand

  • Launch the RealSense SR300 camera (via realsense-ros and librealsense). If you use another camera suitable for close-range tracking, use its corresponding launch file. A hedged launch sketch follows this list.

  • If you do not have a real camera, you can download the recorded example rosbag and play the bag file:

    rosbag play [-l] example.bag
  • Run the Shadow hand in simulation.

  • Run the demo code.

    • Set the correct topic name based on your camera.
    • Keep your right hand within the viewpoint range of [30°, 120°] and the distance range of [15 cm, 40 cm] from the camera.
    bash scripts/demo_moveit.sh
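
A hedged bring-up sketch for this demo; the realsense2_camera and sr_robot_launch package and launch-file names come from the upstream RealSense and Shadow Robot ROS stacks and are assumptions here, so adjust them to your installation:

    # Assumed upstream launch files; adapt to your camera/hand setup.
    roslaunch realsense2_camera rs_camera.launch        # SR300 driver (realsense-ros)
    roslaunch sr_robot_launch srhand.launch sim:=true   # Shadow hand in simulation
    bash scripts/demo_moveit.sh                         # start the teleoperation demo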

Citation

If you use this code for your research, please cite our papers.

@article{li2020Mobilerobot,
  title={A Mobile Robot Hand-Arm Teleoperation System by Vision and IMU},
  author={Li, Shuang and Jiang, Jiaxi and Ruppel, Philipp and Liang, Hongzhuo and Ma, Xiaojian and Hendrich, Norman and Sun, Fuchun and Zhang, Jianwei},
  journal={arXiv preprint arXiv:2003.05212},
  year={2020}
}
@inproceedings{li2018vision,
  title={Vision-based Teleoperation of Shadow Dexterous Hand using End-to-End Deep Neural Network},
  author={Li, Shuang and Ma, Xiaojian and Liang, Hongzhuo and G{\"o}rner, Michael and Ruppel, Philipp and Fang, Bing and Sun, Fuchun and Zhang, Jianwei},
  booktitle={IEEE International Conference on Robotics and Automation (ICRA)},
  year={2019}
}

Related Projects

Our code is inspired by pytorch-CycleGAN-and-pix2pix.
