Learning for Task and Motion Planning (LTAMP)
Robotic multi-step manipulation planning using both learned and engineered models of primitive actions.
Zi Wang*, Caelan Reed Garrett*, Leslie Pack Kaelbling, Tomás Lozano-Pérez. Learning compositional models of robot skills for task and motion planning, The International Journal of Robotics Research (IJRR), 2020.
Zi Wang, Caelan R. Garrett, Leslie P. Kaelbling, Tomás Lozano-Pérez. Active model learning and diverse action sampling for task and motion planning, International Conference on Intelligent Robots and Systems (IROS), 2018.
$ git clone --recursive [email protected]:caelan/LTAMP.git
$ cd LTAMP
LTAMP$ ./setup.bash
setup.py: compiles an IKFast analytical inverse-kinematics (IK) solver for both of the PR2's arms.
LTAMP$ cd control_tools/ik/
LTAMP/control_tools/ik$ python setup.py build
See the README in control_tools/ik/ for details on using the existing IK solvers and generating new ones.
run_simulation.py: tests the planning module in simulation
LTAMP$ python -m plan_tools.run_simulation [-h] [-p PROBLEM] [-e] [-c] [-v]
collect_simulation.py: collects manipulation-primitive data in simulation
LTAMP$ python -m learn_tools.collect_simulation [-h] [-p PROBLEM] [-e] [-c] [-v]
TBD
The planning module generates plans using the learned primitives.
Relevant planning submodules:
- pddlstream - Task and Motion Planning (TAMP)
- ss-pybullet - PyBullet Robot Planning
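To illustrate the idea of planning with learned primitives, here is a minimal sketch of how a learned model can be exposed to a TAMP planner as a sampling stream: a generator that yields action parameterizations the model predicts will succeed. The names (score_pour, pour_stream) and the toy scoring function are illustrative assumptions, not the repo's actual API; in pddlstream such a generator would be registered as a stream.

```python
import random

def score_pour(tilt, offset):
    """Stand-in for a learned model scoring a pouring parameterization.

    Hypothetical: pretends the model prefers moderate tilts with the
    cup roughly centered over the target.
    """
    return 1.0 - abs(tilt - 2.0) - abs(offset)

def pour_stream(threshold=0.5, seed=0, max_attempts=100):
    """Yield pour parameterizations the learned model scores above threshold."""
    rng = random.Random(seed)
    for _ in range(max_attempts):
        tilt = rng.uniform(0.0, 3.0)     # tilt angle (radians)
        offset = rng.uniform(-0.5, 0.5)  # horizontal offset from the cup (meters)
        if score_pour(tilt, offset) >= threshold:
            yield (tilt, offset)

candidates = list(pour_stream())
print(len(candidates), candidates[:1])
```

The planner treats the generator as a black box: it draws candidates lazily and binds them to action parameters during the search, so a better-trained model directly yields fewer failed bindings.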
The learning module conducts manipulation-primitive data collection experiments and learns models from the collected data.
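One ingredient of data collection highlighted in the IROS 2018 paper is diverse action sampling. A minimal sketch of the underlying idea, using greedy farthest-point selection over a 1-D parameter pool (the function name and setup are illustrative, not the repo's code):

```python
import random

def farthest_point_sample(candidates, k):
    """Greedily select k candidates, each maximizing its distance
    to the already-chosen set, so samples spread over the space."""
    chosen = [candidates[0]]
    while len(chosen) < k:
        best = max(
            candidates,
            key=lambda c: min(abs(c - s) for s in chosen),
        )
        chosen.append(best)
    return chosen

rng = random.Random(1)
pool = [rng.uniform(0.0, 1.0) for _ in range(50)]
print(farthest_point_sample(pool, 5))
```

Spreading the collected parameterizations this way avoids wasting robot-hours on near-duplicate trials and gives the learned models better coverage of the parameter space.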
The control module provides a common interface for executing motion both in simulation and on the real robot.
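The benefit of such an interface is that the planner can be written against one abstraction while the backend varies. A hypothetical sketch of the pattern, with illustrative class and method names that are not the repo's actual API:

```python
from abc import ABC, abstractmethod

class Controller(ABC):
    """Shared interface: planners emit trajectories against this API."""

    @abstractmethod
    def follow_trajectory(self, joint_path):
        """Execute a sequence of joint configurations."""

class SimulatedController(Controller):
    def __init__(self):
        self.executed = []

    def follow_trajectory(self, joint_path):
        # In simulation, record the path (or step a physics engine through it).
        self.executed.extend(joint_path)

class RealController(Controller):
    def follow_trajectory(self, joint_path):
        # On hardware, this would stream commands to the robot's drivers.
        raise NotImplementedError("requires robot drivers")

sim = SimulatedController()
sim.follow_trajectory([(0.0, 0.0), (0.1, 0.2)])
print(sim.executed)
```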