tracking_ros

ROS1 package for detecting and tracking objects using SAM, Cutie, GroundingDINO and DEVA, inspired by detic_ros.

Usage

Tested with a 480x640 image stream at 30 Hz on an RTX 3090 Ti.

Interactive prompt for generating a mask and tracking objects using SAM and Cutie.

Demo video: sam_gui.mp4

sam_node publishes a segmentation prompt, which cutie_node uses to track objects. It runs in near real time (~30 Hz).
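If you want to consume the tracker output from your own node, a plain image subscriber is enough. The snippet below is a minimal sketch; the topic name /cutie_node/output/segmentation is an assumption for illustration, so check the remappings in the launch files for the actual name.

import rospy
from sensor_msgs.msg import Image

def on_mask(msg):
    # The tracker publishes a per-pixel mask image; here we just log its shape.
    rospy.loginfo("mask %dx%d (%s)", msg.width, msg.height, msg.encoding)

rospy.init_node("mask_consumer")
# Hypothetical topic name; remap it to the actual cutie_node output in your setup.
rospy.Subscriber("/cutie_node/output/segmentation", Image, on_mask, queue_size=1)
rospy.spin()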

Detecting and tracking objects using SAM, GroundingDINO and DEVA.

Demo video: deva_example.mp4

deva_node queries GroundingDINO and SAM for objects at regular intervals, so it can pick up new objects after tracking has started. It runs at ~15 Hz, and you can adjust cfg['detection_every'] to balance detection frequency against performance. See node_scripts/model_config.py.
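As a rough sketch of what detection_every controls (only the detection_every key is taken from this README; the surrounding structure is illustrative, not the actual model_config.py):

# Illustrative only: deva_node re-runs GroundingDINO + SAM every N frames
# and lets DEVA propagate the existing masks in between.
cfg = {"detection_every": 5}  # larger values: higher throughput, slower pickup of new objects

def should_detect(frame_index, cfg):
    # True on frames where a fresh detection should be queried.
    return frame_index % cfg["detection_every"] == 0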

Setup

Prerequisites

This package is built upon

  • ROS1 (Noetic)
  • catkin virtualenv (python>=3.9 used for DEVA)
  • (Optional) docker and nvidia-container-toolkit (to keep your host environment clean)

Build package

in your workspace

If you want to build this package directly in your workspace, be aware of the Python environment dependencies (Python 3.9 and PyTorch are needed to build the package).
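Before building, you can sanity-check the Python environment the build will use with a quick snippet like this (a minimal sketch; it assumes PyTorch is already installed for that interpreter):

import sys
import torch

# DEVA needs Python >= 3.9 and a working PyTorch install.
assert sys.version_info >= (3, 9), "Python >= 3.9 is required"
print("python:", sys.version.split()[0])
print("torch:", torch.__version__, "cuda available:", torch.cuda.is_available())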

mkdir -p ~/ros/catkin_ws/src && cd ~/ros/catkin_ws/src
git clone https://github.com/ojh6404/tracking_ros.git
wstool init
wstool merge -t . tracking_ros/rosinstall.noetic
wstool update -t . # jsk-ros-pkg/jsk_visualization for GUI
cd tracking_ros && ./prepare.sh
cd ~/ros/catkin_ws && catkin b

using docker (Recommended)

Otherwise, you can build this package in a docker environment.

git clone https://github.com/ojh6404/tracking_ros.git
cd tracking_ros
docker build -t tracking_ros .

How to use

Please refer to sample_track.launch and deva.launch.

Tracking using SAM and Cutie with an interactive GUI prompt.

1. run directly

roslaunch tracking_ros sample_track.launch \
    input_image:=/kinect_head/rgb/image_rect_color \
    mode:=prompt \
    model_type:=vit_t \
    device:=cuda:0

2. using docker

You need to launch the tracker and the GUI separately because the docker container has no GUI. Launch the tracker with

./run_docker -host pr1040 -mount ./launch -name track.launch \
    input_image:=/kinect_head/rgb/image_rect_color \
    mode:=prompt \
    model_type:=vit_t \
    device:=cuda:0

where

  • -host : hostname like pr1040 or localhost
  • -mount : directory of launch files to mount so they can be launched inside docker
  • -name : launch file name to run

and launch the rqt GUI on your GUI machine with

roslaunch tracking_ros sam_gui.launch

Detecting and tracking objects.

roslaunch tracking_ros deva.launch \
    input_image:=/kinect_head/rgb/image_rect_color \
    classes:="monitor; keyboard; cup" \
    model_type:=vit_t \
    device:=cuda:0

or

./run_docker -host pr1040 -mount ./launch -name deva.launch \
    input_image:=/kinect_head/rgb/image_rect_color \
    classes:="monitor; keyboard; cup" \
    model_type:=vit_t \
    device:=cuda:0

TODO
