GraspSAM

Sangjun Noh, Jongwon Kim, Dongwoo Nam, Seunghyeok Back, Raeyoung Kang, Kyoobin Lee

This repository contains the source code for the paper "GraspSAM: When Segment Anything Model Meets Grasp Detection" (ICRA 2025). [ArXiv] [Project Website]

Getting Started

Environment Setup

Tested on a Titan RTX with Python 3.8, PyTorch 2.0.1, torchvision 0.15.2, and CUDA 11.7

  1. Download the source code
git clone https://github.com/gist-ailab/GraspSAM.git
cd GraspSAM
  2. Set up a Python environment
conda create -n GraspSAM python=3.8
conda activate GraspSAM
pip install -r requirements.txt
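
After installing the requirements, a quick sanity check can confirm the installed stack matches the tested versions above. This is a minimal sketch, not part of the repository:

# sanity_check.py -- verify the PyTorch / CUDA stack
import torch
import torchvision

print("torch:", torch.__version__)              # tested with 2.0.1
print("torchvision:", torchvision.__version__)  # tested with 0.15.2
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))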

Download grasp detection benchmarks

  1. Download the Jacquard dataset at [Jacquard]
  2. Download the Grasp-Anything dataset at [Grasp-Anything]
  3. Extract the downloaded datasets and organize the folders as follows (a quick layout check is sketched after the tree)
GraspSAM
└── datasets
       ├── Jacquard_Dataset
       │     └──Jacquard_Dataset_0
       │     └──...
       │     └──Jacquard_Dataset_11
       └── Grasp-Anything
             └──grasp_label_positive
             └──grasp_label_negative
             └──image
             └──mask
             └──scene_description
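
A minimal sketch (not part of the repository) to confirm the layout, assuming the datasets folder sits at the repository root as shown above:

# check_datasets.py -- verify the expected dataset folders exist
import os

expected = [
    "datasets/Jacquard_Dataset/Jacquard_Dataset_0",
    "datasets/Grasp-Anything/grasp_label_positive",
    "datasets/Grasp-Anything/grasp_label_negative",
    "datasets/Grasp-Anything/image",
    "datasets/Grasp-Anything/mask",
    "datasets/Grasp-Anything/scene_description",
]
for path in expected:
    status = "ok" if os.path.isdir(path) else "MISSING"
    print(f"{status:8s} {path}")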
       

Download pretrained checkpoints for SAM families

  1. Download the Efficient SAM checkpoint at [Efficient SAM]
  2. Download the Mobile SAM checkpoint at [Mobile SAM]
  3. Create a pretrained_checkpoint folder and move the downloaded checkpoints into it, organized as follows (a small verification sketch follows the tree)
GraspSAM
└── pretrained_checkpoint
       ├── mobile_sam.pt
       ├── efficient_sam
             └──efficient_sam_vitt.pt
             └──... 
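
A minimal sketch (not part of the repository) to confirm the checkpoints are in place and readable; the file names follow the tree above:

# check_checkpoints.py -- confirm the SAM-family checkpoints load on CPU
import os
import torch

checkpoints = [
    "pretrained_checkpoint/mobile_sam.pt",
    "pretrained_checkpoint/efficient_sam/efficient_sam_vitt.pt",
]
for ckpt in checkpoints:
    if not os.path.isfile(ckpt):
        print(f"MISSING  {ckpt}")
        continue
    state = torch.load(ckpt, map_location="cpu")  # load on CPU only to validate the file
    n = len(state) if isinstance(state, dict) else "n/a"
    print(f"ok       {ckpt} ({n} top-level entries)")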

Train & Evaluation

Train on Jacquard

python train.py --root {JACQUARD_ROOT} --save --sam-encoder-type {BACKBONE_TYPE}

Train on Grasp-Anything

python train.py --root {GRASP_ANYTHING_ROOT} --save --sam-encoder-type {BACKBONE_TYPE}
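
For example, assuming the dataset layout above and the EfficientSAM ViT-T backbone (the backbone name here is illustrative; the accepted values for --sam-encoder-type are defined in train.py):

python train.py --root datasets/Grasp-Anything --save --sam-encoder-type efficient_sam_vitt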

Evaluation on Jacquard

python eval.py --root {JACQUARD_ROOT} --ckp_path {CKP_PATH}

Evaluation on Grasp-Anything

python eval.py --root {GRASP_ANYTHING_ROOT} --ckp_path {CKP_PATH}
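
For example, with an illustrative checkpoint path (replace it with the checkpoint produced by your own training run):

python eval.py --root datasets/Jacquard_Dataset --ckp_path checkpoints/graspsam_jacquard.pt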

License

The source code of this repository is released only for academic use. See the license file for details.

Notes

The code in this repository is built upon the following open-source projects. Thanks to the authors for sharing their code!

Citation

If you use our work in a research project, please cite:

@article{noh2024graspsam,
  title={GraspSAM: When Segment Anything Model Meets Grasp Detection},
  author={Noh, Sangjun and Kim, Jongwon and Nam, Dongwoo and Back, Seunghyeok and Kang, Raeyoung and Lee, Kyoobin},
  journal={arXiv preprint arXiv:2409.12521},
  year={2024}
}
