This is the official implementation of the ICRA 2023 paper "EXOT: Exit-aware Object Tracker for Safe Robotic Manipulation of Moving Object".
Use Anaconda to set up the environment:

```
conda create -n exot python=3.8
conda activate exot
bash install_exot.sh
```
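After installation, you can quickly verify the environment with a short check (a minimal sketch, assuming the install script sets up PyTorch with CUDA support):

```python
import torch

# Sanity-check the freshly created conda environment
print("torch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
print("visible GPUs:", torch.cuda.device_count())
```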
Put the tracking datasets in ./data. It should look like:
```
${EXOT_ROOT}
 -- data
     -- TREK-150
         |-- P03
         |-- P05
         |-- P06
         ...
     -- robot-data
         -- data_RGB
             |-- auto
             |-- human
```
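Before training, you can sanity-check the layout with a short script (a minimal sketch; the expected subfolders are taken from the tree above):

```python
from pathlib import Path

# Dataset roots expected under ${EXOT_ROOT}, per the tree above
expected = [
    Path("data/TREK-150"),
    Path("data/robot-data/data_RGB/auto"),
    Path("data/robot-data/data_RGB/human"),
]

for path in expected:
    print(f"{path}: {'ok' if path.is_dir() else 'MISSING'}")
```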
You can download the TREK-150 dataset from https://github.com/matteo-dunnhofer/TREK-150-toolkit.
The trained models and the RMOT-223 dataset, collected with a UR5e robot, are provided at https://huggingface.co/hsroro/EXOT and https://huggingface.co/datasets/hsroro/RMOT-223.
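Both can be fetched with the huggingface_hub client; a hedged sketch (the local target directories below are assumptions, and the downloaded data may still need to be arranged into the robot-data layout above):

```python
from huggingface_hub import snapshot_download

# Trained EXOT checkpoints (model repo)
snapshot_download(repo_id="hsroro/EXOT", local_dir="./checkpoints")

# RMOT-223 dataset (dataset repo)
snapshot_download(repo_id="hsroro/RMOT-223", repo_type="dataset",
                  local_dir="./data/RMOT-223")
```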
Run the following command to set the paths for this project:

```
python tracking/create_default_local_file.py --workspace_dir . --data_dir ./data --save_dir .
```

After running this command, you can also modify the paths by editing these two files:

```
lib/train/admin/local.py       # paths for training
lib/test/evaluation/local.py   # paths for testing
```
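For reference, the generated training-side file typically defines an environment-settings class along these lines (an illustrative sketch based on the STARK codebase this project builds on; the attribute names here are assumptions, so check the generated file for the real ones):

```python
# lib/train/admin/local.py -- illustrative only; regenerate it with
# tracking/create_default_local_file.py rather than writing it by hand.
class EnvironmentSettings:
    def __init__(self):
        self.workspace_dir = '.'                # where checkpoints are saved
        self.tensorboard_dir = './tensorboard'  # tensorboard logs (assumed attribute)
        self.trek150_dir = './data/TREK-150'    # hypothetical dataset-root attribute
```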
Training with multiple GPUs using DDP:

```
python tracking/train.py --script exot_st1 --config baseline_robot --save_dir . --mode multiple --nproc_per_node 8  # EXOT Stage 1
python tracking/train.py --script exot_st2 --config baseline_robot --save_dir . --mode multiple --nproc_per_node 8 --script_prv exot_st1 --config_prv baseline_robot  # EXOT Stage 2
```
(Optional) Debug training on a single GPU:

```
python tracking/train.py --script exot_st1 --config baseline_robot --save_dir . --mode single  # EXOT Stage 1
python tracking/train.py --script exot_st2 --config baseline_robot --save_dir . --mode single --script_prv exot_st1 --config_prv baseline_robot  # EXOT Stage 2
```
Refer to benchmarking/train_pl.sh or benchmarking/train.sh for detailed commands.
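If fewer than eight GPUs are available, set `--nproc_per_node` to the visible device count; a minimal launcher sketch (the arguments mirror the Stage 1 command above):

```python
import subprocess
import torch

# Match the number of DDP workers to the GPUs actually visible
n_gpus = torch.cuda.device_count()

subprocess.run([
    "python", "tracking/train.py",
    "--script", "exot_st1", "--config", "baseline_robot",
    "--save_dir", ".", "--mode", "multiple",
    "--nproc_per_node", str(n_gpus),
], check=True)
```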
Test and evaluate the tracker on the robot test set:

```
python tracking/test.py exotst_tracker baseline_mix_lowdim --dataset robot_test
python tracking/analysis_results.py --tracker_param baseline_mix_lowdim --dataset robot_test --name exotst_tracker
```

For more config options and further details, see benchmarking/test.sh.
- Evaluate using the UR5e robot:

```
python tracking/video_demo.py exotst_tracker baseline_mix_lowdim --track_format run_video_robot --modelname exot_merge
```

For more config options and further details, see tracking/run_video_demo.sh.
- For automatic creation of a pick-and-place dataset using a hand camera mounted on the UR5e, see the data_preparation folder.
- For human collection of a pick-and-place dataset, a 3D mouse and the control program created by Radalytica are required.
- We use the implementation of STARK from the official repo: https://github.com/researchmm/Stark.