# AgiBot World Colosseo

The Large-scale Manipulation Platform for Scalable and Intelligent Embodied Systems

AgiBot World Colosseo is a full-stack, large-scale robot learning platform curated to advance bimanual manipulation in scalable and intelligent embodied systems. It is accompanied by foundation models, benchmarks, and an ecosystem that democratizes access to high-quality robot data for academia and industry, paving the way toward the "ImageNet Moment" for Embodied AI.

We have released:

  • Task Catalog: a reference sheet outlining the tasks in our dataset, including robot end-effector types, sample action-text descriptions, and more.
  • AgiBot World Beta: our complete dataset, featuring 1,003,672 trajectories (~43.8 TB).
  • AgiBot World Alpha: a curated subset of AgiBot World Beta, containing 92,214 trajectories (~8.5 TB).
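For a sense of scale, a rough per-trajectory storage estimate follows from the numbers above. This sketch assumes the quoted "T" means TiB (an assumption, not stated in the release notes), so treat the results as ballpark figures:

```python
# Rough per-trajectory footprint, computed from the release sizes quoted
# above. Assumes the quoted "T" means TiB; treat the results as estimates.
TB = 1024**4  # bytes per TiB

datasets = {
    "AgiBot World Beta": (1_003_672, 43.8 * TB),
    "AgiBot World Alpha": (92_214, 8.5 * TB),
}

for name, (trajectories, total_bytes) in datasets.items():
    mib = total_bytes / trajectories / 1024**2
    print(f"{name}: ~{mib:.0f} MiB per trajectory")
# AgiBot World Beta: ~46 MiB per trajectory
# AgiBot World Alpha: ~97 MiB per trajectory
```

The larger per-trajectory footprint of Alpha suggests it is not a uniform random subsample of Beta; the exact curation criteria are described in the technical report.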

## News 📰

  • [2025/03/10] 📄 Research Blog and Technical Report released.
  • [2025/03/01] AgiBot World Beta released.
  • [2025/01/03] AgiBot World Alpha Sample Dataset released.
  • [2024/12/30] 🤖 AgiBot World Alpha released.

## TODO List 📅

  • AgiBot World Alpha
  • AgiBot World Beta (expected Q1 2025)
    • ~1,000,000 trajectories of high-quality robot data
  • AgiBot World Foundation Model: GO-1 (expected Q2 2025)
    • Training & inference code
    • Pretrained model checkpoint
  • AgiBot World Colosseo (expected 2025)
    • A comprehensive platform with toolkits including teleoperation, training and inference.
  • 2025 AgiBot World Challenge (expected 2025)

## Key Features 🔑

  • 1 million+ trajectories from 100 robots.
  • 100+ 1:1 replicated real-life scenarios across 5 target domains.
  • Cutting-edge hardware: visual tactile sensors / 6-DoF dexterous hands / mobile dual-arm robots.
  • Wide-spectrum, versatile, and challenging tasks:
    • Contact-rich manipulation
    • Long-horizon planning
    • Multi-robot collaboration

## Table of Contents

  1. Key Features
  2. At a Quick Glance
  3. Getting Started
  4. TODO List
  5. License and Citation

## At a Quick Glance ⬇️

Follow the steps below to quickly explore AgiBot World with our sample dataset (~7 GB).

```bash
# Installation
conda create -n agibotworld python=3.10 -y
conda activate agibotworld
pip install git+https://github.com/huggingface/lerobot@59e275743499c5811a9f651a8947e8f881c4058c
pip install matplotlib
git clone https://github.com/OpenDriveLab/AgiBot-World.git
cd AgiBot-World

# Download the sample dataset (~7GB) from Hugging Face. Replace <your_access_token> with your
# Hugging Face access token; see https://huggingface.co/docs/hub/security-tokens for how to create one.
mkdir data
cd data
curl -L -o sample_dataset.tar -H "Authorization: Bearer <your_access_token>" https://huggingface.co/datasets/agibot-world/AgiBotWorld-Alpha/resolve/main/sample_dataset.tar
tar -xvf sample_dataset.tar

# Convert the sample dataset to LeRobot dataset format and visualize
cd ..
python scripts/convert_to_lerobot.py --src_path ./data/sample_dataset --task_id 390 --tgt_path ./data/sample_lerobot
python scripts/visualize_dataset.py --task-id 390 --dataset-path ./data/sample_lerobot
```
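As a quick sanity check on the conversion output, the sketch below reads `meta/info.json` from a LeRobot-format dataset using only the standard library. The metadata field names (`codebase_version`, `total_episodes`, `total_frames`, `fps`) are assumptions based on the LeRobot v2.0 format, not guaranteed by this repo, and the demo runs on a mock directory so it is self-contained; with a real conversion you would pass e.g. `./data/sample_lerobot` instead.

```python
import json
import tempfile
from pathlib import Path

def summarize_lerobot_dataset(root: str) -> str:
    """Read meta/info.json from a LeRobot-format dataset and return a summary.

    Field names (codebase_version, total_episodes, total_frames, fps) are
    assumptions based on the LeRobot v2.0 dataset format.
    """
    info = json.loads((Path(root) / "meta" / "info.json").read_text())
    return (f"LeRobot {info['codebase_version']}: "
            f"{info['total_episodes']} episodes, "
            f"{info['total_frames']} frames @ {info['fps']} fps")

# Demo on a mock dataset so the sketch runs anywhere; point it at a real
# converted dataset directory to inspect actual conversion output.
with tempfile.TemporaryDirectory() as root:
    meta = Path(root) / "meta"
    meta.mkdir()
    (meta / "info.json").write_text(json.dumps({
        "codebase_version": "v2.0",
        "total_episodes": 1,
        "total_frames": 120,
        "fps": 30,
    }))
    print(summarize_lerobot_dataset(root))
    # LeRobot v2.0: 1 episodes, 120 frames @ 30 fps
```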

## Getting Started 🔥

### Installation

Download our source code:

```bash
git clone https://github.com/OpenDriveLab/AgiBot-World.git
cd AgiBot-World
```

Our project is built on the lerobot library (dataset v2.0, commit 59e2757); install it with:

```bash
pip install git+https://github.com/huggingface/lerobot@59e275743499c5811a9f651a8947e8f881c4058c
```

### How to Get Started with Our AgiBot World Data

Download the dataset either through the OpenXLab CLI:

```bash
pip install openxlab  # install the CLI
openxlab dataset get --dataset-repo OpenDriveLab/AgiBot-World
```

or through the Hugging Face CLI:

```bash
huggingface-cli download --resume-download --repo-type dataset agibot-world/AgiBotWorld-Alpha --local-dir ./AgiBotWorld-Alpha
```

Convert the data to LeRobot dataset format:

```bash
python scripts/convert_to_lerobot.py --src_path /path/to/agibotworld/alpha --task_id 390 --tgt_path /path/to/save/lerobot
```

### Visualize Datasets

We adapt and extend the dataset visualization script from the LeRobot project:

```bash
python scripts/visualize_dataset.py --task-id 390 --dataset-path /path/to/lerobot/format/dataset
```

This opens a rerun.io viewer displaying the camera streams, robot states, and actions.

### Policy Training Quickstart

Leveraging the simplicity of the LeRobot dataset format, we provide a user-friendly Jupyter notebook for training a diffusion policy on the AgiBot World dataset.

## License and Citation 📄

All the data and code within this repo are licensed under CC BY-NC-SA 4.0.

  • Please consider citing our work if it helps your research.
  • For the full authorship and detailed contributions, please refer to contributions.
  • In alphabetical order by surname:
```bibtex
@article{contributors2025agibotworld,
  title={AgiBot World Colosseo: A Large-scale Manipulation Platform for Scalable and Intelligent Embodied Systems},
  author={AgiBot-World-Contributors and Bu, Qingwen and Cai, Jisong and Chen, Li and Cui, Xiuqi and Ding, Yan and Feng, Siyuan and Gao, Shenyuan and He, Xindong and Hu, Xuan and Huang, Xu and Jiang, Shu and Jiang, Yuxin and Jing, Cheng and Li, Hongyang and Li, Jialu and Liu, Chiming and Liu, Yi and Lu, Yuxiang and Luo, Jianlan and Luo, Ping and Mu, Yao and Niu, Yuehan and Pan, Yixuan and Pang, Jiangmiao and Qiao, Yu and Ren, Guanghui and Ruan, Cheng and Shan, Jiaqi and Shen, Yongjian and Shi, Chengshi and Shi, Mingkang and Shi, Modi and Sima, Chonghao and Song, Jianheng and Wang, Huijie and Wang, Wenhao and Wei, Dafeng and Xie, Chengen and Xu, Guo and Yan, Junchi and Yang, Cunbiao and Yang, Lei and Yang, Shukai and Yao, Maoqing and Zeng, Jia and Zhang, Chi and Zhang, Qinglin and Zhao, Bin and Zhao, Chengyue and Zhao, Jiaqi and Zhu, Jianchao},
  journal={arXiv preprint arXiv:2503.06669},
  year={2025}
}
```