
OpenRobotLab

The open source platform of Embodied AI at Shanghai AI Laboratory - Robot Learning, Multimodal Learning, and Embodied AI.
 
OpenRobotLab is actively exploring a path to building general embodied intelligent systems through high-fidelity simulation and real-world interaction. We focus on six main research fields:
  • 🤖 Humanoids and Motion Intelligence: Development of algorithms and control systems that enable humanoid robots to move naturally and adapt to various environments.

  • 🦾 Manipulation and Navigation Intelligence: Creation of AI systems that can understand and manipulate objects while navigating complex environments.

  • 🛠️ Simulation/Hardware Platforms for Embodied AI: Construction and maintenance of simulation environments and hardware platforms that support embodied AI development and testing.

  • 🎨 AIGC for Embodied AI: Development of AI-generated content for embodied systems to enhance robot learning and adaptation capabilities.

  • 👀 3D Vision and Embodied Perception: Development of computer vision systems that enable robots to perceive and understand their three-dimensional environment in real-time.

  • 🧠 Foundation Models for Embodied AI: Creation of large-scale pre-trained models designed specifically for embodied intelligence to enable cross-task and cross-environment transfer learning.

Dive into the repos below. Welcome!

Pinned

  1. GRUtopia Public

    GRUtopia: Dream General Robots in a City at Scale

    Python · 787 stars · 44 forks

  2. OpenHomie Public

    Open-sourced code for "HOMIE: Humanoid Loco-Manipulation with Isomorphic Exoskeleton Cockpit".

    C++ · 261 stars · 20 forks

  3. PointLLM Public

    [ECCV 2024 Best Paper Candidate] PointLLM: Empowering Large Language Models to Understand Point Clouds

    Python · 786 stars · 38 forks

  4. EmbodiedScan Public

    [CVPR 2024 & NeurIPS 2024] EmbodiedScan: A Holistic Multi-Modal 3D Perception Suite Towards Embodied AI

    Python · 590 stars · 44 forks

  5. HIMLoco Public

    Learning-based locomotion control from OpenRobotLab, including Hybrid Internal Model & H-Infinity Locomotion Control

    Python · 484 stars · 48 forks

  6. Seer Public

    [ICLR 2025 Oral] Seer: Predictive Inverse Dynamics Models are Scalable Learners for Robotic Manipulation

    Python · 172 stars · 10 forks

Repositories

Showing 10 of 26 repositories
  • VLM-Grounder Public

    [CoRL 2024] VLM-Grounder: A VLM Agent for Zero-Shot 3D Visual Grounding

    Python · 101 stars · 1 fork · 1 issue · 0 PRs · Updated May 3, 2025
  • PPI Public

    [RSS 2025] Gripper Keypose and Object Pointflow as Interfaces for Bimanual Robotic Manipulation

    Python · 36 stars · MIT · 0 forks · 0 issues · 0 PRs · Updated Apr 27, 2025
  • HoST Public

    [RSS 2025] 💐Official implementation of the paper "Learning Humanoid Standing-up Control across Diverse Postures"

    Python · 155 stars · MIT · 6 forks · 8 issues · 0 PRs · Updated Apr 25, 2025
  • GRUtopia Public

    GRUtopia: Dream General Robots in a City at Scale

    Python · 787 stars · MIT · 44 forks · 11 issues · 0 PRs · Updated Apr 21, 2025
  • Seer Public

    [ICLR 2025 Oral] Seer: Predictive Inverse Dynamics Models are Scalable Learners for Robotic Manipulation

    Python · 172 stars · Apache-2.0 · 10 forks · 2 issues · 1 PR · Updated Apr 21, 2025
  • PointLLM Public

    [ECCV 2024 Best Paper Candidate] PointLLM: Empowering Large Language Models to Understand Point Clouds

    Python · 786 stars · 38 forks · 2 issues · 0 PRs · Updated Apr 21, 2025
  • RoboSplat Public

    [RSS 2025] Novel Demonstration Generation with Gaussian Splatting Enables Robust One-Shot Manipulation

    76 stars · Apache-2.0 · 1 fork · 0 issues · 0 PRs · Updated Apr 17, 2025
  • UniHSI Public

    [ICLR 2024 Spotlight] Unified Human-Scene Interaction via Prompted Chain-of-Contacts

    Python · 206 stars · 11 forks · 2 issues · 0 PRs · Updated Apr 13, 2025
  • OpenHomie Public

    Open-sourced code for "HOMIE: Humanoid Loco-Manipulation with Isomorphic Exoskeleton Cockpit".

    C++ · 261 stars · 20 forks · 0 issues · 0 PRs · Updated Apr 4, 2025
  • Aether Public

    Aether: Geometric-Aware Unified World Modeling

    Python · 297 stars · MIT · 2 forks · 1 issue · 0 PRs · Updated Mar 31, 2025

People

This organization has no public members. You must be a member to see who’s a part of this organization.