Ai2 releases MolmoBot and MolmoSpaces, achieving zero-shot sim-to-real transfer for robot manipulation
· release · feature · open-source · model · platform · allenai.org

Zero-Shot Sim-to-Real Transfer Breakthrough

Ai2 announced a major advance in robotics: models trained entirely in simulation that transfer directly to real-world robots without additional manually collected data or fine-tuning. This zero-shot sim-to-real transfer removes a long-standing bottleneck: reliable robot performance has historically required months of teleoperated real-world demonstrations. The result challenges a core assumption in the field and demonstrates that simulation diversity alone can enable practical real-world robot capability.

MolmoSpaces: Open Simulation Infrastructure

MolmoSpaces is an open ecosystem for embodied AI research that consolidates simulation infrastructure into a unified platform:

  • 230,000+ indoor scenes for diverse training environments
  • 130,000+ curated object assets with varied properties
  • 42 million physics-grounded robotic grasp annotations
  • Support for systematic variation of lighting, physics, articulation, and task definitions
  • Integration with widely-used simulators including MuJoCo, NVIDIA Isaac Lab, and NVIDIA Isaac Sim
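MolmoSpaces' actual configuration API is not shown on this page. As a rough, simulator-agnostic sketch of what "systematic variation of lighting, physics, articulation, and task definitions" can look like in practice, the snippet below samples hypothetical scene configurations; all field names, ranges, and task labels are invented for illustration, not MolmoSpaces' schema:

```python
import random
from dataclasses import dataclass

@dataclass
class SceneConfig:
    """One randomized training scene (fields are illustrative only)."""
    scene_id: int                # which of the indoor scenes to load
    light_intensity: float       # lighting variation, arbitrary relative units
    friction: float              # sliding friction coefficient (physics variation)
    drawer_open_fraction: float  # articulation state in [0, 1]
    task: str                    # task definition for this episode

# Hypothetical task labels matching the capabilities described above.
TASKS = ["pick_place", "open_drawer", "open_door"]

def sample_scene(rng: random.Random, num_scenes: int = 230_000) -> SceneConfig:
    """Sample one training configuration by varying each axis independently."""
    return SceneConfig(
        scene_id=rng.randrange(num_scenes),
        light_intensity=rng.uniform(0.2, 2.0),
        friction=rng.uniform(0.3, 1.2),
        drawer_open_fraction=rng.uniform(0.0, 1.0),
        task=rng.choice(TASKS),
    )

rng = random.Random(0)  # seeded for reproducible episode generation
batch = [sample_scene(rng) for _ in range(3)]
for cfg in batch:
    print(cfg)
```

Sampling each axis independently like this is the standard domain-randomization pattern: every episode sees a fresh combination of appearance, physics, and task, which is what lets a purely simulated policy generalize to real hardware.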

By releasing assets, tools, and infrastructure openly, Ai2 is democratizing robotics research and making it reproducible across institutions.

MolmoBot: Practical Zero-Shot Manipulation

MolmoBot is a fully open manipulation model suite trained entirely on synthetic data from MolmoSpaces. It demonstrates the zero-shot approach in practice across multiple robot systems:

  • Pick and place tasks on unseen objects and environments
  • Articulated object manipulation (drawers, cabinets)
  • Door opening and similar navigation tasks
  • Works across different robot morphologies, including mobile manipulators

Crucially, MolmoBot achieves this without photorealistic rendering, task-specific fine-tuning, or real-world demonstration data. Ai2's evaluations show that breadth of simulation diversity outweighs repetition: varied virtual environments matter more than repeated exposure to the same scenarios.
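One intuition for why breadth beats repetition: under a fixed training budget, many distinct randomized environments cover far more of the variation space than repeats of a single scene. The toy calculation below (a discretized 2-D friction × lighting grid; the setup and all numbers are invented here, not Ai2's evaluation protocol) measures how much of that grid a training run touches:

```python
import random

def coverage(num_distinct: int, total_episodes: int,
             bins: int = 10, seed: int = 0) -> float:
    """Fraction of a bins x bins (friction x lighting) grid touched when
    `num_distinct` environment configs are cycled over `total_episodes`
    episodes. Purely illustrative.
    """
    rng = random.Random(seed)
    # Each distinct environment is one (friction, lighting) point in [0, 1)^2.
    configs = [(rng.random(), rng.random()) for _ in range(num_distinct)]
    seen = set()
    for ep in range(total_episodes):
        f, l = configs[ep % num_distinct]
        seen.add((int(f * bins), int(l * bins)))  # discretized grid cell
    return len(seen) / (bins * bins)

# Same training budget (1,000 episodes), different diversity:
print(coverage(num_distinct=1, total_episodes=1000))    # one scene repeated: 0.01
print(coverage(num_distinct=500, total_episodes=1000))  # 500 distinct scenes
```

Repeating one scene 1,000 times covers exactly one grid cell (1%), while 500 distinct scenes cover nearly the whole grid for the same episode count, which is the qualitative pattern the evaluations describe.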

Implications for Robotics Research

This shift restructures the robotics development pipeline. Instead of months of cumbersome manual data collection, researchers can focus on designing diverse simulated environments that scale with computational resources. This makes robotics research faster, more reproducible, and more accessible to labs and companies without proprietary robot platforms or large demonstration datasets.

All components—models, simulation infrastructure, grasp annotations, data generation pipelines, and benchmarking tools—are released as open source, enabling the broader research community to build, test, and improve physical AI systems collaboratively.