
Artificial intelligence & robotics

Adithya Murali

Scalable frameworks for generating simulation data to overcome the data bottleneck in robotics.

Year Honored
2025

Organization
NVIDIA Research

Region
Asia Pacific

Adithya Murali's research addresses a core challenge in Physical AI: data scarcity. His innovative framework provides an effective path for training more general and robust robot models by generating large-scale synthetic data in simulated environments.

He led the research effort to develop both the tooling for procedural scene generation and its robotics applications, creating millions of realistic virtual environments in a scalable manner. Building on this, his neural network architecture, CabiNet, addresses complex object rearrangement and collision-free motion generation, demonstrating that models trained on synthetic data can transfer effectively to the real world. To propel the field forward, his team open-sourced the Scene Synthesizer library for procedural generation in 2025.
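The core idea behind procedural scene generation can be illustrated with a toy sketch: sample object placements from a seeded random generator and reject overlapping layouts, so that each seed deterministically yields a distinct scene. This is a minimal, hypothetical illustration of the general technique, not the actual Scene Synthesizer or CabiNet pipeline; all names and parameters below are invented for the example.

```python
import random
from dataclasses import dataclass

@dataclass
class Box:
    """A tabletop object approximated by an axis-aligned footprint."""
    name: str
    x: float  # center x on the table (m)
    y: float  # center y on the table (m)
    w: float  # width (m)
    d: float  # depth (m)

def overlaps(a: Box, b: Box) -> bool:
    """Axis-aligned overlap test in the table plane."""
    return (abs(a.x - b.x) < (a.w + b.w) / 2 and
            abs(a.y - b.y) < (a.d + b.d) / 2)

def generate_scene(seed: int, n_objects: int = 5,
                   table=(1.0, 0.6), max_tries: int = 100):
    """Rejection-sample non-overlapping placements on a table.

    A fixed seed makes each scene reproducible, so millions of
    distinct layouts can be generated in parallel by sweeping seeds.
    """
    rng = random.Random(seed)
    placed = []
    for i in range(n_objects):
        for _ in range(max_tries):
            w = rng.uniform(0.04, 0.12)
            d = rng.uniform(0.04, 0.12)
            box = Box(f"obj_{i}",
                      rng.uniform(w / 2, table[0] - w / 2),
                      rng.uniform(d / 2, table[1] - d / 2),
                      w, d)
            if not any(overlaps(box, other) for other in placed):
                placed.append(box)
                break
    return placed

# Each seed yields a distinct, collision-free layout.
scene = generate_scene(seed=42)
print([b.name for b in scene])
```

Real systems replace the boxes with meshes, add physics-based stability checks, and compose full rooms rather than a single table, but the seed-driven sample-and-reject loop is the scalability mechanism this sketch assumes.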

Previously, he also pushed the frontier of robot data collection in the real world. His work on “Robot Learning in Homes” provided one of the first scientific demonstrations of training robot models in unstructured settings like human homes. His open-source "PyRobot" and "LoCoBot" frameworks let Physical AI researchers use low-cost mobile manipulators both for data collection and for real-world validation of models trained in simulation.

Adithya's research has also had a significant real-world impact on industry. His recent work on simulation-based robotic grasping at NVIDIA, such as GraspGen, has equipped industrial robots with more precise and reliable grasping capabilities for tasks like machine tending on automated production lines.

His work has opened new directions in robot learning, demonstrating the feasibility of training "generalist-specialist" models on large-scale synthetic data for tasks that can be precisely modeled in a simulator. This paradigm has the potential to yield physical intelligence that generalizes across varied robots, environments, and objects.