Photo of Xu Han

Artificial intelligence & robotics

Xu Han

Through the open-source community OpenBMB, he is empowering global AI innovation.

Year Honored
2024

Organization
Tsinghua University

Region
China

Hails From
China

Xu Han has long been dedicated to research in natural language processing, knowledge engineering, and large language models (LLMs), aiming to drive both innovation and adoption of AI.

To address bottlenecks in the sustainable development of LLMs, namely the escalating training-data and compute demands implied by the Scaling Law, Xu and his team introduced the Densing Law, which holds that a model's capability density grows exponentially over time. In other words, the number of parameters needed to maintain a given level of capability keeps shrinking. This offers a new theoretical foundation and optimization path for the development of LLMs.

Building on this insight into capability density, Xu and his team developed MiniCPM, a series of edge-side LLMs that greatly reduces the computational overhead of LLM deployment. The MiniCPM models have been downloaded more than 4 million times, gaining widespread recognition in communities such as GitHub and HuggingFace.

Committed to open knowledge sharing, Xu co-founded OpenBMB, a thriving open-source community whose models, datasets, and toolkits have supported the development of more than 200 LLMs and enabled efficient model training and deployment worldwide.

Xu’s contributions have significantly lowered the cost of, and barriers to, LLM deployment, accelerating the global reach and impact of AI technologies.