
Artificial intelligence & robotics

Manuel Le Gallo

He uses novel computer designs to make AI less power hungry.

Year Honored
2020

Organization
IBM RESEARCH

Region
Global

Hails From
Canada

Training a typical natural-language processing model requires so much computing power that it can emit as much carbon as five American cars do over their lifetimes. Training an image-recognition model consumes as much energy as a typical home uses in two weeks, and it’s something that leading tech companies do multiple times a day.

Much of the energy used in modern computing goes to shuttling data back and forth between memory and the processor, a limitation known as the von Neumann bottleneck. Manuel Le Gallo is part of a research team at IBM building a new kind of computing architecture that aims to be faster and more energy efficient while remaining highly precise.
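To make that bottleneck concrete, here is a minimal Python sketch, assuming only NumPy, of how an in-memory crossbar array computes a matrix-vector product in place. The sizes and the noise model are illustrative assumptions, not a description of IBM’s hardware.

```python
# Minimal sketch: a memristive crossbar stores a weight matrix as device
# conductances, so a matrix-vector product happens "in place" via Ohm's
# and Kirchhoff's laws instead of shuttling every weight from memory to
# the processor. (Illustrative model, not IBM's actual design.)
import numpy as np

rng = np.random.default_rng(0)

W = rng.standard_normal((4, 8))   # weights, stored in the crossbar as conductances
x = rng.standard_normal(8)        # input, applied as voltages on the rows

# Conventional path: every element of W crosses the memory bus to the ALU.
y_digital = W @ x

# In-memory path: column currents sum the same products in one physical
# step, but analog devices drift, so model a small conductance error.
noise = 0.02 * rng.standard_normal(W.shape)
y_analog = (W + noise) @ x

print("digital:", np.round(y_digital, 3))
print("analog :", np.round(y_analog, 3))
```

Because the weights never leave the array, the only data crossing a bus are the input and output vectors; avoiding that traffic is where the energy saving comes from, at the price of analog noise.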

Le Gallo’s team developed a system that uses memory itself to process data, and early results show that it can deliver both precision and large energy savings. In one recent demonstration, the team ran a computation using just 1% of the energy that the same computation required with conventional methods.
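How can a noisy analog unit still be precise? One approach Le Gallo and his colleagues have published is mixed-precision in-memory computing: the imprecise in-memory unit does the bulk of the arithmetic, while a small digital unit computes exact residuals that steer the answer to full precision. The sketch below, again in Python with NumPy, illustrates the idea with iterative refinement on a linear system; the matrix, noise level, and loop are illustrative assumptions.

```python
# Mixed-precision idea: a cheap, imprecise solver (standing in for noisy
# analog hardware) does the heavy lifting; an exact digital residual loop
# restores full precision. (Illustrative model, not IBM's actual design.)
import numpy as np

rng = np.random.default_rng(1)
n = 6
A = n * np.eye(n) + rng.standard_normal((n, n))   # well-conditioned test system
b = rng.standard_normal(n)

# "Analog" solver: an inverse computed from a perturbed copy of A.
A_inv_noisy = np.linalg.inv(A + 0.05 * rng.standard_normal((n, n)))

x = np.zeros(n)
for step in range(50):
    r = b - A @ x                  # exact residual, computed digitally
    if np.linalg.norm(r) < 1e-10:  # stop once full precision is reached
        break
    x += A_inv_noisy @ r           # imprecise correction does most of the work

print(f"converged in {step} steps, residual {np.linalg.norm(b - A @ x):.2e}")
```

Because the correction step accounts for nearly all of the arithmetic, running it on low-energy in-memory hardware while keeping only the residual computation in digital logic is what makes large savings possible without giving up accuracy.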

As companies from finance to the life sciences continually retrain their AI models to improve them, their energy needs will balloon. “What will change is we will be able to train models faster and more energy efficiently, which will definitely reduce the carbon footprint and energy spent training those models,” Le Gallo says.