Computer & electronics hardware

Nicholas Harris

Shining light through optical chips might be the fastest way for neural networks to make decisions.

For decades physicists and engineers have dreamed of making optical chips that use photons, not electrons, to do computing. Such circuits could be lightning fast and energy efficient. But making them work has been difficult.

In 2017, Nicholas Harris, together with Yichen Shen and other colleagues at MIT, published a widely cited paper describing a design that allowed them to calculate the outputs of neural networks that had been conventionally trained.

The paper describes a circuit of 56 programmable interferometers—devices that carefully break apart and recombine light waves. The circuit tackled a simplified vowel-recognition task, correctly identifying about three quarters of the 180 test cases—short of a conventional computer, which got over 90% of them right. Shortly thereafter, Shen and Harris launched competing startups: Shen's Lightelligence and Harris's Lightmatter.
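The "break apart and recombine" action of a single programmable interferometer can be sketched as a 2×2 unitary matrix: two 50/50 beamsplitters around a tunable phase shifter. Here is a minimal numpy sketch using one common Mach-Zehnder convention (an illustration of the principle, not the exact layout of the MIT device):

```python
import numpy as np

def mzi(theta, phi):
    """2x2 transfer matrix of a Mach-Zehnder interferometer:
    beamsplitter -> internal phase shift theta -> beamsplitter
    -> external phase shift phi (one common convention)."""
    bs = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)   # 50/50 beamsplitter
    p_theta = np.diag([np.exp(1j * theta), 1.0])     # programmable phase shifter
    p_phi = np.diag([np.exp(1j * phi), 1.0])         # output phase shifter
    return p_phi @ bs @ p_theta @ bs

# theta = 0 crosses all the light into the other waveguide;
# theta = pi keeps it in place (up to a phase); values in between
# realize arbitrary splits.
print(np.abs(mzi(0.0, 0.0) @ np.array([1.0, 0.0])))    # light crosses over
print(np.abs(mzi(np.pi, 0.0) @ np.array([1.0, 0.0])))  # light stays put
```

Because each such device is unitary, a mesh of many of them—56 in the published chip—composes into one larger matrix multiplication, which is exactly the operation a neural-network layer needs.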

Once a given neural network has been trained and implemented on an optical chip, performing inference—figuring out which vowel corresponds to which sound, or how an autonomous car should react if a pedestrian steps into the street—can be almost as simple as shining light through it. This has the advantage of being both fast and energy efficient.
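A rough sense of why inference reduces to shining light through the chip: a trained weight matrix can be factored by singular value decomposition into two unitary matrices, each realizable as an interferometer mesh, sandwiching a diagonal of singular values realizable as per-waveguide attenuation. A hedged numpy sketch with random stand-in weights (not Lightmatter's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4))      # stand-in for one trained layer's weights

# Factor W = U @ diag(s) @ Vh. U and Vh are unitary, so each can be
# programmed into an interferometer mesh; diag(s) becomes a row of
# attenuators between the two meshes.
U, s, Vh = np.linalg.svd(W)

x = rng.normal(size=4)           # input encoded in optical amplitudes
y = U @ (s * (Vh @ x))           # mesh -> attenuators -> mesh, i.e. light passing through
assert np.allclose(y, W @ x)     # identical to the electronic matrix product
```

The remaining ingredient, the nonlinearity between layers, is handled separately—optically or electronically between passes—but the expensive matrix multiplications happen at the speed of light propagation.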

In March 2021, Lightmatter announced it would soon start selling a “machine-learning accelerator” chip. “It’s just a completely different kind of computer,” says Harris. “Right now we’re at about a factor of 20 times more efficient than the most advanced node in digital computers.” Lightmatter closed a second round of funding in May, bringing its total investment to $113 million.