Light-Enabled Computers Could Help AI’s Power Crisis

Computers that use light instead of electronic circuits to do math may sound like something out of a Star Trek episode, but researchers have been working on this novel method of computing for years.
They’re called optical computers, and labs around the world are exploring how they could be useful in everyday life.
On Wednesday, a team of researchers from Penn State published a paper in the journal Science Advances that examines how optical computing can reduce the energy consumption of artificial intelligence systems.
Xingjie Ni, an engineering professor at Penn State and one of the authors of the paper, told CNET that the work is a proof of concept for how optical computing can benefit the rapidly growing AI industry in the future.
“Sometimes progress comes from rethinking conventional physics with a new principle,” Ni said. “By revising classic ideas in optics through the lens of modern AI challenges, we can open new directions for faster, greener computing.”
AI’s power problem
As AI is increasingly adopted at work and at home, its energy cost is becoming a pressing issue. Running AI products and services like ChatGPT takes a lot of computing power, and that computing consumes a lot of electricity.
You may live in or near a city where a technology company plans to build a data center, or your monthly utility bill may increase due to high demand on the local power grid.
The International Energy Agency estimates that data centers accounted for approximately 1.5% of global electricity consumption in 2024, a figure that has grown by about 12% per year over the past five years. The IEA also projects that data center electricity consumption could double by 2030.
That’s why using an alternative method of computing to reduce the power used by AI is an attractive prospect.
Light speed
Optical computers — computers that use light instead of electricity — are still largely in the moonshot phase of the technology industry, years away from commercial use. The concept dates to the 1960s, and the roots of optical information processing go back much further.
Optical computers today are mostly confined to research laboratories. But optical data transmission, which moves data rapidly using pulses of light, is already in use in some large data centers and long-distance communication links.
Applying optical computing to artificial intelligence, however, is an emerging field of research. There are real challenges in getting light to perform the operations required by neural networks, the AI architecture behind products like today’s chatbots.
The trouble is that light naturally behaves in a linear way. To build a computer that can process data for AI, you need an optical system that can produce nonlinear functions. For optical computers to do this, they typically need additional components that are difficult to implement and consume a lot of energy.
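To see why nonlinearity matters, here is a generic illustration (not from the Penn State paper): stacking purely linear layers collapses into a single linear operation, so a network built from them gains nothing from depth. Inserting a nonlinear function such as ReLU between layers breaks that collapse. The matrices and values below are arbitrary examples chosen for hand-checkable arithmetic.

```python
import numpy as np

# Two "layers" represented as matrices, and an input vector.
W1 = np.array([[1., -1.], [1., 1.]])
W2 = np.array([[1., 1.], [0., 1.]])
x = np.array([1., 2.])

# Without a nonlinearity, applying both layers is identical to applying
# the single combined matrix W2 @ W1 — depth adds no expressive power.
two_linear = W2 @ (W1 @ x)
collapsed = (W2 @ W1) @ x
assert np.allclose(two_linear, collapsed)

# Insert a ReLU (zero out negative values) between the layers and the
# composition is no longer a single linear map.
relu = lambda v: np.maximum(v, 0)
nonlinear = W2 @ relu(W1 @ x)
assert not np.allclose(nonlinear, collapsed)
```

This is the property that makes nonlinearity hard to avoid: without it, even a deep optical system could only ever compute what one linear pass computes.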
“True optical nonlinearity is often weak and difficult to achieve — it usually requires high-power lasers or special materials, which add complexity and can undermine optics’ energy-efficiency benefits,” Ni said. “Our approach avoids those requirements while still delivering the same performance as conventional digital networks.”
An infinity mirror
Researchers at Penn State have found a clever workaround that could help optical computers perform the nonlinear operations suited to the kind of data processing AI requires.
The prototype the team built uses an “infinity mirror” setup that includes “tiny optical elements, which encode data directly into light rays,” creating a nonlinear relationship over time. The resulting light patterns are then captured with a small camera.
“The main takeaway is that a carefully designed optical structure can produce the nonlinearity AI needs without relying on nonlinear materials or high-power lasers,” Ni said. “By allowing the light to bounce back into the system, we generate this nonlinear map while keeping the hardware simple, low power and fast.”
The figure above shows how light is focused into a small processing unit, allowing a large stream of data to be processed without power-hungry circuitry. Another image (below) shows how the process works conceptually: the input light is repeatedly reflected by lenses and other optical elements, encoded with complex patterns of information, and finally focused onto a camera that produces a simplified output.
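The repeated-reflection idea can be sketched numerically, with heavy caveats: the model below is a hypothetical toy, not the paper’s actual physics. It assumes each round trip through the mirrors applies the same linear transform `T` to the light field, and that the camera records intensity (the squared magnitude of the field), a physically standard detection step that is itself nonlinear. All matrices, sizes and the bounce count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
T = rng.standard_normal((n, n)) * 0.4  # one linear round trip (assumed form)

def infinity_mirror(x, bounces=5):
    """Toy model: recirculate the field, accumulate camera intensity."""
    field = np.asarray(x, dtype=float)
    reading = np.zeros(n)
    for _ in range(bounces):
        field = T @ field             # light re-enters the system each bounce
        reading += np.abs(field) ** 2  # intensity detection is nonlinear
    return reading

x = rng.standard_normal(n)
y1 = infinity_mirror(x)
y2 = infinity_mirror(2 * x)

# Every optical step is linear, yet the overall input-to-output map is not:
# doubling the input quadruples the camera reading instead of doubling it.
assert np.allclose(y2, 4 * y1)
assert not np.allclose(y2, 2 * y1)
```

The point of the sketch is only that linear optics plus an intensity-recording camera can yield a nonlinear input–output map without any exotic nonlinear material, which is the flavor of result the quote describes.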
It’s an interesting concept, but turning a prototype into a system with real-world applications will take a lot of time, work and money.
From the lab to the data center
Ni admits that optical computers for AI are still years away.
“The realistic timeline to reach an industry-facing prototype and early demonstrations is about two to five years, depending on the level of investment and the intended application,” he said.
Nevertheless, it is a hot topic in the computer world. Francesca Parmigiani, principal research manager at Microsoft Research, told CNET that optical chips could one day work alongside traditional GPUs to help AI systems perform certain tasks.
“Optical computing has the ability to perform many tasks simultaneously and at higher speeds than conventional digital hardware,” Parmigiani said. “This can translate into significant gains in energy efficiency and reduced delays for AI workloads.”
Chene Tradonsky, co-founder and chief technology officer at photonic computing company LightSolver, agrees, saying that optical computing for AI is attractive because some important calculations can be performed quickly with very little power.
“Energy is no longer a secondary concern in AI. Power, cooling and system efficiency are becoming key constraints at the data center and global infrastructure level,” said Tradonsky. “Any technology that promises a meaningful reduction in power per computation is worth pursuing.”
The conventional computers we use for AI won’t be replaced by optical computing anytime soon. But within a few years, optical computers could be integrated into AI systems to work alongside conventional hardware.
“The goal is a hybrid approach: Electronics still handle general-purpose computing, memory and control, while optics accelerate the high-volume calculations that dominate the time and energy cost of AI,” Ni said.



