A computer that uses heat instead of electricity could run algorithms that power neural networks and artificial intelligence – and tamp down their energy budgets.
“We have things like ChatGPT which can learn very complicated things about language, but it consumes an amount of energy that is absolutely crazy,” says Nicolas Brunner at the University of Geneva in Switzerland. Some estimates put ChatGPT’s daily energy consumption on par with more than 30,000 households in the US.
Most modern AI technology relies on neural networks made up of many interconnected artificial neurons, numbering in the billions for programs like ChatGPT, which imitate the function of the brain. The thing they don’t mimic, says Brunner, is the brain’s relatively low energy consumption.
Instead of simulating these neural connections digitally, Brunner and his colleagues developed a mathematical model for a device that would physically mimic them using qubits, or quantum bits, and heat.
They modelled how a few interacting qubits would act as neurons when connected to several thermal reservoirs that can have variable temperatures. To run calculations, you would input information not with a keyboard but by turning up the temperature on some of these reservoirs. This would make heat flow through the device, changing the quantum states and energies of the qubits, until the whole device reached a steady state. These “heat currents” act like electricity does in conventional computers. To read the computer’s output, you would check the temperature of a thermal reservoir designated to play the role of a computer monitor.
The team realised that this type of computer works similarly to a machine learning algorithm called a perceptron, the simplest neural network: a single layer of neurons that decides which of two classes an object belongs to, such as whether a picture of an animal shows a cat or a dog.
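For readers unfamiliar with the algorithm, a classical perceptron can be written in a few lines. The sketch below is purely illustrative of the conventional, digital version that the thermal device emulates; it is not the researchers' quantum model, and the toy data and class labels ("cat" vs "dog") are invented for the example. The perceptron computes a weighted sum of its inputs and applies a threshold, much as the steady-state heat currents in the proposed device combine input temperatures into a readable output temperature.

```python
# Minimal classical perceptron: weighted sum of inputs, then a threshold.
# Illustrative sketch only; the toy data below is invented for the example.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Learn weights for binary classification (labels are +1 or -1)."""
    n = len(samples[0])
    w = [0.0] * n  # one weight per input feature
    b = 0.0        # bias term
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # Threshold the weighted sum to get a +1/-1 prediction
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if pred != y:
                # Misclassified: nudge weights toward the correct side
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def classify(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Toy data: two linearly separable clusters ("cat" = +1, "dog" = -1)
X = [(1.0, 1.2), (0.8, 1.0), (1.1, 0.9),
     (-1.0, -1.1), (-0.9, -0.8), (-1.2, -1.0)]
y = [1, 1, 1, -1, -1, -1]
w, b = train_perceptron(X, y)
```

Because the data is linearly separable, the perceptron convergence theorem guarantees the training loop settles on weights that classify every toy sample correctly. In the proposed device, this same decision would emerge physically, with reservoir temperatures playing the role of the inputs and weights.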
“If you simulate a perceptron using a conventional computer, you’re going in a roundabout way. It is very conceptually interesting and unusual to build a perceptron purely with these thermal flows,” says Marcus Huber at the Austrian Academy of Sciences in Vienna.
He says that because the laws of physics, and specifically thermodynamics, dictate that any operation by a computer must “cost” some heat and entropy, building a device where heat is part of the computation process rather than a nuisance could lead to more energy-efficient machines.
Patrick Coles at Normal Computing, a start-up focused on creating “thermodynamic AI”, says that the researchers’ conceptual framework could translate into small-scale laboratory experiments, but using it as a basis for devices that can be mass-produced may be a challenge. If heat-based perceptrons can be adapted for manufacturing with existing methods, like being made on chips, the resulting computers could be useful for generative AI and tasks like derivative pricing in finance, he says.