Machine learning to improve energy efficiency in fluctuating nanosystems


May 12, 2023

(Nanowerk News) Getting something for free doesn’t work in physics. But it turns out that, by thinking like a strategic game player, and with a little help from demons, improved energy efficiency for complex systems such as data centers may be possible.

In computer simulations, Stephen Whitelam of the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) used neural networks (a type of machine learning model that mimics processes in the human brain) to train nanosystems – tiny machines about the size of a molecule – to work with greater energy efficiency.

What’s more, the simulations show that the trained protocols can draw heat from a system by constantly measuring it to find the most energy-efficient operations.

“We can take energy out of the system, or we can store work within the system,” said Whitelam.

This insight could prove valuable, for example, in operating very large systems such as computer data centers. Banks of computers generate large amounts of heat that must be dissipated – using still more energy – to prevent damage to sensitive electronics.

Whitelam conducted the research at the Molecular Foundry, a DOE Office of Science user facility at Berkeley Lab. His work is described in a paper published in Physical Review X (“Demon in the Machine: Learning to Extract Work and Absorb Entropy from Fluctuating Nanosystems”).

Inspiration from Pac-Man and Maxwell’s Demon

Asked about the origin of the idea, Whitelam said, “People had been using techniques from the machine learning literature to play Atari video games, and that seemed a natural fit for materials science.”

In video games like Pac-Man, he explained, the goal of machine learning is to choose, at a given moment, a particular action – move up, down, left, right, and so on – to perform. Over time, the machine learning algorithm “learns” which moves to make, and when, to achieve a high score. The same algorithms can work for nanoscale systems.
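The “choose an action at each moment” loop can be sketched with tabular Q-learning – a stand-in used here for illustration, not the paper’s actual method, which trains neural networks on nanoscale protocols. A toy agent on a one-dimensional track learns which move (left or right) to make at each step to reach a rewarded state:

```python
import random

# Toy tabular Q-learning (illustrative stand-in, not the paper's method):
# an agent on a 1-D track learns which action to take at each step to
# reach a rewarded end state, mirroring "learn the best move, and when".

N_STATES = 5          # positions 0..4; reward for reaching state 4
ACTIONS = [-1, +1]    # move left or right

def train(episodes=500, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    rng = random.Random(seed)
    Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    for _ in range(episodes):
        s = 0
        while s != N_STATES - 1:
            # epsilon-greedy: mostly exploit the current value estimates
            if rng.random() < eps:
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda x: Q[(s, x)])
            s2 = min(max(s + a, 0), N_STATES - 1)
            r = 1.0 if s2 == N_STATES - 1 else 0.0
            best_next = max(Q[(s2, x)] for x in ACTIONS)
            Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
            s = s2
    return Q

Q = train()
# The learned greedy policy: the best action in each non-terminal state.
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)}
print(policy)
```

After training, the greedy policy moves right in every state – the analogue of a game-playing agent that has learned which action earns the high score.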

Whitelam’s simulations also answer an old thought experiment in physics called Maxwell’s demon. In short, in 1867 the physicist James Clerk Maxwell imagined a box filled with gas, with a massless “demon” controlling a trapdoor at the box’s center. The demon would open the door to let faster gas molecules pass to one side of the box and slower molecules to the other.

Eventually, with the molecules sorted, the “slow” side of the box would be cold and the “fast” side hot, in keeping with the energy of the molecules.
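The demon’s sorting rule can be illustrated in a few lines – a toy, not the paper’s simulation: molecules with random speeds are sent right if fast and left if slow, and the two sides end up with different mean speeds (i.e., temperatures):

```python
import random

# Toy illustration of Maxwell's demon (not the paper's simulation):
# the demon opens the trapdoor only to send fast molecules to one side
# and slow molecules to the other, separating "hot" from "cold".

rng = random.Random(42)
speeds = [rng.random() for _ in range(1000)]   # arbitrary speed units
median = sorted(speeds)[len(speeds) // 2]

left, right = [], []
for v in speeds:
    # The demon's rule: fast molecules go to the "hot" side, slow to "cold".
    (right if v >= median else left).append(v)

mean = lambda xs: sum(xs) / len(xs)
print(f"cold side mean speed: {mean(left):.3f}")
print(f"hot side mean speed:  {mean(right):.3f}")
```

The catch, as the article notes below, is that measuring each molecule’s speed itself costs energy, which is why the demon does not get something for free.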

Checking the refrigerator

Such a system would be a heat engine, said Whitelam. Importantly, though, Maxwell’s demon doesn’t violate the laws of thermodynamics – it doesn’t get something for free – because information is equivalent to energy. Measuring the position and velocity of the molecules in the box costs more energy than could be extracted from the resulting heat engine.

And a heat engine can be a useful thing. A refrigerator offers a good analogy, Whitelam said. While it runs, the food inside stays cold – the desired result – even though the back of the fridge gets hot from the work done by its motor.

In Whitelam’s simulations, the machine learning protocol can be thought of as the demon. Through the optimization process, it converts information drawn from the system being modeled into energy, as heat.

Unleashing demons on a nanoscale system

In one simulation, Whitelam optimized the process of dragging a nanoscale bead through water. He modeled a so-called optical trap, in which a laser beam, acting like optical tweezers, can hold and move the bead.

“The name of the game is: Get from here to there with as little work done on the system as possible,” said Whitelam. The bead wobbles under natural fluctuations called Brownian motion as water molecules bombard it. Whitelam showed that if these fluctuations can be measured, the bead can be moved at the most energy-efficient moments.
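The dragged-bead setup can be sketched as an overdamped bead in a harmonic trap whose center moves at constant speed – a simplified baseline protocol with illustrative parameters, not the paper’s learned protocol:

```python
import math
import random

# Sketch of the dragged-bead setup (simplified; parameters illustrative,
# not from the paper): an overdamped bead in a harmonic optical trap whose
# center moves from 0 to L in time T. Work on the bead is the trap-energy
# change each time the trap center is displaced; the bead also feels
# Brownian kicks from the surrounding water.

def drag_work(n_steps=2000, L=1.0, T=10.0, k=1.0, gamma=1.0, kT=0.1, seed=1):
    rng = random.Random(seed)
    dt = T / n_steps
    x, trap, work = 0.0, 0.0, 0.0
    sigma = math.sqrt(2 * kT * dt / gamma)     # Brownian noise amplitude
    for _ in range(n_steps):
        new_trap = trap + L / n_steps          # naive: constant-speed drag
        # Work = potential-energy change from moving the trap at fixed x
        work += 0.5 * k * ((x - new_trap) ** 2 - (x - trap) ** 2)
        trap = new_trap
        # Overdamped Langevin step: relax toward the trap + thermal kick
        x += -(k / gamma) * (x - trap) * dt + sigma * rng.gauss(0, 1)
    return work

print(f"work done dragging the bead: {drag_work():.4f}")
```

On average this naive protocol dissipates work into the water; a measurement-based protocol like the one described above would instead time the trap’s moves to the bead’s fluctuations to reduce that cost.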

“Here we showed that you can train a neural-network demon to do something similar to Maxwell’s thought experiment, but with an optical trap,” he said.

Cooling computers

Whitelam extended the ideas to microelectronics and computing. He used machine learning protocols to simulate flipping the state of a nanomagnetic bit between 0 and 1, a basic erase/copy information operation in computing.

“Do this again and again, and eventually your demon ‘learns’ how to flip the bit so as to absorb heat from its surroundings,” he said. He returned to the refrigerator analogy: “You could build a computer that cools itself as it runs, with the heat sent elsewhere in your data center.”
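The energy bookkeeping behind the bit-flip idea can be sketched with a two-state system – a loose analogy with invented parameters, not the paper’s nanomagnet model. Flipping the bit by ramping a field splits the energy change into work (done by moving the field) and heat (exchanged with the surroundings when the bit flips thermally); a learned protocol would shape the ramp so the heat term is absorbed rather than released:

```python
import math
import random

# Hedged sketch (not the paper's model): a two-state "bit" s in {-1, +1}
# with energy E = -h*s, flipped by ramping the field h, plus Glauber-type
# thermal flips. We track work (energy change when h moves at fixed s)
# and heat (energy change when the bath flips s at fixed h).

def flip_bit(n_steps=1000, h0=2.0, kT=1.0, seed=3):
    rng = random.Random(seed)
    s, h = -1, -h0                        # start in "0", field holding it
    E0 = -h * s
    work = heat = 0.0
    for i in range(1, n_steps + 1):
        h_new = -h0 + 2 * h0 * i / n_steps       # ramp field -h0 -> +h0
        work += (-h_new * s) - (-h * s)          # work: h moves, s fixed
        h = h_new
        dE = (-h * -s) - (-h * s)                # energy cost of a flip
        if rng.random() < 1.0 / (1.0 + math.exp(dE / kT)):  # Glauber rule
            s = -s
            heat += dE                           # heat: s flips, h fixed
    return s, work, heat, (-h * s) - E0          # final state, W, Q, total dE

s, W, Q, dE_total = flip_bit()
print(f"final state {s}, work {W:+.3f}, heat {Q:+.3f}")
```

By construction the first law holds exactly: the total energy change equals work plus heat, which is the ledger a learned protocol must exploit to leave the heat term negative for the surroundings.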

Whitelam said the simulations serve as a testbed for understanding concepts and ideas. “And here the idea is just to show that you can carry out these protocols, either at a small energy cost or while absorbing energy that is sent elsewhere, using measurements that could be applied in real-life experiments,” he said.

