Electricity Helps Find Materials That Can “Learn”

A team of scientists at Argonne National Laboratory was able to observe a nonliving material mimic behavior associated with learning, an advance they say could lead to better artificial intelligence (AI) systems.

The paper describing the study was published in Advanced Intelligent Systems.

The group aims to develop the next generation of supercomputers and is looking to the human brain for inspiration.

Non-Biological Materials With Learning-Like Behaviors

Researchers working on brain-inspired computers often turn to non-biological materials that show hints of learning-like behavior. Such materials could be used to build hardware that, paired with new software algorithms, enables more energy-efficient AI.

The new study was led by scientists from Purdue University. They exposed oxygen-deficient nickel oxide to brief electrical pulses and elicited two different electrical responses similar to learning. According to Rutgers University professor Shriram Ramanathan, who was a professor at Purdue University at the time of the work, they came up with an all-electrically-driven system that demonstrated learning behaviors.

The research team relied on the resources of the Advanced Photon Source (APS), a U.S. Department of Energy (DOE) Office of Science facility at DOE’s Argonne National Laboratory.

Habituation and Sensitization

The first response is habituation, which takes place when the material gets accustomed to being slightly zapped. The material's resistance increases after an initial jolt, but with repeated mild pulses it becomes used to the electric stimulus and its response fades.

Fanny Rodolakis is a physicist and beamline scientist at the APS.

“Habituation is like what happens when you live near an airport,” Rodolakis says. “The day you move in, you think ‘what a racket,’ but eventually you hardly notice anymore.”

The second response shown by the material is sensitization, which occurs when a larger dose of electricity is administered.

“With a larger stimulus, the material’s response grows instead of diminishing over time,” Rodolakis says. “It’s akin to watching a scary movie, and then having someone say ‘boo!’ from behind a corner; it really makes you jump.”

“Pretty much all living organisms demonstrate these two characteristics,” Ramanathan continues. “They really are a foundational aspect of intelligence.”
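The article does not give the device physics, but the two behaviors can be illustrated with a toy model: a response whose gain shrinks under repeated weak pulses (habituation) and grows under strong ones (sensitization). All names, thresholds, and rates below are illustrative assumptions, not the paper's model.

```python
# Toy sketch (not the study's device model): a response variable that
# habituates to weak pulses and sensitizes to strong ones.

def respond(pulses, weak_threshold=1.0, rate=0.3):
    """Return the response to each pulse in sequence.

    Pulses at or below weak_threshold produce a diminishing response
    (habituation); stronger pulses produce a growing one (sensitization).
    The threshold and rate are illustrative, not measured values.
    """
    gain = 1.0
    responses = []
    for amp in pulses:
        responses.append(gain * amp)
        if amp <= weak_threshold:
            gain *= (1 - rate)   # habituation: response shrinks
        else:
            gain *= (1 + rate)   # sensitization: response grows
    return responses

weak = respond([0.5] * 5)    # repeated mild pulses: response fades
strong = respond([2.0] * 5)  # repeated strong pulses: response grows
```

The same stimulus train produces opposite trends depending only on amplitude, which mirrors the two responses the researchers report.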

The two behaviors are controlled by quantum interactions that take place between electrons. These interactions can’t be described by classical physics, and they play a role in forming the basis for a phase transition in the material.

“An example of a phase transition is a liquid becoming a solid,” Rodolakis says. “The material we’re looking at is right on the border, and the competing interactions that are going on at the electronic level can easily be tipped one way or another by small stimuli.”

According to Ramanathan, it is essential to have a system that can be completely controlled by electrical signals.

“Being able to manipulate materials in this fashion will allow hardware to take on some of the responsibility for intelligence,” he says. “Using quantum properties to get intelligence into hardware represents a key step towards energy-efficient computing.”

Overcoming the Stability-Plasticity Dilemma

Scientists can use the difference between habituation and sensitization to overcome the stability-plasticity dilemma, a major challenge in the development of AI. Algorithms often struggle to adapt to new information, and when they do, they tend to forget some of what they previously learned. If scientists create a material that can habituate, they can teach it to ignore or forget unnecessary information, gaining stability. Sensitization, on the other hand, could train the system to remember and incorporate new information, enabling plasticity.

“AI often has a hard time learning and storing new information without overwriting information that has already been stored,” Rodolakis says. “Too much stability prevents AI from learning, but too much plasticity can lead to catastrophic forgetting.”
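The trade-off Rodolakis describes shows up even in the simplest online learner, where a single learning rate controls both stability and plasticity. This sketch is a generic illustration of the dilemma, not anything from the study; the values and the exponential-moving-average "learner" are assumptions for demonstration.

```python
# Minimal illustration of the stability-plasticity trade-off: an
# exponential-moving-average learner with one learning rate.

def learn(stream, lr):
    """Track a stream of values; return the history of estimates."""
    estimate = 0.0
    history = []
    for x in stream:
        estimate += lr * (x - estimate)  # higher lr = more plastic
        history.append(estimate)
    return history

old_task = [1.0] * 20   # first, plenty of data with value 1.0
new_task = [5.0] * 3    # then a brief burst of new data at 5.0

stable = learn(old_task + new_task, lr=0.05)  # too stable: barely adapts
plastic = learn(old_task + new_task, lr=0.9)  # too plastic: old value wiped out
```

With a low learning rate the estimate barely moves toward the new data; with a high one it snaps to the new value and the old one is effectively overwritten, i.e. catastrophic forgetting. A material offering both habituation and sensitization would let hardware pick the regime per stimulus rather than fixing one rate globally.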

According to the team, one of the big advantages of the new study was the small size of the nickel oxide device.

“This type of learning had previously not been done in the current generation of electronics without a large number of transistors,” Rodolakis explains. “The single junction system is the smallest system to date to show these properties, which has big implications for the possible development of neuromorphic circuitry.”

Alex McFarland is a Brazil-based writer who covers the latest developments in artificial intelligence & blockchain. He has worked with top AI companies and publications across the globe.