The first physical system to learn nonlinear tasks without a traditional computer processor

Sam Dillavou, a postdoctoral fellow in the Durian Research Group in the School of Arts and Sciences, built the components of this contrastive local learning network, an analog system that is fast, low-power, scalable, and capable of learning nonlinear tasks. Credit: Erica Moser

Scientists encounter many trade-offs trying to build and scale brain-like systems that can perform machine learning. For example, artificial neural networks are capable of learning complex language and vision tasks, but the process of training computers to perform these tasks is slow and requires a lot of power.

Training machines to learn digitally but perform tasks in analog, meaning the input varies with a physical quantity such as voltage, can reduce time and power, but small errors can compound rapidly.

An electrical network that physics and engineering researchers from the University of Pennsylvania previously designed is more scalable because errors do not compound as the system size increases, but it is very limited in that it can only learn linear tasks, with a simple relationship between input and output.

Now, researchers have created an analog system that is fast, low-power, scalable, and capable of learning more complex tasks, including exclusive-or (XOR) relationships and nonlinear regression. It is called a contrastive local learning network; its components evolve on their own, based on local rules, without knowledge of the larger structure.
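To see why XOR counts as a nonlinear task, here is a brief illustration in Python (ordinary software, not the authors' analog hardware): no linear map, however its weights are fit, can reproduce the XOR truth table, while a single hidden nonlinearity makes it representable exactly. The weights below are hand-picked for illustration.

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)  # XOR truth table

# Best linear least-squares fit (with a bias term) still misses every point.
A = np.hstack([X, np.ones((4, 1))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)
print("linear fit:", A @ w)  # ~[0.5, 0.5, 0.5, 0.5] -- no better than chance

# One hidden nonlinearity (here ReLU) is enough to represent XOR exactly.
hidden = np.maximum(0, X @ np.array([[1, 1], [1, 1]]) + np.array([0, -1]))
print("nonlinear :", hidden @ np.array([1, -2]))  # [0, 1, 1, 0]
```

A purely linear electrical network faces the same limitation, which is why the team's earlier design could not learn XOR.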

Physics professor Douglas J. Durian compares it to how neurons in the human brain don’t know what other neurons are doing, and yet learning occurs.

“It can learn, in a machine learning sense, to perform useful tasks, similar to a computational neural network, but it’s a physical object,” says physicist Sam Dillavou, a postdoc in the Durian Research Group and first author of a paper about the system published in Proceedings of the National Academy of Sciences.

“One of the things that we’re really excited about is that, because it has no knowledge of the structure of the network, it’s very fault tolerant, it’s very robust to being done in many different ways, and we think this creates a lot of opportunities to grow these things,” says engineering professor Marc Z. Miskin.

“I think it’s an ideal model system that we can study to understand all kinds of problems, including biological problems,” says physics professor Andrea J. Liu. She also says it could be useful in interfacing with devices that collect data requiring processing, such as cameras and microphones.

In the paper, the authors say that their self-learning system “offers a unique opportunity to study emergent learning. Compared to biological systems, including the brain, our system relies on simpler, well-understood dynamics, is precisely trainable and uses simple modular components.”

This research builds on the coupled learning framework that Liu and postdoc Menachem (Nachi) Stern created, publishing their findings in 2021. In this paradigm, a physical system that is not designed to accomplish any particular task adapts to applied inputs in order to learn the task, using local learning rules and no centralized processor. A toy simulation of the idea is sketched below.
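This sketch comes with a hedge: it simulates an idealized resistor network with adjustable conductances rather than the transistor-based hardware in the new paper, and the network layout, learning rates, and target task are illustrative assumptions. The core idea carries over: each edge compares the voltage drop across itself in a "free" state with the drop in a "clamped" state in which the output is nudged toward its target, then adjusts its own conductance using only that local information.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy network: 6 nodes (0, 1 inputs; 2, 3 hidden; 4 output; 5 ground),
# edges (i, j) each carrying an adjustable conductance k.
edges = [(0, 2), (0, 3), (1, 2), (1, 3), (2, 4), (3, 4), (2, 5), (3, 5)]
k = rng.uniform(0.5, 1.5, len(edges))

output, ground = 4, 5
eta, alpha = 0.1, 0.05  # output-nudge amplitude and learning rate

def solve(k, clamped):
    """Equilibrium node voltages (Kirchhoff's laws) with some nodes clamped."""
    n = 6
    L = np.zeros((n, n))  # weighted graph Laplacian
    for (i, j), kk in zip(edges, k):
        L[i, i] += kk
        L[j, j] += kk
        L[i, j] -= kk
        L[j, i] -= kk
    fixed = sorted(clamped)
    free = [v for v in range(n) if v not in clamped]
    V = np.zeros(n)
    V[fixed] = [clamped[v] for v in fixed]
    V[free] = np.linalg.solve(L[np.ix_(free, free)],
                              -L[np.ix_(free, fixed)] @ V[fixed])
    return V

# Train the output node toward 0.3*V0 + 0.5*V1 (a linear task, since this
# toy network is linear; the paper's transistor network is not).
for step in range(500):
    v0, v1 = rng.uniform(0.0, 1.0, 2)
    target = 0.3 * v0 + 0.5 * v1
    free_V = solve(k, {0: v0, 1: v1, ground: 0.0})                  # free state
    nudge = free_V[output] + eta * (target - free_V[output])        # nudge output
    clamp_V = solve(k, {0: v0, 1: v1, ground: 0.0, output: nudge})  # clamped state
    for e, (i, j) in enumerate(edges):
        dF = free_V[i] - free_V[j]    # voltage drop, free state
        dC = clamp_V[i] - clamp_V[j]  # voltage drop, clamped state
        # Local, pairwise contrastive rule: each edge uses only its own drops.
        k[e] = max(k[e] + (alpha / eta) * (dF**2 - dC**2), 1e-3)

print("trained output:", solve(k, {0: 1.0, 1: 1.0, ground: 0.0})[output],
      "target:", 0.8)
```

Because each update depends only on quantities measured across a single edge, no component needs a global view of the network, which is the property the physical system exploits.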

Dillavou says he came to Penn specifically for this project, and he worked on translating the framework from simulation to the actual physical system, which is built from standard circuit components.

“One of the craziest parts about this is that the thing is really teaching itself; we’re just setting it up to go,” Dillavou says. The researchers feed in only voltages as input, and the transistors that connect the nodes update their properties based on the pairwise, local learning rule.

“Because the way it calculates and learns is based on physics, it’s much more interpretable,” says Miskin. “You can actually understand what it’s trying to do because you have a good handle on the underlying mechanism. This is kind of unique, because a lot of other learning systems are black boxes where it’s much harder to know why the network did what it did.”

Durian says he hopes this “is the beginning of a big field,” noting that another postdoc in his lab, Lauren Altman, is building mechanical versions of contrastive local learning networks.

The researchers are currently working on scaling the design, and Liu says there are many questions about memory retention time, the effects of noise, the best architecture for the network, and whether there are better forms of nonlinearity.

“It’s not really clear what changes as we scale up a learning system,” says Miskin.

“If you think about a brain, there’s a huge gap between a worm with 300 neurons and a human being, and it’s not clear where these capabilities emerge or how things change as you grow. Having a physical system that you can make bigger and bigger and bigger is an opportunity to actually study it.”

More information:
Sam Dillavou et al, Machine learning without a processor: Emergent learning in a nonlinear analog network, Proceedings of the National Academy of Sciences (2024). DOI: 10.1073/pnas.2319718121

Provided by University of Pennsylvania

Citation: First physical system to learn nonlinear tasks without a traditional computer processor (2024, July 8). Retrieved July 8, 2024 from https://techxplore.com/news/2024-07-physical-nonlinear-tasks-traditional-processor.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
