Bioinspired Neural Network Model Can Store Significantly More Memories

The researchers found that a network incorporating both pairwise and set-wise connections performed best and retained the highest number of memories.

Researchers have developed a new model, inspired by recent biological discoveries, that shows enhanced memory performance. This was achieved by modifying a classical neural network.

Computer models play a crucial role in investigating the brain’s process of creating and retaining memories and other intricate information. Constructing such models, however, is a delicate task. The intricate interplay of electrical and biochemical signals, as well as the web of connections between neurons and other cell types, creates the infrastructure for memories to be formed. Even so, encoding the brain’s complex biology into a computer model for further study has proven difficult, because that biology is still only partially understood.

Researchers at the Okinawa Institute of Science and Technology (OIST) have improved a widely used computer model of memory, known as a Hopfield network, by incorporating insights from biology. The alteration has resulted in a network that not only better mirrors the way neurons and other cells are connected in the brain, but can also store significantly more memories.

The complexity added to the network is what makes it more realistic, says Thomas Burns, a Ph.D. student in the group of Professor Tomoki Fukai, who heads OIST’s Neural Coding and Brain Computing Unit.

“Why would biology have all this complexity? Memory capacity might be a reason,” Mr. Burns says.

Diagrams of Connections in Hopfield Networks

In the classical Hopfield network (left), every neuron (i, j, k, l) is connected to the others in a pairwise manner. In the modified network made by Mr. Burns and Professor Fukai, sets of three or more neurons can connect simultaneously. Credit: Thomas Burns (OIST)

Hopfield networks store memories as patterns of weighted connections between different neurons in the system. The network is “trained” to encode these patterns, after which researchers can test its memory of them by presenting a series of blurry or incomplete patterns and seeing whether the network recognizes them as ones it already knows. In classical Hopfield networks, however, each neuron in the model connects reciprocally to the other neurons, forming a series of what are called “pairwise” connections.
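To make that scheme concrete, here is a minimal sketch of a classical pairwise Hopfield network in Python with NumPy. It illustrates the general technique the article describes, not the authors’ code; the function names, pattern sizes, and the amount of corruption are invented for the example.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian learning: build a symmetric weight matrix from +/-1 patterns."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)          # strengthen connections between co-active neurons
    np.fill_diagonal(W, 0)           # no self-connections
    return W / patterns.shape[0]

def recall(W, state, steps=10):
    """Asynchronously update neurons until the network settles on a stored pattern."""
    state = state.copy()
    for _ in range(steps):
        for i in np.random.permutation(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Store two 16-neuron patterns, then recall one from a corrupted cue.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(2, 16))
W = train_hopfield(patterns)
cue = patterns[0].copy()
cue[:4] *= -1                        # flip a few bits to "blur" the memory
print(np.array_equal(recall(W, cue), patterns[0]))  # usually True for light corruption
```

Recall works because each stored pattern sits at a low-energy state of the weight matrix: updating one neuron at a time can only lower the energy, so a lightly corrupted cue tends to slide back into the nearest stored pattern.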

Pairwise connections represent how two neurons connect at a synapse, a connection point between two neurons in the brain. But in reality, neurons have intricate branched structures called dendrites that provide multiple points for connection, so the brain relies on a much more complex arrangement of synapses to get its cognitive jobs done. Additionally, connections between neurons are modulated by other cell types called astrocytes.

“It’s simply not realistic that only pairwise connections between neurons exist in the brain,” explains Mr. Burns. He created a modified Hopfield network in which not just pairs of neurons but sets of three, four, or more neurons could link up too, such as might occur in the brain through astrocytes and dendritic trees.

Although the new network allowed these so-called “set-wise” connections, overall it contained the same total number of connections as before. The researchers found that a network containing a mix of both pairwise and set-wise connections performed best and retained the highest number of memories. They estimate it works more than twice as well as a traditional Hopfield network. “It turns out you actually need a combination of features in some balance,” says Mr. Burns. “You should have individual synapses, but you should also have some dendritic trees and some astrocytes.”
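Below is a hedged sketch of how set-wise connections might enter the same recall rule, using Hebbian-style products over 3-neuron sets. This is a toy illustration, not the paper’s construction: the published model keeps the total connection count fixed while swapping some pairwise connections for set-wise ones, whereas this sketch simply adds a sampled collection of triples on top of the pairwise weights. The names (`train_mixed`, `recall_mixed`) and the triple learning rule are assumptions for illustration.

```python
import numpy as np
from itertools import combinations

def train_mixed(patterns, triples):
    """Hebbian-style learning over pairwise and 3-neuron (set-wise) connections."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    # One weight per chosen 3-neuron set: the triple product of activities.
    T = {t: sum(p[t[0]] * p[t[1]] * p[t[2]] for p in patterns) for t in triples}
    return W, T

def recall_mixed(W, T, state, steps=20):
    """Each neuron's input sums its pairwise field and every triple it belongs to."""
    state = state.copy()
    for _ in range(steps):
        for i in np.random.permutation(len(state)):
            h = W[i] @ state
            for (a, b, c), w in T.items():
                if i in (a, b, c):
                    j, k = [x for x in (a, b, c) if x != i]
                    h += w * state[j] * state[k]
            state[i] = 1 if h >= 0 else -1
    return state

# Spend part of the "connection budget" on a random sample of 3-neuron sets.
rng = np.random.default_rng(1)
n = 16
patterns = rng.choice([-1, 1], size=(3, n))
all_triples = list(combinations(range(n), 3))
picks = rng.choice(len(all_triples), size=40, replace=False)
triples = [all_triples[i] for i in picks]
W, T = train_mixed(patterns, triples)
cue = patterns[0].copy()
cue[:4] *= -1
print(np.array_equal(recall_mixed(W, T, cue), patterns[0]))  # usually recovers the memory
```

The point of the toy is the shape of the update: a neuron’s input now includes products of the states of its set-wise partners, so a single connection can encode a relationship among three or more neurons at once, which is the capacity-relevant difference from the purely pairwise case.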

Hopfield networks are important for modeling brain processes, but they have other powerful uses too. For example, very similar types of networks called Transformers underlie AI-based language tools such as ChatGPT, so the improvements Mr. Burns and Professor Fukai have identified may also make such tools more robust.

Mr. Burns and his colleagues plan to continue working with their modified Hopfield networks to make them still more powerful. For example, in the brain the strengths of connections between neurons are not normally the same in both directions, so Mr. Burns wonders if this feature of asymmetry might also improve the network’s performance. Additionally, he would like to explore ways of making the network’s memories interact with each other, the way they do in the human brain. “Our memories are multifaceted and vast,” says Mr. Burns. “We still have a lot to uncover.”

Reference: “Simplicial Hopfield networks” by Thomas F Burns and Tomoki Fukai, 1 February 2023, International Conference on Learning Representations.