
Scientists Uncover Biological Echoes in Powerful AI Transformer Models – Decrypt


Artificial neural networks, one of the key concepts in AI research, have drawn inspiration from biological neurons since their inception, as their name suggests. New research now reveals that the influential transformer architecture also shares unexpected parallels with human neurobiology.

In a collaborative study, scientists propose that biological astrocyte-neuron networks could perform the core computations of transformers, and vice versa. The findings, jointly reported by researchers from MIT, the MIT-IBM Watson AI Lab, and Harvard Medical School, were published this week in the journal Proceedings of the National Academy of Sciences.

Astrocyte-neuron networks are brain circuits made of two cell types: neurons, which send and receive the electrical impulses that underlie thinking, and astrocytes, which support and regulate them. The two cell types communicate through chemical, electrical, and physical contact.

AI transformers, first introduced in 2017, are one of the foundational technologies behind generative systems like ChatGPT; in fact, that's what the "T" in GPT stands for. Unlike recurrent neural networks, which process inputs one at a time, transformers can directly attend to all inputs at once via a mechanism called self-attention. This lets them learn complex dependencies in data such as text.
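
To make that concrete, here is a minimal sketch of scaled dot-product self-attention in plain NumPy. It illustrates the general mechanism only, not the researchers' code; the variable names and dimensions are arbitrary.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of token vectors X."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv            # project tokens into queries, keys, values
    scores = Q @ K.T / np.sqrt(K.shape[-1])     # every token scores every other token
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: attention weights per token
    return weights @ V                          # each output mixes the whole input at once

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))                     # 5 tokens, 8-dimensional embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)      # (5, 8): no sequential processing needed
```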

The researchers focused on tripartite synapses: junctions where an astrocyte wraps around the connection between a signal-sending (presynaptic) neuron and a signal-receiving (postsynaptic) neuron.

Using mathematical modeling, they demonstrated how astrocytes' integration of signals over time could provide the spatial and temporal memory that self-attention requires. Their models also show that a biological transformer could be built using calcium signaling between astrocytes and neurons. TL;DR: this study explains how to build an organic transformer.
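
The paper's full model is beyond a news piece, but the gist of "integration over time as attention memory" can be caricatured in a few lines. The sketch below is a hypothetical toy, assuming an astrocyte-like unit acts as a slow, leaky integrator that accumulates associations between past signals; none of the names come from the study itself.

```python
import numpy as np

def astrocyte_memory(keys, values, leak=0.95):
    """Toy model: a slowly decaying trace that accumulates key-value associations,
    standing in for an astrocyte integrating synaptic signals over time."""
    M = np.zeros((keys.shape[1], values.shape[1]))
    for k, v in zip(keys, values):
        M = leak * M + np.outer(k, v)   # fold each presynaptic event into the trace
    return M

def attend(query, M):
    return query @ M                    # read stored associations back with one query

rng = np.random.default_rng(1)
keys = rng.normal(size=(10, 4))         # 10 past signal events, 4-dimensional
values = rng.normal(size=(10, 4))
M = astrocyte_memory(keys, values)
print(attend(rng.normal(size=4), M))    # the accumulated history answers a single query
```

Readers familiar with ML may recognize this as the "fast weights" or linear-attention view of transformers; it is offered here only as a loose intuition for the kind of correspondence the study formalizes.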

“Having remained electrically silent for over a century of brain recordings, astrocytes are one of the most abundant, yet less explored, cells in the brain,” Konstantinos Michmizos, associate professor of computer science at Rutgers University, told MIT. “The potential of unleashing the computational power of the other half of our brain is enormous.”

A high-level overview of the proposed neuron-astrocyte network.

The hypothesis leverages emerging evidence that astrocytes play active roles in information processing, contrary to the purely housekeeping role they were previously assumed to fill. It also outlines a biological basis for transformers, which surpass traditional neural networks at tasks like generating coherent text.

The proposed biological transformers could provide new insights into human cognition if validated experimentally. However, significant gaps remain between human brains and data-hungry transformer models: while transformers require massive training datasets, human brains turn experience into language organically, on a modest energy budget.

Although links between neuroscience and artificial intelligence offer insight, comprehending the sheer complexity of our minds remains an immense challenge. Biological parallels are but one piece of the puzzle; unlocking the intricacies of human intelligence will take sustained effort across disciplines. How neural biology accomplishes its near magic remains one of science's deepest mysteries.
