The success of artificial neural networks (ANNs) stems from their ability to mimic simplified brain structures. Neuroscience reveals that neurons interact through several connectivity patterns, known as circuit motifs, which are crucial for processing information. However, most ANNs model only one or two of these motifs, which limits their performance across tasks. Early ANNs, such as multilayer perceptrons, organized neurons into layers with weighted connections loosely resembling synapses. More recent architectures continue to draw inspiration from biological nervous systems, yet they still lack the complex connectivity found in the brain, such as local density and global sparsity. Incorporating this knowledge could improve the design and efficiency of ANNs.
Researchers at Microsoft Research Asia introduced CircuitNet, a neural network inspired by neural circuit architectures. CircuitNet’s core unit, the Circuit Motif Unit (CMU), consists of densely connected neurons capable of modeling diverse circuit motifs. Unlike traditional feedforward networks, CircuitNet incorporates lateral and feedback connections, following the locally dense and globally sparse structure of the brain. Experiments show that CircuitNet, with fewer parameters, outperforms popular neural networks in function approximation, image classification, reinforcement learning, and time series forecasting. This work highlights the benefits of incorporating neuroscience principles into the design of deep learning models.
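To make the "locally dense, globally sparse" idea concrete, here is a minimal PyTorch sketch of a connectivity mask in which neurons inside a unit are fully connected while cross-unit links are rare. The function name, the `inter_prob` parameter, and the random cross-unit wiring are illustrative assumptions, not the paper's actual construction (the paper routes inter-unit signals through dedicated input/output ports):

```python
import torch

def build_connectivity_mask(num_units: int, unit_size: int,
                            inter_prob: float = 0.05) -> torch.Tensor:
    """Locally dense, globally sparse wiring: every neuron pair inside a
    unit is connected; neurons in different units connect only with a
    small probability (a crude stand-in for the paper's I/O ports)."""
    n = num_units * unit_size
    mask = (torch.rand(n, n) < inter_prob).float()    # sparse cross-unit links
    for u in range(num_units):
        s = u * unit_size
        mask[s:s + unit_size, s:s + unit_size] = 1.0  # dense intra-unit block
    return mask

# Example: 4 units of 8 neurons each -> a 32x32 binary connectivity mask
mask = build_connectivity_mask(num_units=4, unit_size=8)
print(mask.mean().item())  # fraction of connections that are active
```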
Previous neural network designs have often mimicked biological neural structures. Early models such as single- and multi-layer perceptrons were inspired by simplified neural signaling. CNNs and RNNs were modeled on the brain's visual and sequential processing, respectively. Other innovations, such as spiking neural networks and capsule networks, also reflect biological processes. Key deep learning techniques, including attention, dropout, and normalization, parallel neural functions such as selective attention and patterns of neural activation. These approaches have achieved significant success, but unlike the proposed CircuitNet, they cannot generically model complex combinations of circuit motifs.
The Circuit Neural Network (CircuitNet) models signal transmission among neurons within circuit motif units (CMUs) to support diverse circuit motifs, including feedback, mutual, feed-forward, and lateral connections. Signal interactions are modeled using linear transformations, neural attention, and products of neuron pairs, enabling CircuitNet to capture complex neural patterns. Neurons are organized into locally dense, globally sparse CMUs interconnected via input/output ports, which facilitate signal transmission within and between units. CircuitNet functions as a general neural network architecture and is suited to a variety of tasks, including reinforcement learning, image classification, and time series forecasting.
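Below is a hedged sketch of how a single CMU might combine the three interaction types the paper names: a linear transformation, an attention term, and neuron-pair products. The class name, the scalar-state attention, the `tanh` nonlinearity, and the simple summation of the three terms are assumptions made for illustration, not the authors' exact formulation:

```python
import torch
import torch.nn as nn

class CMUSketch(nn.Module):
    """Illustrative Circuit Motif Unit: sums a linear term, an
    attention-weighted mixing term, and pairwise neuron products."""
    def __init__(self, n_neurons: int):
        super().__init__()
        self.linear = nn.Linear(n_neurons, n_neurons)  # linear transmission
        self.q = nn.Linear(n_neurons, n_neurons)       # attention queries
        self.k = nn.Linear(n_neurons, n_neurons)       # attention keys
        # learnable weights for products of neuron pairs
        self.pair = nn.Parameter(torch.zeros(n_neurons, n_neurons))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_neurons) -- the unit's neuron activations
        lin = self.linear(x)
        # scores[b, i, j]: compatibility of neuron i's query with neuron j's key
        scores = torch.softmax(
            self.q(x).unsqueeze(2) * self.k(x).unsqueeze(1), dim=-1)
        attn = (scores * x.unsqueeze(1)).sum(-1)  # attention-weighted mixing
        prod = x * (x @ self.pair)                # weighted neuron-pair products
        return torch.tanh(lin + attn + prod)

# Example usage: one CMU of 16 neurons on a batch of 4 inputs
cmu = CMUSketch(n_neurons=16)
out = cmu(torch.randn(4, 16))  # shape (4, 16)
```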
The paper presents experimental results and analysis of CircuitNet on various tasks, comparing it to baseline models. While the primary goal was not to outperform state-of-the-art models, comparisons are provided for context. The results show that CircuitNet achieves superior function approximation, faster convergence, and stronger performance on deep reinforcement learning, image classification, and time series forecasting tasks. In particular, CircuitNet outperforms traditional MLPs and matches or exceeds state-of-the-art models such as ResNet, ViT, and Transformers while using fewer parameters and less computation.
In conclusion, CircuitNet is a neural network architecture inspired by the neural circuits of the brain. Its building blocks are CMUs, groups of densely connected neurons capable of modeling diverse circuit motifs, and its overall structure reflects the brain's locally dense, globally sparse connectivity. Experimental results show that CircuitNet outperforms traditional neural networks such as MLPs, CNNs, RNNs, and Transformers across tasks including function approximation, reinforcement learning, image classification, and time series forecasting. Future work will focus on refining the architecture and enhancing its capabilities with advanced techniques.
Sana Hassan, a Consulting Intern at Marktechpost and a dual degree student at IIT Madras, is passionate about applying technology and ai to address real-world challenges. With a keen interest in solving practical problems, she brings a fresh perspective to the intersection of ai and real-life solutions.