Gradient-Based Neuroplastic Adaptation for Concurrent Optimization of Neuro-Fuzzy Networks
- URL: http://arxiv.org/abs/2506.21771v1
- Date: Thu, 26 Jun 2025 21:08:11 GMT
- Title: Gradient-Based Neuroplastic Adaptation for Concurrent Optimization of Neuro-Fuzzy Networks
- Authors: John Wesley Hostetter, Min Chi
- Abstract summary: Neuro-fuzzy networks (NFNs) are transparent, symbolic universal function approximators that perform as well as conventional neural architectures. We propose an application-independent approach called gradient-based neuroplastic adaptation for the concurrent optimization of NFNs' parameters and structure.
- Score: 7.811459544911894
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neuro-fuzzy networks (NFNs) are transparent, symbolic, and universal function approximators that perform as well as conventional neural architectures, but their knowledge is expressed as linguistic IF-THEN rules. Despite these advantages, their systematic design process remains a challenge. Existing work often builds NFNs sequentially, inefficiently isolating parametric and structural identification and prematurely committing to a brittle and subpar architecture. We propose a novel application-independent approach called gradient-based neuroplastic adaptation for the concurrent optimization of NFNs' parameters and structure. By recognizing that NFNs' parameters and structure should be optimized simultaneously, as they are deeply conjoined, settings previously unapproachable for NFNs become accessible, such as the online reinforcement learning of NFNs for vision-based tasks. The effectiveness of concurrently optimizing NFNs is shown empirically by training one with online reinforcement learning to proficiently play challenging scenarios from DOOM, a vision-based video game.
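To make the paper's central idea concrete, the sketch below shows one way concurrent optimization of parameters and structure can look in code: a first-order TSK-style fuzzy layer whose rule premises are gated by a sigmoid-relaxed structure mask, so the mask logits (structure) receive gradients from the same loss as the membership centers, widths, and consequents (parameters). This is a minimal illustration under our own assumptions, written in PyTorch; it is not the authors' implementation, and all names and hyperparameters are illustrative.

```python
# Minimal sketch (not the authors' code): concurrent gradient-based
# optimization of a neuro-fuzzy network's parameters AND structure.
# All names and hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn

class NeuroFuzzyLayer(nn.Module):
    def __init__(self, in_dim: int, n_rules: int, out_dim: int):
        super().__init__()
        # Premise parameters: Gaussian membership centers and widths.
        self.centers = nn.Parameter(torch.randn(n_rules, in_dim))
        self.log_sigma = nn.Parameter(torch.zeros(n_rules, in_dim))
        # Structure: relaxed (sigmoid) mask deciding which input terms
        # each rule actually uses; trained jointly with the parameters.
        self.mask_logits = nn.Parameter(torch.zeros(n_rules, in_dim))
        # Consequents: one linear model per rule (first-order TSK style).
        self.consequents = nn.Linear(in_dim, out_dim * n_rules)
        self.n_rules, self.out_dim = n_rules, out_dim

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Membership degrees mu[b, r, i] of each input to each fuzzy set.
        diff = x.unsqueeze(1) - self.centers              # (B, R, I)
        mu = torch.exp(-0.5 * (diff / self.log_sigma.exp()) ** 2)
        # Soft structural gating: a masked-out term contributes 1,
        # the neutral element under the product t-norm, so the rule
        # effectively ignores it.
        gate = torch.sigmoid(self.mask_logits)            # (R, I)
        firing = (gate * mu + (1.0 - gate)).prod(dim=-1)  # (B, R)
        w = firing / (firing.sum(dim=-1, keepdim=True) + 1e-9)
        y = self.consequents(x).view(-1, self.n_rules, self.out_dim)
        return (w.unsqueeze(-1) * y).sum(dim=1)           # (B, out_dim)

# One optimizer updates premises, consequents, and structure together.
model = NeuroFuzzyLayer(in_dim=4, n_rules=8, out_dim=2)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
x, target = torch.randn(32, 4), torch.randn(32, 2)
loss = nn.functional.mse_loss(model(x), target)
opt.zero_grad()
loss.backward()
opt.step()
```

Because a masked-out premise term contributes the neutral element 1 under the product t-norm, turning a gate off simply removes that term from the rule; this is what lets structural and parametric identification proceed in a single gradient loop rather than in isolated stages.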
Related papers
- Compositional Function Networks: A High-Performance Alternative to Deep Neural Networks with Built-in Interpretability [3.8126669848415666]
We introduce Compositional Function Networks (CFNs), a novel framework that builds inherently interpretable models. CFNs support diverse compositional patterns, enabling complex feature interactions while maintaining transparency. We demonstrate CFNs' versatility across multiple domains, from symbolic regression to image classification with deep hierarchical networks.
arXiv Detail & Related papers (2025-07-28T17:18:40Z) - Equivariant Polynomial Functional Networks [2.3963215252605172]
A key design principle of Neural Functional Networks (NFNs) is their adherence to the permutation and scaling symmetries inherent in the connectionist structure of the input neural networks.
Recent NFNs have been proposed with permutation and scaling equivariance based on either graph-based message-passing mechanisms or parameter-sharing mechanisms.
We propose a novel solution, MAG-NFN (Monomial mAtrix Group Equivariant Polynomial NFN).
arXiv Detail & Related papers (2024-10-05T16:10:19Z) - Advancing Spatio-Temporal Processing in Spiking Neural Networks through Adaptation [6.233189707488025]
Spiking neural networks on neuromorphic hardware promise orders of magnitude less power consumption than their non-spiking counterparts. The standard neuron model for spike-based computation on such systems has long been the leaky integrate-and-fire (LIF) neuron; augmenting it with adaptation mechanisms yields so-called adaptive LIF neurons, whose behavior is not yet well understood.
arXiv Detail & Related papers (2024-08-14T12:49:58Z) - Neural Parameter Regression for Explicit Representations of PDE Solution Operators [22.355460388065964]
We introduce Neural Parameter Regression (NPR), a novel framework specifically developed for learning solution operators of partial differential equations (PDEs).
NPR employs Physics-Informed Neural Network (PINN; Raissi et al., 2021) techniques to regress neural network (NN) parameters.
The framework shows remarkable adaptability to new initial and boundary conditions, allowing for rapid fine-tuning and inference.
arXiv Detail & Related papers (2024-03-19T14:30:56Z) - N-Adaptive Ritz Method: A Neural Network Enriched Partition of Unity for Boundary Value Problems [1.2200609701777907]
This work introduces a novel neural network-enriched Partition of Unity (NN-PU) approach for solving boundary value problems via artificial neural networks.
The NN enrichment is constructed by combining pre-trained feature-encoded NN blocks with an untrained NN block.
The proposed method offers accurate solutions while notably reducing the computational cost compared to conventional adaptive refinement in mesh-based methods.
arXiv Detail & Related papers (2024-01-16T18:11:14Z) - Efficient and Flexible Neural Network Training through Layer-wise Feedback Propagation [49.44309457870649]
Layer-wise Feedback Propagation (LFP) is a novel training principle for neural network-like predictors. LFP decomposes a reward among individual neurons based on their respective contributions. The method then implements a greedy approach, reinforcing helpful parts of the network and weakening harmful ones.
arXiv Detail & Related papers (2023-08-23T10:48:28Z) - Learning to Control Rapidly Changing Synaptic Connections: An Alternative Type of Memory in Sequence Processing Artificial Neural Networks [9.605853974038936]
Short-term memory in standard sequence-processing recurrent NNs (RNNs) is stored as neuron activations; generalising feedforward NNs to such RNNs is mathematically straightforward, natural, and even historical.
A lesser known alternative approach to storing short-term memory in "synaptic connections" yields another "natural" type of short-term memory in sequence processing NNs.
Fast Weight Programmers (FWPs) have seen a recent revival as generic sequence processors, achieving competitive performance across various tasks (a minimal sketch of the generic fast-weight update appears after this list).
arXiv Detail & Related papers (2022-11-17T10:03:54Z) - Comparative Analysis of Interval Reachability for Robust Implicit and Feedforward Neural Networks [64.23331120621118]
We use interval reachability analysis to obtain robustness guarantees for implicit neural networks (INNs).
INNs are a class of implicit learning models that use implicit equations as layers.
We show that our approach performs at least as well as, and generally better than, applying state-of-the-art interval bound propagation methods to INNs.
arXiv Detail & Related papers (2022-04-01T03:31:27Z) - Learning to Solve the AC-OPF using Sensitivity-Informed Deep Neural Networks [52.32646357164739]
We propose a sensitivity-informed deep neural network (SIDNN) to solve the AC optimal power flow (AC-OPF) problem.
The proposed SIDNN is compatible with a broad range of OPF schemes.
It can be seamlessly integrated into other learning-to-OPF schemes.
arXiv Detail & Related papers (2021-03-27T00:45:23Z) - Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (Res-Net) architectures.
arXiv Detail & Related papers (2020-07-03T01:37:16Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) owing to their low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z) - Iterative Network for Image Super-Resolution [69.07361550998318]
Single image super-resolution (SISR) has been greatly revitalized by the recent development of convolutional neural networks (CNNs).
This paper provides a new insight into conventional SISR algorithms and proposes a substantially different approach relying on iterative optimization.
A novel iterative super-resolution network (ISRN) is proposed on top of the iterative optimization.
arXiv Detail & Related papers (2020-05-20T11:11:47Z)
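Of the entries above, the fast weight programmer (FWP) mechanism is compact enough to sketch exactly. Below is a minimal NumPy illustration of the generic delta-rule fast-weight update (in the style of Schlag et al., 2021): a slow network emits a key, value, query, and write strength at each step, and short-term memory lives in the rapidly rewritten fast weight matrix. The shapes, names, and normalisation are our own illustrative assumptions, not the exact formulation of the paper listed above.

```python
# Minimal sketch of a generic fast weight programmer (FWP) step.
# Short-term memory lives in the fast weight matrix W_fast, which a
# slow network rewrites at every time step. Names/shapes are illustrative.
import numpy as np

rng = np.random.default_rng(0)
d_key, d_val, d_in = 8, 8, 16

# "Slow" parameters (ordinarily trained by gradient descent).
W_k = rng.normal(size=(d_key, d_in))   # produces keys from inputs
W_v = rng.normal(size=(d_val, d_in))   # produces values from inputs
W_q = rng.normal(size=(d_key, d_in))   # produces queries from inputs
W_b = rng.normal(size=(1, d_in))       # produces the write strength

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Fast weights: the rapidly changing "synaptic" short-term memory.
W_fast = np.zeros((d_val, d_key))

def fwp_step(x):
    """One sequence step: write to, then read from, the fast weights."""
    global W_fast
    k = W_k @ x
    k = k / (np.abs(k).sum() + 1e-9)      # normalised key
    v, q = W_v @ x, W_q @ x
    beta = sigmoid(W_b @ x)               # learned write strength in (0, 1)
    # Delta-rule write: read out what is currently stored under key k
    # and interpolate it towards the new value v.
    v_old = W_fast @ k
    W_fast = W_fast + beta * np.outer(v - v_old, k)
    return W_fast @ q                     # read-out for this step

for t in range(5):
    y = fwp_step(rng.normal(size=d_in))
print(y.shape)  # (8,)
```

The delta-rule write first retrieves the value currently stored under the key and interpolates it towards the new one, which avoids the unbounded growth of a purely additive Hebbian update.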