Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/135769
Type: Journal article
Title: Noise-Boosted Backpropagation Learning of Feedforward Threshold Neural Networks for Function Approximation
Author: Duan, L.
Duan, F.
Chapeau-Blondeau, F.
Abbott, D.
Citation: IEEE Transactions on Instrumentation and Measurement, 2021; 70:1010612-1-1010612-12
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Issue Date: 2021
ISSN: 0018-9456
1557-9662
Statement of Responsibility: Lingling Duan, Fabing Duan, François Chapeau-Blondeau, and Derek Abbott, Fellow, IEEE
Abstract: To make backpropagation training of feedforward threshold neural networks feasible, each hidden layer is designed as a sufficiently large ensemble of hard-limiting activation functions excited simultaneously by the weighted inputs and by mutually independent external noise components. Injecting noise into the nondifferentiable activation functions enables a proper definition of the gradients, and the injected noise level is treated as a network parameter that can be adaptively updated by a stochastic gradient descent learning rule. This noise-boosted backpropagation learning process is found to converge to a nonzero optimized noise level, indicating that the injected noise is beneficial both for the learning phase and for the ensuing retrieval phase. For minimizing the total error energy of function approximation in the designed threshold neural network, the proposed noise-boosted backpropagation method is shown to outperform directly injecting noise into the network inputs or weight coefficients. The Lipschitz continuity of the noise-smoothed activation function in the hidden layer is demonstrated to guarantee local convergence of the learning process. Beyond Gaussian injected noise, the optimal noise type for training the designed threshold neural network is also determined numerically. Test experiments on approximating nonlinear functions and real-world datasets verify the feasibility of this noise-boosted backpropagation algorithm in the threshold neural network. These results not only extend the analysis of beneficial noise effects, akin to stochastic resonance and exploited here, to the universal approximation capabilities of threshold neural networks, but also enable backpropagation training of neural networks with a much wider family of nondifferentiable activation functions.
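The core mechanism described in the abstract can be illustrated numerically: averaging many hard-limiters driven by a common input plus independent Gaussian noise yields a smooth effective activation (the Gaussian CDF of the scaled input), whose derivative is well defined even though each individual threshold unit is nondifferentiable. The sketch below is illustrative only, assuming Gaussian injected noise; the function names and ensemble size are not taken from the paper.

```python
import numpy as np
from math import erf, exp, sqrt, pi

rng = np.random.default_rng(0)

def phi(x):
    """Standard Gaussian CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def noisy_hidden_unit(z, sigma, n_thresholds=2000):
    """One hidden unit: the average of n hard-limiters, each fed the
    common weighted input z plus an independent Gaussian noise sample."""
    noise = sigma * rng.standard_normal(n_thresholds)
    return np.heaviside(z + noise, 0.5).mean()

# As n grows, the ensemble mean converges to the noise-smoothed
# activation Phi(z / sigma); its derivative w.r.t. z is the Gaussian
# pdf evaluated at z / sigma, scaled by 1 / sigma -- a usable gradient
# for backpropagation, with sigma itself updatable by gradient descent.
z, sigma = 0.3, 0.5
empirical = noisy_hidden_unit(z, sigma)                       # Monte Carlo estimate
smoothed = phi(z / sigma)                                     # analytic smoothed activation
grad_z = exp(-0.5 * (z / sigma) ** 2) / (sigma * sqrt(2.0 * pi))
print(empirical, smoothed, grad_z)
```

Note that the smoothed gradient is largest for a moderate noise level sigma relative to z, which is consistent with the paper's finding that learning converges to a nonzero optimized noise level.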
Keywords: Function approximation; noise injection; noise-boosted backpropagation; optimal noise; stochastic resonance; threshold neural network
Rights: © 2021 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See https://www.ieee.org/publications/rights/index.html for more information.
DOI: 10.1109/TIM.2021.3121502
Grant ID: http://purl.org/au-research/grants/arc/DP200103795
Published version: http://dx.doi.org/10.1109/tim.2021.3121502
Appears in Collections:Electrical and Electronic Engineering publications

Files in This Item:
There are no files associated with this item.

