Authors: Allan K. Y. Wong¹, Wilfred W. K. Lin¹ and Tharam S. Dillon²

Affiliations: ¹ Hong Kong Polytechnic University, Hong Kong; ² Faculty of Information Technology, University of Technology, Australia
Keyword(s):
Hessian-based pruning technique, Internet, Taylor series, dynamic, NNC
Related Ontology Subjects/Areas/Topics: Distributed Control Systems; Informatics in Control, Automation and Robotics; Intelligent Control Systems and Optimization; Machine Learning in Control Applications; Neural Networks Based Control Systems
Abstract:
A novel Hessian-based pruning (HBP) technique for optimizing a feed-forward (FF) neural network (NN) configuration dynamically is proposed. It is then applied to optimize the extant NNC (Neural Network Controller) as a verification exercise. The NNC is designed for dynamic buffer tuning to eliminate buffer overflow at the user/server level. The HBP optimization process is itself dynamic and operates as a renewal process over the service life expectancy of the target FF neural network. Every optimization renewal cycle starts from the original NN configuration. Within a cycle, all insignificant NN connections are marked and then skipped in subsequent operation until the next optimization cycle begins. Together, the marking and skipping operations characterize the dynamic, virtual pruning nature of the HBP. The interim optimized NN configuration produced by each HBP cycle differs, as a response to the current system dynamics. The verification results with the NNC indicate that the HBP technique is indeed effective: all the interim optimized/pruned NNC versions consistently yield the same convergence precision as the original NNC, with a shorter control cycle time.
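The abstract does not give the saliency measure, but a Hessian-based, Taylor-series-derived pruning criterion of this kind is conventionally computed as an OBD-style saliency, ½·H_ii·w_i², from the diagonal of the Hessian. A minimal sketch of one "virtual pruning" renewal cycle under that assumption (the function name, the keep fraction, and the use of a boolean mask are illustrative, not taken from the paper):

```python
import numpy as np

def virtual_prune(weights, hessian_diag, keep_fraction=0.8):
    """One HBP-style renewal cycle (sketch): rank each connection by an
    assumed OBD-style saliency 0.5 * H_ii * w_i^2 and mask the least
    significant ones instead of deleting them ("virtual" pruning), so the
    original configuration is recoverable for the next cycle."""
    saliency = 0.5 * hessian_diag * weights ** 2
    k = max(1, int(keep_fraction * weights.size))
    keep = np.argsort(saliency)[-k:]          # indices of the k most significant
    mask = np.zeros(weights.shape, dtype=bool)
    mask[keep] = True
    # Marked (skipped) connections contribute nothing until the next
    # cycle re-evaluates the full, original weight set.
    return weights * mask, mask

# Example: 10 connections with random weights and curvature estimates.
rng = np.random.default_rng(0)
w = rng.normal(size=10)
h = np.abs(rng.normal(size=10))  # diagonal Hessian estimates (assumed positive)
pruned_w, mask = virtual_prune(w, h, keep_fraction=0.7)
print(mask.sum())  # 7 connections remain active in this interim configuration
```

Because only a mask changes between cycles, each interim pruned configuration can differ while the underlying network is never destroyed, matching the renewal behaviour the abstract describes.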