Evolutionary Training of Binary Neural Networks by Particle Swarm Optimization
Hidehiko Okada1
Section: Research Paper, Product Type: Journal-Paper
Vol.10, Issue.5, pp.15-20, Oct-2022
Online published on Oct 31, 2022
Copyright © Hidehiko Okada. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
How to Cite this Paper
IEEE Style Citation: Hidehiko Okada, “Evolutionary Training of Binary Neural Networks by Particle Swarm Optimization,” International Journal of Scientific Research in Computer Science and Engineering, Vol.10, Issue.5, pp.15-20, 2022.
MLA Style Citation: Hidehiko Okada "Evolutionary Training of Binary Neural Networks by Particle Swarm Optimization." International Journal of Scientific Research in Computer Science and Engineering 10.5 (2022): 15-20.
APA Style Citation: Hidehiko Okada, (2022). Evolutionary Training of Binary Neural Networks by Particle Swarm Optimization. International Journal of Scientific Research in Computer Science and Engineering, 10(5), 15-20.
BibTex Style Citation:
@article{Okada_2022,
author = {Hidehiko Okada},
title = {Evolutionary Training of Binary Neural Networks by Particle Swarm Optimization},
journal = {International Journal of Scientific Research in Computer Science and Engineering},
issue_date = {October 2022},
volume = {10},
number = {5},
month = {oct},
year = {2022},
issn = {2347-2693},
pages = {15-20},
url = {https://www.isroset.org/journal/IJSRCSE/full_paper_view.php?paper_id=2951},
publisher = {IJCSE, Indore, INDIA},
}
RIS Style Citation:
TY - JOUR
UR - https://www.isroset.org/journal/IJSRCSE/full_paper_view.php?paper_id=2951
TI - Evolutionary Training of Binary Neural Networks by Particle Swarm Optimization
T2 - International Journal of Scientific Research in Computer Science and Engineering
AU - Hidehiko Okada
PY - 2022
DA - 2022/10/31
PB - IJCSE, Indore, INDIA
SP - 15-20
IS - 5
VL - 10
SN - 2347-2693
ER -
Abstract :
A problem with deep neural networks is that the memory required to store a trained model becomes large. One solution to this problem is to make the parameter values binary. A challenge for binary neural networks is that they cannot be trained by ordinary gradient-based optimization methods. The author previously applied Evolution Strategy (ES), Genetic Algorithm (GA) and Differential Evolution (DE) to the training of binary neural networks and evaluated their performance. In this paper, the author applies Particle Swarm Optimization (PSO), an instance of swarm intelligence algorithms, and compares PSO with ES, GA and DE. The experimental results on a classification task revealed that PSO could optimize binary weights so that the trained model classified both trained and untrained data with accuracies above 90%, but the accuracies were significantly worse than those obtained by the three evolutionary algorithms. To apply PSO to a binary optimization problem, real-valued particle position vectors must be binarized, and PSO's performance is sensitive to the design of this binarization. Improving the binarization is a future research challenge.
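The core idea described above — running standard real-valued PSO and binarizing each particle's position to obtain binary weights — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the sign-based binarization, the perceptron model, the toy dataset, and the PSO hyperparameters (`w`, `c1`, `c2`) are all assumptions chosen for simplicity.

```python
# Minimal sketch: PSO trains binary weights {-1, +1} for a single-layer
# perceptron on a toy linearly separable task. Binarization is assumed
# to be sign-based; the paper's exact scheme may differ.
import random

random.seed(0)

# Toy data: 4 features per sample; label = sign of the feature sum,
# so the binary weight vector (+1, +1, +1, +1) classifies perfectly.
DATA = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(40)]
LABELS = [1 if sum(x) >= 0 else -1 for x in DATA]

def binarize(position):
    # Map a real-valued particle position to binary weights {-1, +1}.
    return [1 if p >= 0 else -1 for p in position]

def accuracy(weights):
    # Fitness: classification accuracy of the binarized perceptron.
    correct = 0
    for x, y in zip(DATA, LABELS):
        s = sum(w * xi for w, xi in zip(weights, x))
        correct += ((1 if s >= 0 else -1) == y)
    return correct / len(DATA)

def pso(dim=4, particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]                      # personal bests
    pbest_fit = [accuracy(binarize(p)) for p in pos]
    g = max(range(particles), key=lambda i: pbest_fit[i])
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]     # global best
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Standard PSO velocity update (inertia + cognitive + social).
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            # Fitness is always evaluated on the *binarized* position.
            fit = accuracy(binarize(pos[i]))
            if fit > pbest_fit[i]:
                pbest[i], pbest_fit[i] = pos[i][:], fit
                if fit > gbest_fit:
                    gbest, gbest_fit = pos[i][:], fit
    return binarize(gbest), gbest_fit

weights, fit = pso()
print("binary weights:", weights, "train accuracy:", fit)
```

Note how the search itself stays entirely real-valued; only the fitness evaluation passes through `binarize`. This is exactly the design point the abstract flags: many distinct real positions collapse onto the same binary vector, which can mislead the swarm dynamics.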
Key-Words / Index Term :
Swarm intelligence algorithm; Particle swarm optimization; Neural network; Network quantization; Neuroevolution.
References :
[1] G. E. Hinton, R. R. Salakhutdinov, “Reducing the Dimensionality of Data with Neural Networks,” Science, Vol.313, Issue.5786, pp.504-507, 2006.
[2] G. E. Hinton, S. Osindero, Y. W. Teh, “A Fast Learning Algorithm for Deep Belief Nets,” Neural Computation, Vol.18, No.7, pp.1527-1554, 2006.
[3] Y. L. Boureau, Y. L. Cun, “Sparse Feature Learning for Deep Belief Networks,” Advances in Neural Information Processing Systems, pp.1185-1192, 2008.
[4] I. Sutskever, G. E. Hinton, “Deep, Narrow Sigmoid Belief Networks are Universal Approximators,” Neural Computation, Vol.20, No.11, pp.2629-2636, 2008.
[5] Y. Bengio, “Learning Deep Architectures for AI,” Foundations and Trends in Machine Learning, Vol.2, No.1, pp.1-127, 2009.
[6] H. Larochelle, Y. Bengio, J. Louradour, P. Lamblin, “Exploring Strategies for Training Deep Neural Networks,” Journal of Machine Learning Research, Vol.10(Jan), pp.1-40, 2009.
[7] X. Glorot, Y. Bengio, “Understanding the Difficulty of Training Deep Feedforward Neural Networks,” Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, Vol.9, pp.249-256, 2010.
[8] P. Vincent, H. Larochelle, I. Lajoie, Y. Bengio, P. A. Manzagol, “Stacked Denoising Autoencoders: Learning Useful Representations in a Deep Network with a Local Denoising Criterion,” Journal of Machine Learning Research, Vol.11(Dec), pp.3371-3408, 2010.
[9] R. Salakhutdinov, G. Hinton, “An Efficient Learning Procedure for Deep Boltzmann Machines,” Neural Computation, Vol.24, No.8, pp.1967-2006, 2012.
[10] A. Krizhevsky, I. Sutskever, G. Hinton, “ImageNet Classification with Deep Convolutional Neural Networks,” Advances in Neural Information Processing Systems, Vol.25, pp.1097-1105, 2012.
[11] A. Graves, A. Mohamed, G. Hinton, “Speech Recognition with Deep Recurrent Neural Networks,” IEEE International Conference on Acoustics, Speech and Signal Processing, pp.6645-6649, 2013.
[12] Y. Bengio, A. Courville, P. Vincent, “Representation Learning: a Review and New Perspectives,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol.35, No.8, pp.1798-1828, 2013.
[13] Y. LeCun, Y. Bengio, G. Hinton, “Deep Learning,” Nature, Vol.521, No.7553, pp.436-444, 2015.
[14] J. Schmidhuber, “Deep Learning in Neural Networks: an Overview,” Neural Networks, Vol.61, pp.85-117, 2015.
[15] S. Zhang, A. E. Choromanska, Y. LeCun, “Deep Learning with Elastic Averaging SGD,” Advances in Neural Information Processing Systems, pp.685-693, 2015.
[16] I. Goodfellow, Y. Bengio, A. Courville, “Deep Learning,” MIT Press, 2016.
[17] M. Courbariaux, Y. Bengio, JP. David, “BinaryConnect: Training Deep Neural Networks with Binary Weights during Propagations,” Proceedings of the 28th International Conference on Neural Information Processing Systems (NIPS’15), pp.3123–3131, 2015.
[18] X. Lin, C. Zhao, W. Pan, “Towards Accurate Binary Convolutional Neural Network,” Proceedings of the 31st International Conference on Neural Information Processing Systems (NIPS’17), pp.344–352, 2017.
[19] H. Qin, R. Gong, X. Liu, X. Bai, J. Song, N. Sebe, “Binary Neural Networks: A Survey,” Pattern Recognition, Vol.105, 107281, 2020.
[20] H. Okada, “Evolutionary Training of Binary Neural Networks by Evolution Strategy,” International Journal of Scientific Research in Computer Science and Engineering, Vol.9, Issue.1, pp.31–35, 2021.
[21] H. Okada, “Evolutionary Training of Binary Neural Networks by Genetic Algorithm,” International Journal of Scientific Research in Computer Science and Engineering, Vol.9, Issue.6, pp.64–69, 2021.
[22] H. Okada, “Evolutionary Training of Binary Neural Networks by Differential Evolution,” International Journal of Scientific Research in Computer Science and Engineering, Vol.10, Issue.1, pp.26–31, 2022.
[23] H.P. Schwefel, “Evolution Strategies: a Family of Non-Linear Optimization Techniques based on Imitating Some Principles of Organic Evolution,” Annals of Operations Research, Vol.1, pp.165–167, 1984.
[24] H.P. Schwefel, “Evolution and Optimum Seeking,” Wiley & Sons, 1995.
[25] H.G. Beyer, H.P. Schwefel, “Evolution Strategies: a Comprehensive Introduction,” Journal Natural Computing, Vol.1, No.1, pp.3–52, 2002.
[26] D. E. Goldberg, J. H. Holland, “Genetic Algorithms and Machine Learning,” Machine Learning, Vol.3, No.2, pp.95–99, 1988.
[27] R. Storn, K. Price, “Differential Evolution – a Simple and Efficient Heuristic for Global Optimization over Continuous Spaces,” Journal of Global Optimization, Vol.11, pp.341–359, 1997.
[28] J. Kennedy, R. Eberhart, “Particle Swarm Optimization,” IEEE International Conference on Neural Networks, Vol.IV, pp.1942–1948, 1995.