ABSTRACT
This paper introduces a novel method for selecting the most significant filters in deep neural networks. We simplify trained deep networks by pruning them with a genetic algorithm (GA). A pure GA is weak at local tuning and converges slowly, so it struggles to produce good results on problems with a search space as large as ours. We present new ideas that overcome some of these weaknesses: an efficient local optimization, and a faster fitness evaluation, which dominates the running time. Additional time is saved by using the GLCM (Gray-Level Co-occurrence Matrix) to judge the usefulness of each filter and thereby restrict which filters are preserved. The saved time is spent on more iterations, giving the GA further opportunity to optimize the network. Experiments showed a reduction of more than 95% in forward convolution computation with negligible performance degradation.
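The abstract does not spell out how GLCM statistics are turned into a filter-usefulness score, so the following is only a minimal NumPy sketch under one plausible reading of the title: compute the GLCM energy (angular second moment, a Haralick texture feature) of each filter's activation map, then z-score the energies within a layer. The function names `glcm_energy` and `energy_zscores` are illustrative, not the paper's.

```python
import numpy as np

def glcm_energy(feature_map, levels=8):
    """Quantize a 2-D feature map to `levels` gray levels and compute the
    GLCM energy (angular second moment) for the horizontal offset (0, 1).
    A near-uniform map concentrates mass in one GLCM cell, so its energy
    approaches 1; a textured map spreads mass out, so energy is small."""
    fm = np.asarray(feature_map, dtype=np.float64)
    lo, hi = fm.min(), fm.max()
    if hi > lo:
        q = np.floor((fm - lo) / (hi - lo) * (levels - 1)).astype(int)
    else:
        q = np.zeros(fm.shape, dtype=int)  # constant map: a single level
    glcm = np.zeros((levels, levels))
    # Count co-occurrences of horizontally adjacent gray levels.
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1
    p = glcm / glcm.sum()          # normalize to a joint distribution
    return (p ** 2).sum()          # angular second moment ("energy")

def energy_zscores(feature_maps):
    """Z-score the per-filter GLCM energies within one conv layer."""
    e = np.array([glcm_energy(f) for f in feature_maps])
    return (e - e.mean()) / (e.std() + 1e-12)
```

Under this reading, filters whose maps are near-uniform (strongly positive z-score) carry little texture information and become forced pruning candidates, shrinking the space the GA must search; the exact threshold and direction are choices the abstract leaves open.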
Index Terms
- Evolutionary Pruning of Deep Convolutional Networks by a Memetic GA with Sped-Up Local Optimization and GLCM Energy Z-Score