Impact Statement:
In the future, modern vehicles are expected to require 200–300 million lines of code for their software, leading to an increased demand for computing resources in intelligent vehicles. On the intelligent vehicle platform, the parameter counts and computational loads of anomaly detection models for the controller area network bus are steadily increasing, which makes these models difficult to deploy and run on in-vehicle terminals. The proposed method aims to significantly reduce the parameters and computational load of recurrent-neural-network-based anomaly detection models while preserving performance and stability. This approach is designed to support the sustainable development and safety of vehicles while promoting the Green Internet of Vehicles.
Abstract:
The rapid deployment and low-cost inference of controller area network (CAN) bus anomaly detection models on intelligent vehicles can drive the development of the Green Internet of Vehicles. Anomaly detection on intelligent vehicles often relies on recurrent neural network models, but the computational resources available for these models on small in-vehicle platforms are limited. Model compression is therefore essential to secure the CAN bus under restricted computing resources while improving the computational efficiency of the model. However, the shared recurrent units of recurrent neural networks significantly constrain their compression. In this study, we propose a structured pruning method for long short-term memory (LSTM) networks based on the contribution values of shared vectors. By analyzing the contribution value of each dimension of the shared vectors, the weight matrices of the model are structurally pruned, and the output of the LSTM layer is supplemented to maintain the information integrity between adjacent network...
Published in: IEEE Transactions on Artificial Intelligence (Volume: 5, Issue: 12, December 2024)
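
The abstract describes ranking each dimension of the LSTM's shared state vector by a contribution value, pruning the weight matrices structurally along those dimensions, and supplementing the LSTM layer's output so downstream layers still receive full-width features. Below is a minimal PyTorch sketch of that general idea; the L1-norm contribution score, the helper names (prune_lstm_hidden, padded_output), and the zero-padding used as the output supplement are illustrative assumptions, not the authors' published implementation.

```python
# Minimal sketch (not the authors' code): structured pruning of a PyTorch nn.LSTM
# along hidden (shared-state) dimensions ranked by an assumed L1-norm contribution
# score, with the pruned output scattered back to the original width so the next
# layer's input shape is preserved.
import torch
import torch.nn as nn


def prune_lstm_hidden(lstm: nn.LSTM, keep_ratio: float = 0.5):
    """Return a smaller LSTM keeping the hidden dimensions with the largest contribution."""
    H = lstm.hidden_size
    w_ih, w_hh = lstm.weight_ih_l0, lstm.weight_hh_l0   # shapes (4H, in), (4H, H)
    b_ih, b_hh = lstm.bias_ih_l0, lstm.bias_hh_l0       # shapes (4H,)

    # Assumed contribution proxy per hidden dimension: L1 norm of its rows across
    # the four gate blocks plus its columns in the recurrent weight matrix.
    rows = w_ih.abs().view(4, H, -1).sum(dim=(0, 2)) + w_hh.abs().view(4, H, -1).sum(dim=(0, 2))
    cols = w_hh.abs().sum(dim=0)
    score = rows + cols

    keep = torch.topk(score, max(1, int(H * keep_ratio))).indices.sort().values
    gate_idx = torch.cat([keep + g * H for g in range(4)])  # same dims in i, f, g, o gates

    pruned = nn.LSTM(lstm.input_size, keep.numel(), batch_first=lstm.batch_first)
    with torch.no_grad():
        pruned.weight_ih_l0.copy_(w_ih[gate_idx])
        pruned.weight_hh_l0.copy_(w_hh[gate_idx][:, keep])
        pruned.bias_ih_l0.copy_(b_ih[gate_idx])
        pruned.bias_hh_l0.copy_(b_hh[gate_idx])
    return pruned, keep


def padded_output(pruned: nn.LSTM, keep: torch.Tensor, x: torch.Tensor, full_hidden: int):
    """Supplement the pruned LSTM output back to the original hidden width (zeros elsewhere)."""
    out, _ = pruned(x)
    full = out.new_zeros(*out.shape[:-1], full_hidden)
    full[..., keep] = out
    return full


if __name__ == "__main__":
    lstm = nn.LSTM(input_size=8, hidden_size=64, batch_first=True)
    small, kept = prune_lstm_hidden(lstm, keep_ratio=0.25)
    x = torch.randn(2, 10, 8)                            # (batch, time, features)
    y = padded_output(small, kept, x, full_hidden=64)
    print(small.hidden_size, y.shape)                    # 16 torch.Size([2, 10, 64])
```

In this sketch, pruning a hidden dimension removes the corresponding rows from all four gate blocks and the matching recurrent columns, which is what makes the pruning structured rather than element-wise; the zero-padding step stands in for the paper's output supplementation between adjacent layers.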