
MA-DG: Learning Features of Sequences in Different Dimensions for Min-Entropy Evaluation via 2D-CNN and Multi-Head Self-Attention



Abstract:

In information security, the quality of random numbers is closely tied to the security of cryptographic systems, and that quality in turn depends on the quality of the underlying entropy source. Evaluating entropy source quality is therefore extremely important. Among existing evaluation methods, statistics-based entropy estimators are weak at extracting and learning information from data, which lowers entropy evaluation accuracy for some complex entropy sources. Prediction-based (especially neural-network-based) entropy estimators built on machine learning techniques have strong data-fitting and feature-extraction capabilities and can estimate the entropy of complex sources more accurately. However, owing to their relatively simple architecture, the 1D neural networks used by these estimators frequently hit a bottleneck, severely limiting further improvement in entropy estimation accuracy. To address these issues, this paper proposes an entropy estimation method based on a 2D-CNN and a multi-head self-attention mechanism. First, we build the MA-DG Net model, which converts 1D random number sequences into 2D images via the GAF and DFT methods, uses a 2D-CNN to extract and learn feature information from the 2D images, and retains the original 1D sequential feature information via a multi-head self-attention mechanism. Next, we train the model to find its optimal parameters. Finally, we test the model on simulated datasets with known min-entropy and on a real-world dataset with unknown min-entropy. The results show that, among the entropy estimators in our experiments, our model achieves the lowest average relative error in entropy estimation on the simulated datasets, only 1.03%. On the real-world dataset, our model yields the lowest entropy estimate, on average 0.88 lower than those of the other estimators.
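The 1D-to-2D conversion step mentioned in the abstract can be illustrated with a Gramian Angular Field. The sketch below is a minimal, standard GAF (summation-field) implementation in NumPy; it is an assumption of the general technique, not the authors' exact preprocessing pipeline (the paper's specific normalization and the DFT branch are not reproduced here).

```python
import numpy as np

def gramian_angular_field(seq):
    """Map a 1D sequence to a 2D GASF image: G[i, j] = cos(phi_i + phi_j)."""
    seq = np.asarray(seq, dtype=float)
    # Rescale the sequence into [-1, 1] so arccos is well defined.
    lo, hi = seq.min(), seq.max()
    x = 2.0 * (seq - lo) / (hi - lo) - 1.0
    # Guard against floating-point drift just outside [-1, 1].
    x = np.clip(x, -1.0, 1.0)
    phi = np.arccos(x)  # angles in a polar encoding of the values
    # Outer sum of angles, then cosine, yields the symmetric 2D image.
    return np.cos(phi[:, None] + phi[None, :])

# Example: an 8-sample sequence becomes an 8x8 image a 2D-CNN can consume.
gaf = gramian_angular_field([3, 1, 4, 1, 5, 9, 2, 6])
print(gaf.shape)  # (8, 8)
```

Each pixel encodes a pairwise temporal correlation between two positions of the sequence, which is what gives a 2D-CNN spatial structure to learn from.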
Page(s): 7879 - 7894
Date of Publication: 21 August 2024
