Search Results (2)

Search Parameters:
Keywords = fast guessing entropy

16 pages, 2052 KiB  
Article
The Need for Speed: A Fast Guessing Entropy Calculation for Deep Learning-Based SCA
by Guilherme Perin, Lichao Wu and Stjepan Picek
Algorithms 2023, 16(3), 127; https://doi.org/10.3390/a16030127 - 23 Feb 2023
Cited by 4 | Viewed by 2435
Abstract
The adoption of deep neural networks for profiling side-channel attacks opened new perspectives for leakage detection. Recent publications showed that cryptographic implementations featuring different countermeasures could be broken without feature selection or trace preprocessing. This success comes at a high price: an extensive hyperparameter search to find optimal deep learning models. As deep learning models usually suffer from overfitting due to their high fitting capacity, it is crucial to avoid over-training regimes, which requires choosing the correct number of epochs. To this end, early stopping is employed as an efficient regularization method, which in turn requires a consistent validation metric. Although guessing entropy is a highly informative metric for profiling side-channel attacks, it is time-consuming to compute, especially when it is evaluated at every epoch during training and the number of validation traces is large. This paper shows that guessing entropy can be efficiently computed during training by reducing the number of validation traces without degrading the quality of early stopping decisions. Our solution significantly speeds up the process, benefiting both the hyperparameter search and the overall profiling attack. Our fast guessing entropy calculation is up to 16× faster, enabling more hyperparameter tuning experiments and allowing security evaluators to find more efficient deep learning models.
(This article belongs to the Special Issue Algorithms for Natural Computing Models)
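
A minimal sketch of the core idea, estimating guessing entropy from a small random subset of validation traces so it stays cheap to recompute after every epoch, is given below. It assumes softmax outputs per trace and per-key leakage labels for an 8-bit (AES-like) target; the names (fast_guessing_entropy, key_labels) and default sizes are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def fast_guessing_entropy(probs, key_labels, correct_key,
                          n_traces=100, n_experiments=40, seed=0):
    """Estimate guessing entropy (average rank of the correct key)
    from a small random subset of validation traces.

    probs:      (N, 256) softmax outputs of the profiling model,
                one row per validation trace.
    key_labels: (256, N) leakage-model label of each trace under
                each key hypothesis, e.g. Sbox(plaintext ^ k).
    Keeping n_traces small is what makes the per-epoch estimate fast.
    """
    rng = np.random.default_rng(seed)
    log_p = np.log(probs + 1e-36)              # avoid log(0)
    ranks = np.empty(n_experiments)
    for e in range(n_experiments):
        idx = rng.choice(len(probs), size=n_traces, replace=False)
        # summed log-likelihood of every key hypothesis
        scores = np.array([log_p[idx, key_labels[k, idx]].sum()
                           for k in range(256)])
        ranks[e] = np.sum(scores > scores[correct_key])
    return ranks.mean()                        # near 0 => key recovered
```

Computed at the end of each epoch, such an estimate can drive an early-stopping rule, e.g. stop training once the estimate no longer decreases.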

17 pages, 1128 KiB  
Article
Entanglement in Quantum Search Database: Periodicity Variations and Counting
by Demosthenes Ellinas and Christos Konstandakis
Quantum Rep. 2022, 4(3), 221-237; https://doi.org/10.3390/quantum4030015 - 22 Jul 2022
Viewed by 1923
Abstract
Employing the single-item search algorithm on an N-dimensional database, it is shown that: First, the entanglement developed between any two parts of the database space, regardless of their size, varies periodically over the course of the search. This periodic entanglement of the associated reduced density matrix, quantified by several entanglement measures (linear entropy, von Neumann entropy, Rényi entropy), is found to vanish with period O(√N). Second, functions of equal entanglement are shown to vary with the same period. Both phenomena, holding for database bipartitions of any size, manifest a general scale-invariant property of entanglement in quantum search. Third, measuring the entanglement periodicity via the number of search steps between successive vanishings determines N, the cardinality of the database, quadratically faster than ordinary counting. An operational setting that includes an entropy observable and its realization by quantum circuits is also provided for implementing fast counting. Rigging the initial probability of the marked item, either by prior advice or by guessing, improves the performance of these phenomena hyper-quadratically.
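
The periodic vanishing described here is straightforward to reproduce numerically. The toy sketch below simulates Grover search on N = 2^n items and tracks the linear entropy 1 − Tr(ρ_A²) of a fixed bipartition of the qubit register; it illustrates the phenomenon under assumed parameters (register size, marked item, cut position) and is not the authors' construction.

```python
import numpy as np

def grover_linear_entropy(n_qubits=8, marked=3, cut=4, steps=40):
    """Run Grover iterations on N = 2**n_qubits items and record the
    linear entropy 1 - Tr(rho_A^2) of the first `cut` qubits.
    The entropy oscillates and vanishes with period O(sqrt(N)) for
    any cut: the scale-invariant periodicity described above.
    """
    N = 2 ** n_qubits
    psi = np.full(N, 1.0 / np.sqrt(N))      # uniform superposition
    entropies = []
    for _ in range(steps):
        psi[marked] *= -1.0                 # oracle: phase-flip |m>
        psi = 2.0 * psi.mean() - psi        # inversion about the mean
        m = psi.reshape(2 ** cut, -1)       # bipartition A | B
        rho_a = m @ m.conj().T              # reduced state of part A
        entropies.append(1.0 - np.trace(rho_a @ rho_a).real)
    return np.array(entropies)

# The entropy dips to ~0 near t ~ (pi/4)*sqrt(N) iterations, where the
# state is close to the product state |marked>.
print(np.round(grover_linear_entropy(), 4))
```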