Defence against adversarial attacks using classical and quantum-enhanced Boltzmann machines

Kehoe, Aidan and Wittek, Peter and Xue, Yanbo and Pozas-Kerstjens, Alejandro (2021) Defence against adversarial attacks using classical and quantum-enhanced Boltzmann machines. Machine Learning: Science and Technology, 2 (4). 045006. ISSN 2632-2153

Text: Kehoe_2021_Mach._Learn.__Sci._Technol._2_045006.pdf - Published Version

Download (436kB)

Abstract

We provide a robust defence to adversarial attacks on discriminative algorithms. Neural networks are naturally vulnerable to small, tailored perturbations in the input data that lead to wrong predictions. In contrast, generative models attempt to learn the distribution underlying a dataset, making them inherently more robust to small perturbations. We use Boltzmann machines as attack-resistant discriminative classifiers and compare them against standard state-of-the-art adversarial defences. Using Boltzmann machines, we find improvements ranging from 5% to 72% against attacks on the MNIST dataset. We furthermore complement the training with quantum-enhanced sampling from the D-Wave 2000Q annealer, finding results comparable with classical techniques and with marginal improvements in some cases. These results underline the relevance of probabilistic methods in constructing neural networks and highlight a novel scenario of practical relevance where quantum computers, even with limited hardware capabilities, could provide advantages over classical computers.
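For illustration only, the sketch below shows the general idea of building a classifier on top of a Boltzmann machine, using scikit-learn's BernoulliRBM as a feature extractor feeding a logistic-regression head on the small 8x8 digits dataset. This is an assumption-laden stand-in, not the authors' method: the paper trains Boltzmann machines directly as discriminative models and optionally samples them with the D-Wave 2000Q annealer, neither of which is reproduced here.

```python
# Minimal sketch (not the paper's implementation): an RBM-based pipeline
# in the spirit of generative-model-assisted classification.
# Assumptions: scikit-learn's 8x8 digits dataset stands in for MNIST,
# and a BernoulliRBM + LogisticRegression pipeline stands in for the
# paper's discriminative Boltzmann machines.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Scale pixel values to [0, 1]; BernoulliRBM expects roughly binary inputs.
X, y = load_digits(return_X_y=True)
X = X / 16.0
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# RBM learns a generative representation; logistic regression classifies it.
rbm = BernoulliRBM(n_components=128, learning_rate=0.05,
                   n_iter=20, random_state=0)
clf = LogisticRegression(max_iter=1000)
model = Pipeline([("rbm", rbm), ("logistic", clf)])

model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```

Evaluating such a pipeline against adversarial perturbations (as the paper does with dedicated attack benchmarks) would require an additional attack library and is outside the scope of this sketch.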

Item Type: Article
Subjects: Universal Eprints > Multidisciplinary
Depositing User: Managing Editor
Date Deposited: 04 Jul 2023 03:56
Last Modified: 14 Oct 2023 03:48
URI: http://journal.article2publish.com/id/eprint/2281
