627 Hits in 5.9 sec

Adversarial Black-Box Attacks on Automatic Speech Recognition Systems using Multi-Objective Evolutionary Optimization [article]

Shreya Khare, Rahul Aralikatte, Senthil Mani
2019-07-03 · arXiv · pre-print
In this work, we propose a framework which uses multi-objective evolutionary optimization to perform both targeted and un-targeted black-box attacks on Automatic Speech Recognition (ASR) systems.  ...  Both black-box and white-box approaches have been used to either replicate the model itself or to craft examples which cause the model to fail.  ...  Automatic Speech Recognition (ASR) systems are becoming ubiquitous with the pervasiveness of smart devices.  ... 
arXiv:1811.01312v2 · fatcat:lgbyp4nrmbav7ilpdrym2qpw5q
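The attack family this result describes treats the victim ASR as a query-only black box and evolves audio perturbations against two competing objectives: push the transcription toward failure while keeping the added distortion small. A minimal sketch of such a loop is below; the `black_box_score` toy function, the population sizes, and the mutation scale are all illustrative assumptions, not the authors' implementation.

```python
import math
import random

def dominates(a, b):
    # a dominates b when it is no worse on every objective and strictly better on one (minimization)
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def black_box_score(audio, perturbation):
    # Toy stand-in for querying the victim ASR: a scalar "misrecognition loss"
    # (lower = closer to a successful attack). Purely illustrative.
    signal = [a + p for a, p in zip(audio, perturbation)]
    return abs(sum(math.sin(3.0 * s) for s in signal))

def perturbation_norm(p):
    # Second objective: how audible the perturbation is (L2 norm as a crude proxy)
    return math.sqrt(sum(g * g for g in p))

def evolve_attack(audio, pop_size=16, generations=40, sigma=0.05, seed=0):
    rng = random.Random(seed)
    dim = len(audio)
    pop = [[rng.gauss(0.0, sigma) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # Gaussian mutation, then elitist selection over parents + children
        children = [[g + rng.gauss(0.0, sigma) for g in ind] for ind in pop]
        union = pop + children
        objs = [(black_box_score(audio, p), perturbation_norm(p)) for p in union]
        # Pareto filter: keep individuals nobody dominates, then fill by attack loss
        nondom = {i for i, o in enumerate(objs)
                  if not any(dominates(objs[j], o) for j in range(len(objs)) if j != i)}
        order = sorted(range(len(union)), key=lambda i: (i not in nondom, objs[i][0]))
        pop = [union[i] for i in order[:pop_size]]
    # Report the survivor with the lowest attack loss
    return min(pop, key=lambda p: black_box_score(audio, p))

audio = [0.1 * i for i in range(8)]
best = evolve_attack(audio)
```

The key property, shared with the papers in this list, is that only forward queries to the model are needed; no gradients are ever computed.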

Adversarial Black-Box Attacks on Automatic Speech Recognition Systems Using Multi-Objective Evolutionary Optimization

Shreya Khare, Rahul Aralikatte, Senthil Mani
2019-09-15 · Interspeech 2019 · ISCA
In this work, we propose a framework which uses multi-objective evolutionary optimization to perform both targeted and un-targeted blackbox attacks on Automatic Speech Recognition (ASR) systems.  ...  Both black-box and white-box approaches have been used to either replicate the model itself or to craft examples which cause the model to fail.  ...  Conclusion and Future Work In this work, we introduce an algorithm agnostic framework for attacking ASR systems using evolutionary multi-objective optimization.  ... 
doi:10.21437/interspeech.2019-2420 · dblp:conf/interspeech/KhareAM19 · fatcat:xag6wuxmn5a35bpugadyg6mkjq

Adjust-free adversarial example generation in speech recognition using evolutionary multi-objective optimization under black-box condition [article]

Shoma Ishida, Satoshi Ono
2020-12-22 · arXiv · pre-print
This paper proposes a black-box adversarial attack method against automatic speech recognition systems.  ...  The proposed method adopts Evolutionary Multi-objective Optimization (EMO), which allows it to generate robust adversarial examples under a black-box scenario.  ...  CONCLUSION This paper proposes a method to generate robust adversarial examples using evolutionary multi-objective optimization for ASR systems.  ... 
arXiv:2012.11138v2 · fatcat:6ngwwhwiqbhfzieka6mpmf5gpa

Adversarial Attack and Defense on Deep Neural Network-Based Voice Processing Systems: An Overview

Xiaojiao Chen, Sheng Li, Hao Huang
2021-09-12 · Applied Sciences · MDPI
Unfortunately, recent research has shown that those systems based on deep neural networks are vulnerable to adversarial examples, which attract significant attention to VPS security.  ...  Then we provide a concise introduction to defense methods against adversarial attacks.  ...  In the attack on the black-box speech recognition system, the authors of [13] first introduced the genetic algorithm to the black-box speech recognition system.  ... 
doi:10.3390/app11188450 · fatcat:zjige7gepbdvnpk2i3qwyqv2oe

SoK: A Modularized Approach to Study the Security of Automatic Speech Recognition Systems [article]

Yuxuan Chen, Jiangshan Zhang, Xuejing Yuan, Shengzhi Zhang, Kai Chen, Xiaofeng Wang, Shanqing Guo
2021-07-30 · arXiv · pre-print
With the wide use of Automatic Speech Recognition (ASR) in applications such as human machine interaction, simultaneous interpretation, audio transcription, etc., its security protection becomes increasingly  ...  More importantly, we align the research in this domain with that on security in Image Recognition System (IRS), which has been extensively studied, using the domain knowledge in the latter to help understand  ...  For example, the authors in [54] propose evolutionary multi-objective optimization approach to attack two white-box ASR systems in both un-targeted and targeted settings.  ... 
arXiv:2103.10651v2 · fatcat:ryllxp63hvgoxm5d6ef7n7l55a

Characterizing Speech Adversarial Examples Using Self-Attention U-Net Enhancement [article]

Chao-Han Huck Yang, Jun Qi, Pin-Yu Chen, Xiaoli Ma, Chin-Hui Lee
2020-03-31 · arXiv · pre-print
We conduct experiments on the automatic speech recognition (ASR) task with adversarial audio attacks.  ...  Recent studies have highlighted adversarial examples as ubiquitous threats to the deep neural network (DNN) based speech recognition systems.  ...  [5] proposed a multi-objective evolutionary optimization method to craft adversarial examples on ASR systems instead of gradient-based approaches [4, 6] .  ... 
arXiv:2003.13917v1 · fatcat:56dm5tqohbdabjk5ygdyoc5mzq

aaeCAPTCHA: The Design and Implementation of Audio Adversarial CAPTCHA [article]

Md Imran Hossen, Xiali Hei
2022-03-05 · arXiv · pre-print
However, prior research investigating the security of audio CAPTCHAs found them highly vulnerable to automated attacks using Automatic Speech Recognition (ASR) systems.  ...  The aaeCAPTCHA system exploits audio adversarial examples as CAPTCHAs to prevent the ASR systems from automatically solving them.  ...  This work is supported in part by the US NSF under grants OIA-1946231 and CNS-2117785.  ... 
arXiv:2203.02735v1 · fatcat:jut63dkcyra4xbzpn6eookddom

Towards Security Threats of Deep Learning Systems: A Survey [article]

Yingzhe He and Guozhu Meng and Kai Chen and Xingbo Hu and Jinwen He
2020-10-27 · arXiv · pre-print
In particular, we focus on four types of attacks associated with security threats of deep learning: model extraction attack, model inversion attack, poisoning attack and adversarial attack.  ...  In order to unveil the security weaknesses and aid in the development of a robust deep learning system, we undertake an investigation on attacks towards deep learning, and analyze these attacks to conclude  ...  [158] undertook a poisoning attack towards multi-class problem based on back-gradient optimization.  ... 
arXiv:1911.12562v2 · fatcat:m3lyece44jgdbp6rlcpj6dz2gm

Adversarial Examples: Attacks and Defenses for Deep Learning

Xiaoyong Yuan, Pan He, Qile Zhu, Xiaolin Li
2019-01-14 · IEEE Transactions on Neural Networks and Learning Systems · IEEE
Therefore, attacks and defenses on adversarial examples draw great attention.  ...  Under the taxonomy, applications for adversarial examples are investigated. We further elaborate on countermeasures for adversarial examples.  ...  Attackers can generate adversarial commands against automatic speech recognition (ASR) models and Voice Controllable System (VCS) [23] , [24] such as Apple Siri [25] , Amazon Alexa [26] , and Microsoft  ... 
doi:10.1109/tnnls.2018.2886017 · pmid:30640631 · fatcat:enznysw3svfzdjrmubwkedr6me

Adversarial Examples: Attacks and Defenses for Deep Learning [article]

Xiaoyong Yuan, Pan He, Qile Zhu, Xiaolin Li
2018-07-07 · arXiv · pre-print
Therefore, attacks and defenses on adversarial examples draw great attention.  ...  In this paper, we review recent findings on adversarial examples for deep neural networks, summarize the methods for generating adversarial examples, and propose a taxonomy of these methods.  ...  Adversarial commands can be generated by attackers against automatic speech recognition (ASR) models and Voice Controllable System (VCS) [23] , [24] such as Apple Siri [25] , Amazon Alexa [26] , and  ... 
arXiv:1712.07107v3 · fatcat:5wcz4h4eijdsdjeqwdpzbfbjeu

Table of Contents

2020-12-01 · 2020 IEEE Symposium Series on Computational Intelligence (SSCI) · IEEE
for Ensemble Optimization Based on Developmental Genetic Programming, Gabriela Suchoparova and Roman Neruda, 631  ...  Adversarial Audio Attacks that Evade Temporal Dependency, Heng Liu and Gregory  ...  Li and Dongbin Zhao, 1478  ...  Edge Computing Based Smart Aquaponics Monitoring System Using Deep Learning in IoT Environment  ...  Applicability issues of Evasion-Based Adversarial Attacks and Mitigation  ... 
doi:10.1109/ssci47803.2020.9308155 · fatcat:hyargfnk4vevpnooatlovxm4li

Multi-Objective Hyperparameter Optimization – An Overview [article]

Florian Karl, Tobias Pielok, Julia Moosbauer, Florian Pfisterer, Stefan Coors, Martin Binder, Lennart Schneider, Janek Thomas, Jakob Richter, Michel Lang, Eduardo C. Garrido-Merchán, Juergen Branke (+1 others)
2022-06-15 · arXiv · pre-print
In this work, we introduce the reader to the basics of multi- objective hyperparameter optimization and motivate its usefulness in applied ML.  ...  , resulting in a multi-objective optimization problem.  ...  Black box optimization As there is generally no analytical expression of the general hyperparameter optimization problem, it forms a black-box function.  ... 
arXiv:2206.07438v1 · fatcat:2wliztfzdjdbzkrxmq6nluzqxe
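The overview above frames hyperparameter tuning with several competing criteria (e.g. validation error versus training cost) as a search for the Pareto set rather than a single optimum. A minimal sketch of the Pareto-front computation follows; the configuration names and their (error, cost) scores are made up for illustration.

```python
def pareto_front(points):
    # Keep every point that no other point weakly dominates (minimize both coordinates).
    front = []
    for p in points:
        dominated = any(q != p and q[0] <= p[0] and q[1] <= p[1] for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical configurations scored as (validation error, relative training cost)
configs = {
    "small":  (0.12, 1.0),
    "medium": (0.08, 3.0),
    "large":  (0.07, 9.0),
    "bad":    (0.15, 5.0),  # worse error AND costlier than "small": dominated
}
front = pareto_front(list(configs.values()))
```

None of the surviving points can be improved on one objective without giving ground on the other, which is exactly the trade-off set a multi-objective optimizer hands back to the practitioner.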

Advances in adversarial attacks and defenses in computer vision: A survey [article]

Naveed Akhtar, Ajmal Mian, Navid Kardan, Mubarak Shah
2021-09-02 · arXiv · pre-print
In [2], we reviewed the contributions made by the computer vision community in adversarial attacks on deep learning (and their defenses) until the advent of year 2018.  ...  However, it is now known that DL is vulnerable to adversarial attacks that can manipulate its predictions by introducing visually imperceptible perturbations in images and videos.  ...  [241] used transferability to fool face recognition systems in another black-box scenario.  ... 
arXiv:2108.00401v2 · fatcat:23gw74oj6bblnpbpeacpg3hq5y

Security and Privacy Issues in Deep Learning [article]

Ho Bae, Jaehee Jang, Dahuin Jung, Hyemi Jang, Heonseok Ha, Hyungyu Lee, Sungroh Yoon
2021-03-10 · arXiv · pre-print
Poisoning attacks compromise the training process by corrupting the data with malicious examples, while evasion attacks use adversarial examples to disrupt the entire classification process.  ...  Security attacks can be divided based on when they occur: if an attack occurs during training, it is known as a poisoning attack, and if it occurs during inference (after training) it is termed an evasion attack.  ...  In addition to the medical domain, the first adversarial examples for automatic speech recognition were proposed.  ... 
arXiv:1807.11655v4 · fatcat:k7mizsqgrfhltktu6pf5htlmy4

Generating Adversarial Inputs Using A Black-box Differential Technique [article]

João Batista Pereira Matos Júnior, Lucas Carvalho Cordeiro, Marcelo d'Amorim, Xiaowei Huang
2020-07-10 · arXiv · pre-print
Second, we compare DAEGEN with state-of-the-art black-box adversarial attack methods (simba and tremba), by adapting them to work on a differential setting.  ...  Algorithmically, DAEGEN uses a local search-based optimization algorithm to find DIAEs by iteratively perturbing an input to maximize the difference of two models on predicting the input.  ...  Both works perform targeted attacks (while DAEGEN only performs non-targeted attacks) and formulate their optimization problem as a multi-objective optimization problem, where they try to maximize the  ... 
arXiv:2007.05315v1 · fatcat:vajwajbasjemvd3aydz5iaup4e