6,459 Hits in 7.1 sec

Distillation of neural network models for detection and description of key points of images [article]

A.V. Yashchenko, A.V. Belikov, M.V. Peterson, A.S. Potapov
2020 arXiv   pre-print
Thus, it is important to increase the speed of neural network models for the detection and description of key points.  ...  A method for the distillation of neural networks for the task of detecting and describing key points was tested.  ... 
arXiv:2006.10502v1 fatcat:oicv43ahhbcjhf6yclzroyhjyy

Distillation of neural network models for detection and description of image key points

A.V. Yashchenko, A.V. Belikov, M.V. Peterson, A.S. Potapov
2020 Naučno-tehničeskij Vestnik Informacionnyh Tehnologij, Mehaniki i Optiki  
In information retrieval, recall = TP/(TP + FN); for key points we denote recall(points(I1), points(I2)) = (points(I1) ∩ points(I2))/points(I2), i.e., the fraction of points on I2 reproduced on  ... 
doi:10.17586/2226-1494-2020-20-3-402-409 fatcat:3maepgzionfedlnqxm4ozakjky
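The key-point recall defined in this entry can be sketched directly from the formula. The function name, the exact-match comparison, and the example coordinates below are illustrative assumptions; real detectors would match points up to a pixel tolerance.

```python
# Sketch of recall(points(I1), points(I2)) = |points(I1) ∩ points(I2)| / |points(I2)|,
# i.e. the fraction of key points on image I2 that are reproduced on I1.
# Exact coordinate matching is an assumption for simplicity.

def keypoint_recall(points_i1, points_i2):
    """Fraction of I2's key points that also appear among I1's key points."""
    if not points_i2:
        return 0.0
    matched = set(points_i1) & set(points_i2)
    return len(matched) / len(points_i2)

# Example: I2 has 4 key points, 3 of which are reproduced on I1.
p1 = {(10, 20), (30, 40), (50, 60)}
p2 = {(10, 20), (30, 40), (50, 60), (70, 80)}
print(keypoint_recall(p1, p2))  # 0.75
```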

A Survey on Adversarial Examples in Deep Learning

Kai Chen, Haoqi Zhu, Leiming Yan, Jinwei Wang
2020 Journal on Big Data  
training method, distillation method, etc.; the application scenarios and deficiencies of the different defense measures are pointed out.  ...  This article explains the key technologies and theories of adversarial examples, from the concept of adversarial examples, the occurrence of adversarial examples, the attacking methods of adversarial  ...  Papernot proposed a distillation defense mechanism against adversarial examples for deep neural networks and verified the effectiveness of the defense mechanism on two types of deep neural networks  ... 
doi:10.32604/jbd.2020.012294 fatcat:2o5bsbqyurfj7igqyqbs47fjre
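The distillation defense mentioned in this survey rests on training with a high softmax temperature, which softens the teacher's output distribution. A minimal sketch of temperature softmax, with illustrative logits and a hypothetical function name:

```python
import math

def softmax_t(logits, T=1.0):
    """Softmax with temperature T; higher T yields a softer (flatter)
    probability distribution, as used in defensive distillation."""
    m = max(l / T for l in logits)              # subtract max for stability
    exps = [math.exp(l / T - m) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]

teacher_logits = [4.0, 1.0, 0.2]
hard = softmax_t(teacher_logits, T=1.0)   # near one-hot
soft = softmax_t(teacher_logits, T=20.0)  # soft labels for the distilled model
```

The distilled network is then trained on the soft labels; the defense's weakness against stronger attacks is exactly what the Carlini–Wagner entries further down evaluate.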

Fake News Detection via Knowledge-driven Multimodal Graph Convolutional Networks

Youze Wang, Shengsheng Qian, Jun Hu, Quan Fang, Changsheng Xu
2020 Proceedings of the 2020 International Conference on Multimedia Retrieval  
In Figure 3(b), the attached image looks quite normal, but the corresponding textual description is surprising and seems impossible; our model can utilize knowledge distillation to obtain knowledge  ...  For each post, we may obtain several images, and the YOLOv3 detector may detect several semantic objects in each image.  ... 
doi:10.1145/3372278.3390713 dblp:conf/mir/WangQHFX20 fatcat:bdtdwo3pwbhm5bbipy7w3x6idi

Surface Defect Segmentation Algorithm of Steel Plate Based on Geometric Median Filter Pruning

Zhiqiang Hao, Zhigang Wang, Dongxu Bai, Xiliang Tong
2022 Frontiers in Bioengineering and Biotechnology  
This paper analyzes different existing deep learning model compression algorithms and proposes a model pruning algorithm based on the geometric median filter for structured pruning and  ...  compression of defect segmentation detection networks on the basis of structured pruning.  ...  These factors make feature extraction and detection segmentation of the network difficult and lead to the need for pruning and compression of the defect detection model.  ... 
doi:10.3389/fbioe.2022.945248 pmid:35845429 pmcid:PMC9283705 fatcat:ikzob26vmnhf7lqx6ihlrvxhqa
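Geometric-median filter pruning ranks a layer's filters by their distance to the geometric median of all filters: the filters closest to the median are the most replaceable. A pure-Python sketch under that assumption, using Weiszfeld's algorithm to approximate the median; function names and the toy filter vectors are illustrative:

```python
import math

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def geometric_median(points, iters=200, eps=1e-9):
    """Approximate the geometric median via Weiszfeld's iteration."""
    g = [sum(c) / len(points) for c in zip(*points)]  # start at centroid
    for _ in range(iters):
        ws = [1.0 / max(dist(p, g), eps) for p in points]
        total = sum(ws)
        g = [sum(w * p[i] for w, p in zip(ws, points)) / total
             for i in range(len(g))]
    return g

def prune_candidates(filters, k=1):
    """Indices of the k filters nearest the geometric median
    (i.e. the most redundant ones, per the geometric-median criterion)."""
    g = geometric_median(filters)
    ranked = sorted(range(len(filters)), key=lambda i: dist(filters[i], g))
    return ranked[:k]

# Three similar filters and one distinctive outlier: the outlier survives.
filters = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [10.0, 10.0]]
print(prune_candidates(filters, k=1))
```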

Knowledge Distillation of Grassmann Manifold Network for Remote Sensing Scene Classification

Ling Tian, Zhichao Wang, Bokun He, Chu He, Dingwen Wang, Deshi Li
2021 Remote Sensing  
Inspired by knowledge distillation, we use the information learned from convolutional neural networks to guide the training of the manifold networks.  ...  Due to device limitations, small networks are necessary for some real-world scenarios, such as satellites and micro-robots.  ...  [16] proposed a GLDBS model to learn global and local information from the original image and the key location.  ... 
doi:10.3390/rs13224537 fatcat:dzl273zgebblxc75iqfzj4bogu

A Selective Survey on Versatile Knowledge Distillation Paradigm for Neural Network Models [article]

Jeong-Hoe Ku, JiHun Oh, YoungYoon Lee, Gaurav Pooniwala, SangJeong Lee
2020 arXiv   pre-print
This paper aims to provide a selective survey about knowledge distillation(KD) framework for researchers and practitioners to take advantage of it for developing new optimized models in the deep neural  ...  network field.  ...  Distilled Knowledge and Loss Despite the recent advances of knowledge distillation technique, a clear understanding of where knowledge resides in a deep neural network and an optimal method for capturing  ... 
arXiv:2011.14554v1 fatcat:46rruchrlbgfxexsyvujsgprou
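The distillation framework surveyed here typically combines a hard cross-entropy loss with a softened KL term toward the teacher, scaled by T² to keep gradient magnitudes comparable. A minimal sketch of that combined loss; the function names, T=4, and alpha=0.5 are illustrative defaults, not values from the survey:

```python
import math

def softmax(logits, T=1.0):
    m = max(l / T for l in logits)
    e = [math.exp(l / T - m) for l in logits]
    s = sum(e)
    return [x / s for x in e]

def kd_loss(student_logits, teacher_logits, true_idx, T=4.0, alpha=0.5):
    """Hinton-style distillation loss:
    alpha * CE(student, hard label) + (1 - alpha) * T^2 * KL(teacher_T || student_T)."""
    hard = -math.log(softmax(student_logits)[true_idx])      # cross-entropy term
    ps_T = softmax(student_logits, T)
    pt_T = softmax(teacher_logits, T)
    soft = sum(t * math.log(t / s) for t, s in zip(pt_T, ps_T))  # KL divergence
    return alpha * hard + (1 - alpha) * (T ** 2) * soft
```

When the student's logits match the teacher's exactly, the KL term vanishes and only the hard-label cross-entropy remains.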

FastSal: a Computationally Efficient Network for Visual Saliency Prediction [article]

Feiyan Hu, Kevin McGuinness
2020 arXiv   pre-print
We modify and test various recent efficient convolutional neural network architectures like EfficientNet and MobileNetV2 and compare them with existing state-of-the-art saliency models such as SalGAN and  ...  We find that MobileNetV2 makes an excellent backbone for a visual saliency model and can be effective even without a complex decoder.  ...  ACKNOWLEDGMENT This publication has emanated from research conducted with the financial support of Science Foundation Ireland (SFI) under grant number SFI/15/SIRG/3283 and SFI/12/RC/2289 P2.  ... 
arXiv:2008.11151v1 fatcat:k4wgqmcd7bfsnov6x6svp3tqfu

FastSal: a Computationally Efficient Network for Visual Saliency Prediction

Feiyan Hu, Kevin McGuinness
2021 2020 25th International Conference on Pattern Recognition (ICPR)  
We modify and test various recent efficient convolutional neural network architectures like EfficientNet and MobileNetV2 and compare them with existing state-of-the-art saliency models such as SalGAN and  ...  We find that MobileNetV2 makes an excellent backbone for a visual saliency model and can be effective even without a complex decoder.  ...  ACKNOWLEDGMENT This publication has emanated from research conducted with the financial support of Science Foundation Ireland (SFI) under grant number SFI/15/SIRG/3283 and SFI/12/RC/2289 P2.  ... 
doi:10.1109/icpr48806.2021.9413057 fatcat:aqo2m2egwvcfvpdjpw5el4x47m

Using Deep Neural Networks for Human Fall Detection Based on Pose Estimation

Mohammadamin Salimi, José J. M. Machado, João Manuel R. S. Tavares
2022 Sensors  
The solution uses Time-Distributed Convolutional Long Short-Term Memory (TD-CNN-LSTM) and 1-Dimensional Convolutional Neural Network (1D-CNN) models, to classify the data extracted from image frames, and  ...  Requests for caring for and monitoring the health and safety of older adults are increasing nowadays and form a topic of great social interest.  ...  Conflicts of Interest: The authors declare no conflict of interest.  ... 
doi:10.3390/s22124544 pmid:35746325 pmcid:PMC9229309 fatcat:lrmwzqov7nbmfa3lenctcjjjgu

A Method Based on Knowledge Distillation for Fish School Stress State Recognition in Intensive Aquaculture

Siyuan Mei, Yingyi Chen, Hanxiang Qin, Huihui Yu, Daoliang Li, Boyang Sun, Ling Yang, Yeqi Liu
2022 CMES - Computer Modeling in Engineering & Sciences  
The proposed model has about 5.18 M parameters and requires 0.15 G FLOPs (floating-point operations) to process an image of size 224 × 224.  ...  The experimental results show that the proposed model is more suitable for deployment on resource-constrained devices or real-time applications, and it is conducive for real-time monitoring of fish behavior  ...  Furthermore, we are grateful to Professor Yanqing Duan for her proofreading and guidance on the whole article, who is from the University of Bedfordshire in the UK.  ... 
doi:10.32604/cmes.2022.019378 fatcat:gctsri57gbgirfddeqnvnfftcu
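Figures like the 5.18 M parameters and 0.15 G FLOPs reported in this entry come from per-layer accounting. A rough sketch for a single convolutional layer; the function name is hypothetical, and the convention of 2 FLOPs per multiply–accumulate is an assumption (some papers count MACs instead, which halves the number):

```python
def conv2d_cost(c_in, c_out, k, h_out, w_out):
    """Rough parameter and FLOP count for one k x k Conv2d layer.
    Counts each multiply-accumulate as 2 FLOPs; bias adds c_out parameters
    but is ignored in the FLOP count for simplicity."""
    params = c_out * (c_in * k * k + 1)
    flops = 2 * h_out * w_out * c_out * c_in * k * k
    return params, flops

# First layer of a typical CNN on a 224 x 224 RGB input (stride 1, padded):
p, f = conv2d_cost(c_in=3, c_out=64, k=3, h_out=224, w_out=224)
print(p, f)  # 1792 parameters, 173408256 FLOPs
```

Summing this over every layer gives the model-level totals quoted in such papers.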

Towards Evaluating the Robustness of Neural Networks [article]

Nicholas Carlini, David Wagner
2017 arXiv   pre-print
Neural networks provide state-of-the-art results for most machine learning tasks.  ...  Defensive distillation is a recently proposed approach that can take an arbitrary neural network, and increase its robustness, reducing the success rate of current attacks' ability to find adversarial  ...  ACKNOWLEDGEMENTS We would like to thank Nicolas Papernot for discussing our defensive distillation implementation, and the anonymous reviewers for their helpful feedback.  ... 
arXiv:1608.04644v2 fatcat:5vvgklqhovctneuna3ho3mgkte
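The Carlini–Wagner evaluation of defensive distillation optimizes a margin objective over logits: the loss is driven below zero once the target class beats the runner-up by a confidence margin κ. A sketch of that objective (their "f6" formulation); the function name and example logits are illustrative:

```python
def cw_margin_loss(logits, target, kappa=0.0):
    """Carlini-Wagner margin objective: max(max_{i != t} Z_i - Z_t, -kappa).
    Non-positive once the target class wins by at least kappa, so minimizing
    it pushes the perturbed input toward a confident misclassification."""
    other = max(z for i, z in enumerate(logits) if i != target)
    return max(other - logits[target], -kappa)

# Target class already winning by 3 -> loss clipped at -kappa (here 0):
print(cw_margin_loss([1.0, 5.0, 2.0], target=1))  # 0.0
# Target class losing by 4 -> positive loss, attack must keep optimizing:
print(cw_margin_loss([5.0, 1.0, 2.0], target=1))  # 4.0
```

Because this objective works on logits rather than post-softmax probabilities, the high softmax temperature used by defensive distillation does not blunt it, which is central to the paper's result.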

Towards Evaluating the Robustness of Neural Networks

Nicholas Carlini, David Wagner
2017 2017 IEEE Symposium on Security and Privacy (SP)  
Neural networks provide state-of-the-art results for most machine learning tasks.  ...  Defensive distillation is a recently proposed approach that can take an arbitrary neural network, and increase its robustness, reducing the success rate of current attacks' ability to find adversarial  ...  ACKNOWLEDGEMENTS We would like to thank Nicolas Papernot for discussing our defensive distillation implementation, and the anonymous reviewers for their helpful feedback.  ... 
doi:10.1109/sp.2017.49 dblp:conf/sp/Carlini017 fatcat:wzvnhpyq3nc2dlmary26lvhwey

Restricted Region based Iterative Gradient Method for Non-targeted Attack

Zhaoquan Gu, Weixiong Hu, Chuanjing Zhang, Le Wang, Chunsheng Zhu, Zhihong Tian
2020 IEEE Access  
RESTRICT REGION We use the object detection algorithm to find the key region of an image.  ...  networks (such as image classification and object detection).  ...  vulnerability of current black-box neural networks and we need to pay more attention to the robustness of neural networks.  ... 
doi:10.1109/access.2020.2971004 fatcat:7x2g4nustzhijnlhibco4c6fva

Extending Detection with Privileged Information via Generalized Distillation

Z. Berkay Celik, Patrick McDaniel
2018 2018 IEEE Security and Privacy Workshops (SPW)  
We use a deep neural network to implement generalized distillation for the training of detection models and making predictions.  ...  Detection systems based on machine learning models are essential tools for system and enterprise defense.  ...  David Lopez-Paz for his constructive comments on generalized distillation and Dr. Rauf Izmailov for his feedback on the application of the Learning Using Privileged Information (LUPI) paradigm.  ... 
doi:10.1109/spw.2018.00021 dblp:conf/sp/CelikM18 fatcat:7qlofwwetfaprg3ugvkgff4mei
Showing results 1 — 15 out of 6,459 results