8,484 Hits in 4.7 sec

GAN Compression: Efficient Architectures for Interactive Conditional GANs [article]

Muyang Li, Ji Lin, Yaoyao Ding, Zhijian Liu, Jun-Yan Zhu, Song Han
2020 arXiv   pre-print
Conditional Generative Adversarial Networks (cGANs) have enabled controllable image synthesis for many computer vision and graphics applications.  ...  Without losing image quality, we reduce the computation of CycleGAN by more than 20x and GauGAN by 9x, paving the way for interactive image synthesis. The code and demo are publicly available.  ...  Acknowledgments We thank NSF Career Award #1943349, MIT-IBM Watson AI Lab, Adobe, Intel, Samsung and AWS machine learning research award for supporting this research.  ... 
arXiv:2003.08936v3 fatcat:ng36z3k2hzbfbob5gjrvh62kiq

Towards Robust Classification with Image Quality Assessment [article]

Yeli Feng, Yiyu Cai
2020 arXiv   pre-print
Our method combines image quality assessment with knowledge distillation to detect input images that would trigger a DCNN to produce egregiously wrong results.  ...  Recent studies have shown that deep convolutional neural networks (DCNN) are vulnerable to adversarial examples and sensitive to perceptual quality as well as the acquisition conditions of images.  ...  vision when quality degradation is present in the images.  ... 
arXiv:2004.06288v1 fatcat:ehtc4qyv5vgefpw7okunkzabni

LIQA: Lifelong Blind Image Quality Assessment [article]

Jianzhao Liu, Wei Zhou, Jiahua Xu, Xin Li, Shukun An, Zhibo Chen
2021 arXiv   pre-print
To address this problem, we propose a novel Lifelong blind Image Quality Assessment (LIQA) approach, aiming to achieve lifelong learning of BIQA.  ...  Existing blind image quality assessment (BIQA) methods are mostly designed in a disposable way and cannot evolve with unseen distortions adaptively, which greatly limits their deployment and application  ...  Blind image quality assessment (BIQA) is a challenging problem, which aims to automatically predict perceptual image quality without any information about the reference image.  ... 
arXiv:2104.14115v1 fatcat:fi3we2jw2jcqjhih736r67jeqy

Editorial: Introduction to the Issue on Deep Learning for Image/Video Restoration and Compression

A. Murat Tekalp, Michele Covell, Radu Timofte, Chao Dong
2021 IEEE Journal on Selected Topics in Signal Processing  
Image/Video Restoration and Super-Resolution: The paper "Degradation aware approach to image restoration using knowledge distillation" is the first journal paper on the application of knowledge distillation  ...  Benefiting from a learnable ranker, RankSRGAN [25] can optimize the generative network in the direction of any image quality assessment (IQA) metric and achieves state-of-the-art performance.  ... 
doi:10.1109/jstsp.2021.3053364 fatcat:hjo5pvw6lvgpfga2wfq4vpaq3q

Deep Distillation Recursive Network for Remote Sensing Imagery Super-Resolution

Kui Jiang, Zhongyuan Wang, Peng Yi, Junjun Jiang, Jing Xiao, Yuan Yao
2018 Remote Sensing  
In particular, for image super-resolution (SR) processing, previous CNN-based methods have led to significant improvements when compared with shallow learning-based methods.  ...  In this study, a simple but effective CNN framework, namely the deep distillation recursive network (DDRN), is presented for video satellite image SR.  ...  However, in real SR scenes, we have only LR images to be super-resolved, without the corresponding HR reference image. Therefore, we need to introduce quantitative no-reference image quality assessment  ... 
doi:10.3390/rs10111700 fatcat:a54deivd4ngjhbu4zculs5clda

Association: Remind Your GAN not to Forget [article]

Yi Gu, Jie Li, Yuting Gao, Ruoxin Chen, Chentao Wu, Feiyang Cai, Chao Wang, Zirui Zhang
2021 arXiv   pre-print
Besides, a distillation measure is added to depressively alter the efficacy of synaptic transmission, which dampens the feature reconstruction learning for the new task.  ...  They fail to preserve previously acquired knowledge when adapting to new tasks.  ...  Classical conditioning refers to learning associations between a pair of stimulations, while operant conditioning refers to learning between behaviors and consequences [29].  ... 
arXiv:2011.13553v2 fatcat:oafpcdfvabhnbgiezbrdoh6gp4

Robust Skin Disease Classification by Distilling Deep Neural Network Ensemble for the Mobile Diagnosis of Herpes Zoster

Seunghyeok Back, Seongju Lee, Sungho Shin, Yeonguk Yu, Taekyeong Yuk, Saepomi Jong, Seungjun Ryu, Kyoobin Lee
2021 IEEE Access  
To enhance robustness while retaining low computational cost, we propose knowledge distillation from an ensemble via curriculum training (KDE-CT), wherein a student network learns from a stronger teacher  ...  INDEX TERMS Biomedical image processing, convolutional neural networks, deep learning, dermatology.  ...  of low-quality images [12].  ... 
doi:10.1109/access.2021.3054403 fatcat:kfajudklbvanzcbs5wc3immphu

No-reference quality assessment for DCT-based compressed image

Ci Wang, Minmin Shen, Chen Yao
2015 Journal of Visual Communication and Image Representation  
A blind/no-reference (NR) method is proposed in this paper for image quality assessment (IQA) of images compressed in the discrete cosine transform (DCT) domain.  ...  With some experimental results, we verify that the proposed algorithm (provided no reference image) achieves comparable efficacy to some full-reference (FR) methods (provided the reference image), such  ...  compressed image and can be extracted for analysis; and (3) no-reference methods are those for which only the compressed image is available for quality assessment.  ... 
doi:10.1016/j.jvcir.2015.01.006 fatcat:yfe3jnewgjamhp77fuzsbkqhuq
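
The snippet names the taxonomy (full-reference vs. no-reference) but not the paper's DCT-domain algorithm. As a concrete illustration of what a simple no-reference measure for block-DCT codecs can look like, here is a minimal numpy sketch that compares luminance jumps across 8x8 block boundaries with jumps inside blocks; the function name and approach are illustrative assumptions, not the paper's method:

```python
import numpy as np

def blockiness_score(img: np.ndarray, block: int = 8) -> float:
    """Crude no-reference blockiness measure for block-DCT codecs.

    Compares the average luminance jump across 8x8 block boundaries with
    the average jump at non-boundary positions; a ratio well above 1
    suggests visible blocking artifacts. Illustrative sketch only.
    """
    img = img.astype(np.float64)
    dh = np.abs(np.diff(img, axis=1))       # horizontal neighbor differences, (H, W-1)
    cols = np.arange(dh.shape[1])
    at_boundary = (cols + 1) % block == 0   # difference straddles a block edge
    boundary = dh[:, at_boundary].mean()
    interior = dh[:, ~at_boundary].mean()
    return boundary / (interior + 1e-12)
```

On a grayscale array decoded from a heavily compressed JPEG, this ratio climbs with quantization strength, which is the intuition behind DCT-domain NR metrics.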

MobileDeepPill

Xiao Zeng, Kai Cao, Mi Zhang
2017 Proceedings of the 15th Annual International Conference on Mobile Systems, Applications, and Services - MobiSys '17  
Our deep learning-based pill image recognition algorithm wins the First Prize (champion) of the NIH NLM Pill Image Recognition Challenge.  ...  phones; a multi-CNNs model that collectively captures the shape, color, and imprint characteristics of the pills; and a Knowledge Distillation-based deep model compression framework that significantly  ...  Multi-CNNs model trained without Knowledge Distillation.  ... 
doi:10.1145/3081333.3081336 dblp:conf/mobisys/ZengCZ17 fatcat:oao3fvceondvzl7h2yr46li7qi

Closed-Loop Memory GAN for Continual Learning [article]

Amanda Rios, Laurent Itti
2020 arXiv   pre-print
We then show that using a stochastic generator to continuously output fresh new images during training further increases performance significantly, while also generating quality images.  ...  Sequential learning of tasks using gradient descent leads to an unremitting decline in the accuracy of tasks for which training data is no longer available, termed catastrophic forgetting.  ...  Thus, at each generation step, images are assessed for their quality and "filtered" out if they do not meet the standard.  ... 
arXiv:1811.01146v3 fatcat:prjpmuvamvc7fnhkahzk3ugg3e
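
The filtering idea described in this entry (assess generated images and discard sub-standard ones before replaying them) can be sketched in a few lines of PyTorch. `generator`, `quality_score`, and the threshold below are hypothetical stand-ins; the paper's actual quality criterion is not given in the snippet:

```python
import torch

def sample_replay_batch(generator, quality_score, batch_size=64,
                        z_dim=100, threshold=0.5, max_tries=10):
    """Quality-filtered generative replay (illustrative sketch).

    Draws candidate samples from a generator for rehearsal of old tasks
    and keeps only those whose quality score passes a threshold.
    `quality_score` is assumed to return one score in [0, 1] per image.
    """
    kept = []
    for _ in range(max_tries):
        z = torch.randn(batch_size, z_dim)
        with torch.no_grad():
            imgs = generator(z)                  # fresh replay candidates
            scores = quality_score(imgs)         # per-image quality scores
        kept.append(imgs[scores > threshold])    # discard low-quality samples
        if sum(k.shape[0] for k in kept) >= batch_size:
            break
    return torch.cat(kept)[:batch_size]
```

Passing only high-scoring samples into the rehearsal batch is what keeps the replayed pseudo-data from degrading the old tasks further.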

Deep Learning for Image Super-resolution: A Survey [article]

Zhihao Wang, Jian Chen, Steven C.H. Hoi
2020 arXiv   pre-print
Recent years have witnessed remarkable progress in image super-resolution using deep learning techniques.  ...  This article aims to provide a comprehensive survey of recent advances in image super-resolution using deep learning approaches.  ...  Image Quality Assessment: Image quality refers to the visual attributes of images and focuses on viewers' perceptual assessments.  ... 
arXiv:1902.06068v2 fatcat:uequ4heufbcmjojclu2md3xh6m
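
Among the IQA metrics a survey like this covers, PSNR is the most common full-reference measure, defined as PSNR = 10 * log10(peak^2 / MSE). A minimal numpy version for reference:

```python
import numpy as np

def psnr(reference: np.ndarray, distorted: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio: 10 * log10(peak^2 / MSE).

    Higher is better; identical images give +inf.
    """
    diff = reference.astype(np.float64) - distorted.astype(np.float64)
    mse = np.mean(diff ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)
```

SR papers typically report PSNR alongside SSIM, since PSNR alone correlates only loosely with perceived quality.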

Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data [article]

Gongfan Fang, Yifan Bao, Jie Song, Xinchao Wang, Donglin Xie, Chengchao Shen, Mingli Song
2021 arXiv   pre-print
Knowledge distillation (KD) aims to craft a compact student model that imitates the behavior of a pre-trained teacher in a target domain.  ...  In this paper, we attempt to tackle an ambitious task, termed out-of-domain knowledge distillation (OOD-KD), which allows us to conduct KD using only OOD data that can be readily obtained at a very  ...  Discussion: Relation to unlabeled knowledge distillation. In the knowledge distillation literature, a number of works also use unlabeled data for student learning.  ... 
arXiv:2110.15094v1 fatcat:q4h3i3prkfbjdiqm6yqf6hcfdi
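
The student-imitates-teacher objective named in this snippet is usually realized with Hinton-style soft targets; the following is a minimal sketch of that generic loss, not of the paper's out-of-domain variant:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature: float = 4.0, alpha: float = 0.5):
    """Standard soft-target knowledge distillation loss (Hinton et al., 2015).

    Blends cross-entropy on ground-truth labels with a KL term that pulls
    the student's temperature-softened distribution toward the teacher's.
    The T^2 factor keeps gradient magnitudes comparable across temperatures.
    """
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    return alpha * hard + (1.0 - alpha) * soft
```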

Face Image Quality Assessment: A Literature Survey

Torsten Schlett, Christian Rathgeb, Olaf Henniger, Javier Galbally, Julian Fierrez, Christoph Busch
2022 ACM Computing Surveys  
Besides image selection, face image quality assessment can also be used in a variety of other application scenarios, which are discussed herein.  ...  This survey provides an overview of the face image quality assessment literature, which predominantly focuses on visible wavelength face image input.  ...  [43] proposed the identification quality (IDQ) training loss and the use of knowledge distillation to train a lightweight FIQA network called "LightQNet".  ... 
doi:10.1145/3507901 fatcat:xvs67qamgbbdtjydzekinnq62u

SAR Target Incremental Recognition Based on Hybrid Loss Function and Class-Bias Correction

Yongsheng Zhou, Shuo Zhang, Xiaokun Sun, Fei Ma, Fan Zhang
2022 Applied Sciences  
Regarding the three problems: first, old-sample preservation and knowledge distillation were introduced to preserve both representative old knowledge and the knowledge structure.  ...  Incremental learning has emerged to continuously obtain new knowledge from new data while preserving most previously learned knowledge, saving both time and storage.  ...  conditions.  ... 
doi:10.3390/app12031279 fatcat:vcdcob5w3vg7raa6yvviabwp5i

MI^2GAN: Generative Adversarial Network for Medical Image Domain Adaptation using Mutual Information Constraint [article]

Xinpeng Xie, Jiawei Chen, Yuexiang Li, Linlin Shen, Kai Ma, Yefeng Zheng
2020 arXiv   pre-print
Domain shift between medical images from multiple centres is still an open question for the community, and it degrades the generalization performance of deep learning models.  ...  In particular, we disentangle the content features from domain information for both the source and translated images, and then maximize the mutual information between the disentangled content features to  ...  For comparison, the CVC images are also sent to Enc B for content feature distillation.  ... 
arXiv:2007.11180v2 fatcat:dhuqy2pezrb3lkq3msk4niariy
Showing results 1 — 15 out of 8,484 results