47,958 Hits in 3.7 sec

Distributed Learning and Inference with Compressed Images [article]

Sudeep Katakol, Basem Elbarashy, Luis Herranz, Joost van de Weijer, Antonio M. Lopez
2021 arXiv   pre-print
Moreover, we may have only compressed images at training time but original images at inference time, or vice versa; in such cases, the downstream model suffers from covariate shift  ...  Our method is agnostic to both the particular image compression method and the downstream task, and has the advantage of adding no extra cost to the deployed models, which is particularly important  ...  Antonio acknowledges the support of project TIN2017-88709-R (MINECO/AEI/FEDER, UE) and the ICREA Academia programme.  ... 
arXiv:2004.10497v2 fatcat:kpvj5dip2vfybkzgasngz5gl5i

Adaptive Lossless Image Data Compression Method Inferring Data Entropy by Applying Deep Neural Network

Shinichi Yamagiwa, Wenjia Yang, Koichi Wada
2022 Electronics  
The method infers an appropriate compression program for each data block of the input and achieves a good compression ratio without attempting to compress the entire dataset at once  ...  Against this background, this paper proposes a method using principal component analysis (PCA) and a deep neural network (DNN) to predict the entropy of the data to be compressed.  ...  These compress the pixel colors in the image data by inferring the distribution of the colors.  ... 
doi:10.3390/electronics11040504 fatcat:glcl44i5nrhs7elju7jdazodfy
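The block entropy that this paper's PCA+DNN predictor estimates can also be computed empirically once a block is in hand; a minimal stdlib sketch of entropy-based compressor routing (the threshold value and function names are illustrative, not taken from the paper):

```python
import math
from collections import Counter

def block_entropy(block: bytes) -> float:
    """Empirical Shannon entropy of a byte block, in bits per byte."""
    n = len(block)
    return -sum(c / n * math.log2(c / n) for c in Counter(block).values())

def pick_compressor(block: bytes, threshold: float = 7.0) -> str:
    """Toy selector: near-incompressible (high-entropy) blocks are stored
    raw; low-entropy blocks are routed to a general-purpose coder."""
    return "store" if block_entropy(block) >= threshold else "lzma"
```

For instance, `block_entropy(b"aaaa")` is 0.0 bits/byte, while a block containing all 256 byte values equally often reaches the 8 bits/byte maximum.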

New Directions in Distributed Deep Learning: Bringing the Network at Forefront of IoT Design [article]

Kartikeya Bhardwaj, Wei Chen, Radu Marculescu
2020 arXiv   pre-print
learning algorithms, and (3) Communication-aware distributed inference.  ...  iii) Lack of network-aware deep learning algorithms for distributed inference across multiple IoT devices.  ...  Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright notation thereon.  ... 
arXiv:2008.10805v1 fatcat:kwdkpkt2rjcltbphdlraulqncq

Learning to Learn to Compress [article]

Nannan Zou and Honglei Zhang and Francesco Cricri and Hamed R. Tavakoli and Jani Lainema and Miska Hannuksela and Emre Aksu and Esa Rahtu
2021 arXiv   pre-print
In order to reduce the gap between training and inference conditions, we propose a new training paradigm for learned image compression, which is based on meta-learning.  ...  In this paper we present an end-to-end meta-learned system for image compression.  ...  METHODS In this section we describe in detail both the training phase and the inference phase of our end-to-end learned image compression codec L2C. An overview is provided in Fig. 1.  ... 
arXiv:2007.16054v2 fatcat:gwdkybgsfvckheenw7ksk56r3u

EdgeAI: A Vision for Deep Learning in IoT Era [article]

Kartikeya Bhardwaj, Naveen Suda, Radu Marculescu
2019 arXiv   pre-print
distributed inference.  ...  Specifically, we discuss the existing directions in computation-aware deep learning and describe two new challenges in the IoT era: (1) Data-independent deployment of learning, and (2) Communication-aware  ...  the lack of Big Data at the edge, computation-aware deployment of learning models, and communication-aware distributed inference.  ... 
arXiv:1910.10356v1 fatcat:6df62csanbcldaf5q6y47wymt4

Post-Training Quantization for Cross-Platform Learned Image Compression [article]

Dailan He, Ziming Yang, Yuan Chen, Qi Zhang, Hongwei Qin, Yan Wang
2022 arXiv   pre-print
With our proposed methods, the current state-of-the-art image compression models can infer in a cross-platform consistent manner, which makes the further development and practice of learned image compression  ...  It has been witnessed that learned image compression has outperformed conventional image coding techniques and tends to be practical in industrial applications.  ...  This inconsistency is catastrophic for establishing general-purpose image compression systems, and it is almost inevitable when practicing learned image compression with floating-point arithmetic.  ... 
arXiv:2202.07513v1 fatcat:rckzoazulfcevfzclmnm2tmqle

Learning Scalable ℓ_∞-constrained Near-lossless Image Compression via Joint Lossy Image and Residual Compression [article]

Yuanchao Bai, Xianming Liu, Wangmeng Zuo, Yaowei Wang, Xiangyang Ji
2021 arXiv   pre-print
We propose a novel joint lossy image and residual compression framework for learning ℓ_∞-constrained near-lossless image compression.  ...  lossless image compression, we formulate the joint optimization problem of compressing both the lossy image and the original residual in terms of variational auto-encoders and solve it with end-to-end  ...  Learning-based Lossless Image Compression.  ... 
arXiv:2103.17015v1 fatcat:yxiqg4bzdnbtvmkbqrfbqibbdu
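The ℓ_∞ (near-lossless) guarantee in the title can be met by the classic trick of uniformly quantizing the residual with step 2τ+1; a minimal sketch of that bound alone (the paper's learned lossy coder and entropy model are not reproduced here, and the function names are illustrative):

```python
def quantize_residual(r: int, tau: int) -> int:
    """Map an integer residual to a bin index, bin width 2*tau + 1."""
    return (r + tau) // (2 * tau + 1)

def dequantize_residual(q: int, tau: int) -> int:
    """Reconstruct the bin center; |r - r_hat| never exceeds tau."""
    return q * (2 * tau + 1)

def reconstruct(lossy_pixel: int, r: int, tau: int) -> int:
    """Near-lossless pixel: lossy pixel plus the coded residual."""
    return lossy_pixel + dequantize_residual(quantize_residual(r, tau), tau)
```

With τ = 0 the quantizer is the identity and the scheme degenerates to lossless coding, which matches the scalable lossless-to-near-lossless behavior the abstract describes.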

Improving Inference for Neural Image Compression [article]

Yibo Yang, Robert Bamler, Stephan Mandt
2021 arXiv   pre-print
We consider the problem of lossy image compression with deep latent variable models.  ...  State-of-the-art methods build on hierarchical variational autoencoders (VAEs) and learn inference networks to predict a compressible latent representation of each data point.  ...  Furthermore, this work was supported by the National Science Foundation under Grants 1928718, 2003237 and 2007719, and by Qualcomm. Stephan Mandt consulted for Disney and Google.  ... 
arXiv:2006.04240v4 fatcat:2c45ftyttfacfdwfmu3pshjqfi

AdaCompress

Hongshan Li, Yu Guo, Zhi Wang, Shutao Xia, Wenwu Zhu
2019 Proceedings of the 27th ACM International Conference on Multimedia - MM '19  
JPEG has been used as the de facto compression and encapsulation method before one uploads images, due to its wide adoption.  ...  In particular, we design an agent that adaptively chooses the compression level according to the input image's features and the backend deep learning models.  ...  We tested the DRL agent's inference time and compressed file size for batches of images, and simulated the latency of uploading such compressed images.  ... 
doi:10.1145/3343031.3350874 dblp:conf/mm/LiGWX019 fatcat:tqjvtsoddbgixoykp5xhgbhybi
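AdaCompress's selector is a trained DRL agent; as a toy illustration of the underlying idea of conditioning the JPEG quality factor on cheap image features (the variance thresholds and quality values below are invented for this sketch, not taken from the paper):

```python
def variance(pixels: list) -> float:
    """Variance of a flat list of grayscale pixel intensities."""
    n = len(pixels)
    mean = sum(pixels) / n
    return sum((p - mean) ** 2 for p in pixels) / n

def choose_jpeg_quality(pixels: list) -> int:
    """Toy stand-in for the DRL agent: spend more bits (a higher JPEG
    quality factor) on high-variance, detail-rich inputs."""
    v = variance(pixels)
    if v < 100:
        return 35   # nearly flat image: compress aggressively
    if v < 2000:
        return 65   # moderate texture
    return 85       # busy image: preserve detail for the backend model
```

The real agent replaces this hand-set rule with a policy learned from the backend model's accuracy feedback, but the interface is the same: features in, quality level out.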

Towards Image Understanding from Deep Compression without Decoding [article]

Robert Torfason, Fabian Mentzer, Eirikur Agustsson, Michael Tschannen, Radu Timofte, Luc Van Gool
2018 arXiv   pre-print
Since the encoders and decoders in DNN-based compression methods are neural networks with feature-maps as internal representations of the images, we directly integrate these with architectures for image  ...  We find that inference from compressed representations is particularly advantageous compared to inference from compressed RGB images for aggressive compression rates.  ...  The encoder might learn features relevant for inference purely by training on the compression task, and can be forced to learn these features by training on the compression and inference tasks jointly.  ... 
arXiv:1803.06131v1 fatcat:2kmewh6wpbdajbl7ut2jwfdqqq

Communicate to Learn at the Edge [article]

Deniz Gunduz, David Burth Kurka, Mikolaj Jankowski, Mohammad Mohammadi Amiri, Emre Ozfatura, Sreejith Sreekumar
2020 arXiv   pre-print
In this paper, we challenge the current approach that treats these problems separately, and argue for a joint communication and learning paradigm for both the training and inference stages of edge learning  ...  Two factors that are critical for the success of ML algorithms are massive amounts of data and processing power, both of which are plentiful, yet highly distributed at the network edge.  ...  We highlight that the conventional scheme of transmitting the query images with the best possible quality (ignoring the learning task), and then applying the re-ID baseline on the reconstructed image is  ... 
arXiv:2009.13269v1 fatcat:t6dcbiwzffbrveyyfzyzstb5ea

Compressive Image Recovery Using Recurrent Generative Model [article]

Akshat Dave, Anil Kumar Vadathya, Kaushik Mitra
2017 arXiv   pre-print
We perform MAP inference with RIDE using back-propagation to the inputs and a projected gradient method. We propose an entropy-thresholding-based approach for preserving texture in images.  ...  Recurrent networks can model long-range dependencies in images and hence are suitable for handling global multiplexing in reconstruction from compressive imaging.  ...  GANs learn the ability to generate a plausible sample from the distribution of natural images.  ... 
arXiv:1612.04229v2 fatcat:37o4wrb6brhtbom44iffvi7v5a
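In standard notation (not verbatim from the paper), the projected-gradient MAP update mentioned in the abstract alternates a gradient ascent step on the image prior with a projection onto the set of measurement-consistent images:

```latex
x^{(t+1)} = P_{\mathcal{C}}\!\left( x^{(t)} + \eta \, \nabla_x \log p_{\mathrm{RIDE}}\!\left(x^{(t)}\right) \right),
\qquad
\mathcal{C} = \{\, x : \Phi x = y \,\},
\qquad
P_{\mathcal{C}}(z) = z - \Phi^{+}\!\left(\Phi z - y\right),
```

where Φ is the compressive measurement matrix, y the measurements, η a step size, and Φ⁺ the pseudoinverse realizing the projection onto the affine constraint set.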

2019 Index IEEE Journal on Emerging and Selected Topics in Circuits and Systems Vol. 9

2019 IEEE Journal on Emerging and Selected Topics in Circuits and Systems  
., +, JETCAS Dec. 2019 623-634 Image sensors Byzantine-Tolerant Inference in Distributed Deep Intelligent System: Challenges and Opportunities.  ...  Chung, S., +, JETCAS Sept. 2019 544-561 Distributed sensors Byzantine-Tolerant Inference in Distributed Deep Intelligent System: Challenges and Opportunities.  ... 
doi:10.1109/jetcas.2019.2958462 fatcat:faydtl5ymjfcxdc2nfrkgg7nxi

Structured Bayesian Compression for Deep models in mobile enabled devices for connected healthcare [article]

Sijia Chen, Bin Song, Xiaojiang Du, Nadra Guizani
2019 arXiv   pre-print
However, energy-cost effectiveness and computational efficiency are important prerequisites for developing and deploying mobile-enabled devices, the mainstream trend in connected healthcare.  ...  Also, considering Bayesian inference learning with sparsity-inducing priors, only a few blocks contain weights with a widely distributed non-zero distribution among all blocks.  ...  For the experiment on the HCD dataset, 144,000 and 16,000 images, with equal numbers from each class, are used for training and testing, respectively.  ... 
arXiv:1902.05429v1 fatcat:j2jdl4y5gfaanpbmpp4m7nyjby

Learning sparse codes from compressed representations with biologically plausible local wiring constraints [article]

Kion Fallah, Adam A. Willats, Ninghao Liu, Christopher J. Rozell
2020 bioRxiv   pre-print
We show analytically and empirically that unsupervised learning of sparse representations can be performed in the compressed space despite significant local wiring constraints in compression matrices of  ...  error, and are consistent across many measures with measured macaque V1 receptive fields.  ...  Acknowledgments and Disclosure of Funding This work is partially supported by NSF CAREER award CCF-1350954.  ... 
doi:10.1101/2020.10.23.352443 fatcat:2365knlsyrhypjmdoachdxvcn4
Showing results 1 — 15 out of 47,958 results