A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Switchable Whitening for Deep Representation Learning [article]
2019 · arXiv pre-print
Unlike existing works that design normalization techniques for specific tasks, we propose Switchable Whitening (SW), which provides a general form unifying different whitening methods as well as standardization ...
For example, without bells and whistles, we achieve state-of-the-art performance with 45.33% mIoU on the ADE20K dataset. Code is available at https://github.com/XingangPan/Switchable-Whitening. ...
Overall, our contributions are summarized as follows. (1) We propose Switchable Whitening (SW), which unifies existing whitening and standardization methods in a general form and learns to switch among ...
arXiv:1904.09739v4 · fatcat:ebksyentlvb63g75gnwa6ihea4
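The snippet above describes SW as a general form that unifies whitening and standardization statistics and learns to switch among them. A minimal NumPy sketch of that idea (illustrative only, not the authors' implementation; the two-way batch-whitening/instance-whitening switch and all names here are assumptions):

```python
import numpy as np

def sw_forward(x, w_logits, eps=1e-5):
    """Sketch of Switchable Whitening for x of shape (N, C, L):
    fuse batch-whitening (BW) and instance-whitening (IW) statistics
    with softmax weights, then ZCA-whiten with the fused covariance."""
    w = np.exp(w_logits) / np.exp(w_logits).sum()        # softmax over {BW, IW}
    N, C, L = x.shape
    mu_bw = x.mean(axis=(0, 2), keepdims=True)           # (1, C, 1), shared by the batch
    xb = (x - mu_bw).transpose(1, 0, 2).reshape(C, -1)
    cov_bw = xb @ xb.T / (N * L)                         # (C, C)
    mu_iw = x.mean(axis=2, keepdims=True)                # (N, C, 1), one per sample
    xi = x - mu_iw
    cov_iw = np.einsum('ncl,nkl->nck', xi, xi) / L       # (N, C, C)
    mu = w[0] * mu_bw + w[1] * mu_iw                     # fused statistics
    cov = w[0] * cov_bw[None] + w[1] * cov_iw + eps * np.eye(C)
    out = np.empty_like(x)
    for n in range(N):                                   # ZCA: Sigma^{-1/2} (x - mu)
        vals, vecs = np.linalg.eigh(cov[n])
        out[n] = vecs @ np.diag(vals ** -0.5) @ vecs.T @ (x[n] - mu[n])
    return out
```

When the learned logits collapse onto one statistic, this reduces to plain BW or IW; intermediate weights interpolate between the two sets of statistics before a single whitening step.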
Normalization Techniques in Training DNNs: Methodology, Analysis and Application [article]
2020 · arXiv pre-print
Normalization techniques are essential for accelerating the training and improving the generalization of deep neural networks (DNNs), and have successfully been used in various applications. ...
In doing so, we provide insight for designing new normalization techniques. ...
While their deep and complex structure provides them with powerful representation capacity and appealing advantages in learning feature hierarchies, it also makes their training difficult [3], [4]. ...
arXiv:2009.12836v1 · fatcat:fei3jdfm2rajfdzqdmjghmmjsq
Group Whitening: Balancing Learning Efficiency and Representational Capacity [article]
2021 · arXiv pre-print
The merits of BN in improving a model's learning efficiency can be further amplified by applying whitening, while its drawbacks in estimating population statistics for inference can be avoided through ...
Batch normalization (BN) is an important technique commonly incorporated into deep learning models to perform standardization within mini-batches. ...
[20] proposed batch whitening (BW) for deep models, which performs whitening on the activations of each layer within a mini-batch. ...
arXiv:2009.13333v4 · fatcat:7ahb5vtfibcpnattgwrgpw3eem
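The batch whitening (BW) operation this entry refers to, whitening the activations of a layer within a mini-batch, reduces per layer to a ZCA transform of the centered activations. A hedged NumPy sketch (an illustration of the general technique, not the cited paper's code):

```python
import numpy as np

def batch_zca_whiten(x, eps=1e-5):
    """ZCA batch whitening sketch for x of shape (N, C): center the
    mini-batch, then multiply by Sigma^{-1/2} so the output has
    (approximately) zero mean and identity covariance."""
    mu = x.mean(axis=0, keepdims=True)
    xc = x - mu
    cov = xc.T @ xc / x.shape[0] + eps * np.eye(x.shape[1])
    vals, vecs = np.linalg.eigh(cov)                  # Sigma = V diag(vals) V^T
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T  # Sigma^{-1/2}
    return xc @ inv_sqrt
```

Group whitening, as the abstract suggests, applies this kind of transform within channel groups rather than over all channels at once, which bounds the size of the covariance matrix and eases the estimation of population statistics.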
Learning to Optimize Domain Specific Normalization for Domain Generalization [article]
2020 · arXiv pre-print
The normalization statistics are tracked separately for each normalization type where necessary. ...
Our approach employs multiple normalization methods while learning separate affine parameters per domain. ...
Note that whitening is also performed for each domain. ...
arXiv:1907.04275v3 · fatcat:4lrn5su73fa7jjl47o52jnggre
An Investigation into the Stochasticity of Batch Whitening [article]
2020 · arXiv pre-print
A full understanding of the process has been a central target in the deep learning communities. ...
Based on our analysis, we provide a framework for designing and comparing BW algorithms in different scenarios. ...
Acknowledgement We thank Anna Hennig and Ying Hu for their help with proofreading. ...
arXiv:2003.12327v1 · fatcat:7m3aoov6rvhgpfpvzxduk5dssi
Error Autocorrelation Objective Function for Improved System Modeling [article]
2021 · arXiv pre-print
Deep learning models are trained to minimize the error between the model's output and the actual values. ...
We also look at spatial correlation in vision autoencoders to demonstrate that the whitening objective functions lead to much better extrapolation, a property very desirable for reliable control systems ...
for results of the various deep models on an interpolation validation set.) ...
arXiv:2008.03582v2 · fatcat:6ha7l7wmzbfpnjoote7zuvghzu
RobustNet: Improving Domain Generalization in Urban-Scene Segmentation via Instance Selective Whitening [article]
2021 · arXiv pre-print
To address this issue, this paper proposes a novel instance selective whitening loss to improve the robustness of the segmentation networks for unseen domains. ...
Enhancing the generalization capability of deep neural networks to unseen domains is crucial for safety-critical applications in the real world such as autonomous driving. ...
to learn domain-agnostic feature representations. ...
arXiv:2103.15597v2 · fatcat:3w25pgo3vff75dabqgpicr4vtu
Channel Equilibrium Networks for Learning Deep Representation [article]
2020 · arXiv pre-print
However, this work shows that the combination of normalization and rectified linear function leads to inhibited channels, which have small magnitude and contribute little to the learned feature representation ...
Switchable whitening for deep representation learning. Proceedings of the IEEE International Conference on Computer Vision, 2019.
Pečarić, J. Power matrix means and related inequalities. ...
arXiv:2003.00214v1 · fatcat:tfrfhunicffl5bvg4ur35knkzm
Semantic-Aware Domain Generalized Segmentation [article]
2022 · arXiv pre-print
Deep models trained on source domain lack generalization when evaluated on unseen target domains with different data distributions. ...
The problem becomes even more pronounced when we have no access to target domain samples for adaptation. ...
As shown in Fig. 1 , for each backbone network, we impose SAN and SAW after stage 1 and stage 2. ...
arXiv:2204.00822v1 · fatcat:r6exzrn3dzf5tjpdfw2bddxpbi
Exploiting Invariance in Training Deep Neural Networks [article]
2021 · arXiv pre-print
The resulting algorithm requires less parameter tuning, trains well with an initial learning rate 1.0, and easily generalizes to different tasks. ...
Inspired by two basic mechanisms in animal visual systems, we introduce a feature transform technique that imposes invariance properties in the training of deep neural networks. ...
We compare our results with IterNorm and Switchable Whitening (SW) in various settings according to their code availability. ...
arXiv:2103.16634v2 · fatcat:qrswp3arlffxti4rvuph6tud6e
Differentiable Learning-to-Normalize via Switchable Normalization [article]
2019 · arXiv pre-print
We address a learning-to-normalize problem by proposing Switchable Normalization (SN), which learns to select different normalizers for different normalization layers of a deep neural network. ...
We hope SN will help ease the usage and understanding of normalization techniques in deep learning. The code of SN has been made available at https://github.com/switchablenorms/. ...
., 2019; Luo, 2017a;b), such as sparse SN and switchable whitening (Pan et al., 2019). ...
arXiv:1806.10779v5 · fatcat:rwgbvjsp3rc3pblj4rezzsyb3i
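Where SW switches among whitening statistics, the SN described in this entry switches among standardization statistics. A minimal sketch of that mechanism, assuming a softmax blend of BN, IN and LN statistics (illustrative; SN as published learns separate weights for means and variances and adds affine parameters):

```python
import numpy as np

def sn_forward(x, w_logits, eps=1e-5):
    """Switchable Normalization sketch for x of shape (N, C, L): blend
    BN, IN and LN means/variances with softmax weights, then standardize."""
    w = np.exp(w_logits) / np.exp(w_logits).sum()  # softmax over {BN, IN, LN}
    stats = [
        (x.mean(axis=(0, 2), keepdims=True), x.var(axis=(0, 2), keepdims=True)),  # BN: per channel
        (x.mean(axis=2, keepdims=True),      x.var(axis=2, keepdims=True)),       # IN: per (sample, channel)
        (x.mean(axis=(1, 2), keepdims=True), x.var(axis=(1, 2), keepdims=True)),  # LN: per sample
    ]
    mu = sum(wk * m for wk, (m, _) in zip(w, stats))
    var = sum(wk * v for wk, (_, v) in zip(w, stats))
    return (x - mu) / np.sqrt(var + eps)
```

Because the weights are differentiable, each normalization layer can learn a different mixture end-to-end, which is the "learning-to-normalize" idea the abstract describes.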
Do Normalization Layers in a Deep ConvNet Really Need to Be Distinct? [article]
2018 · arXiv pre-print
This work investigates a perspective for deep learning: whether different normalization layers in a ConvNet require different normalizers. ...
We allow each convolutional layer to be stacked before a switchable normalization (SN) that learns to choose a normalizer from a pool of normalization methods. ...
Many convolutional layers are required to learn functions of a deep teacher. ...
arXiv:1811.07727v1 · fatcat:4uzmszzmgzgsnh7a6y7lzzymaa
Delving into the Estimation Shift of Batch Normalization in a Network [article]
2022 · arXiv pre-print
Batch normalization (BN) is a milestone technique in deep learning. It normalizes the activation using mini-batch statistics during training but the estimated population statistics during inference. ...
Our primary observation is that the estimation shift can accumulate due to the stacking of BN layers in a network, which has detrimental effects on test performance. ...
We compare our method to batch whitening (BW) [14], switchable whitening (SW) [34] and group whitening (GW) [15]. Note that the ResNet-50 (ResNet-101) with GW used in paper [15] Table A4 . ...
arXiv:2203.10778v1 · fatcat:2vrr73fxrfhx5bkrr2g3bcysp4
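The train/inference asymmetry this abstract describes is easy to reproduce: BN standardizes with mini-batch statistics during training but with EMA-estimated population statistics at inference, so any gap between the two (the "estimation shift") changes the test-time transform. A minimal sketch, with illustrative names rather than the paper's code:

```python
import numpy as np

class BatchNorm1d:
    """Minimal BN sketch: mini-batch statistics in training mode,
    EMA-estimated population statistics in inference mode."""
    def __init__(self, num_features, momentum=0.1, eps=1e-5):
        self.running_mean = np.zeros(num_features)
        self.running_var = np.ones(num_features)
        self.momentum, self.eps = momentum, eps

    def __call__(self, x, training):
        if training:
            mu, var = x.mean(axis=0), x.var(axis=0)
            # exponential moving average of population statistics
            self.running_mean += self.momentum * (mu - self.running_mean)
            self.running_var += self.momentum * (var - self.running_var)
        else:
            # inference: the stored estimates, not the batch, define the transform
            mu, var = self.running_mean, self.running_var
        return (x - mu) / np.sqrt(var + self.eps)
```

If test-time inputs no longer match the distribution the EMA estimates were accumulated on, the inference-mode transform is wrong, and the paper's observation is that this mismatch can compound across stacked BN layers.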
Towards Understanding Regularization in Batch Normalization [article]
2019 · arXiv pre-print
Second, the learning dynamics of BN and the regularization show that training converges with a large maximum and effective learning rate. ...
Switchable whitening for deep representation learning. In arXiv:1904.09739, 2019. Salah Rifai, Xavier Glorot, Yoshua Bengio, and Pascal Vincent. ...
(Luo, 2017b;a), switchable normalization (Luo et al., 2019; 2018; Shao et al., 2019), and switchable whitening (Pan et al., 2019). Anders Krogh and John A. Hertz. ...
arXiv:1809.00846v4 · fatcat:xgqkyvrbpnao5fhi4r7qvdc224
Table of Contents
2019 · 2019 IEEE/CVF International Conference on Computer Vision (ICCV)
Switchable Whitening for Deep Representation Learning 1863 Xingang Pan (The Chinese University of Hong Kong), Xiaohang Zhan (The Chinese University of Hong Kong), Jianping Shi (SenseTime Group Limited), ...
Switzerland) ERL-Net: Entangled Representation Learning for Single Image De-Raining 5643 Guoqing Wang (UNSW; CSIRO Data61), Changming Sun (CSIRO Data61), and Arcot Sowmya (UNSW) Perceptual Deep Depth Super-Resolution ...
doi:10.1109/iccv.2019.00004 · fatcat:5aouo4scprc75c7zetsimylj2y
Showing results 1 — 15 out of 29 results