12,216 Hits in 5.8 sec

Distribution-Based Invariant Deep Networks for Learning Meta-Features [article]

Gwendoline De Bie, Herilalaina Rakotoarison, Gabriel Peyré, Michèle Sebag
2020 arXiv   pre-print
Recent advances in deep learning from probability distributions successfully achieve classification or regression from distribution samples, and are thus invariant under permutation of the samples.  ...  On both tasks, Dida learns meta-features supporting the characterization of a (labelled) dataset.  ...  Distribution-Based Invariant Networks for Meta-Feature Learning: This section describes the core of the proposed distribution-based invariant neural architectures, specifically the mechanism of mapping  ...
arXiv:2006.13708v2 fatcat:rxqidzcapzbyhgzj56hsaoqxai
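
The snippet above describes architectures that map a set of dataset samples to meta-features while staying invariant to the order of the samples. Below is a minimal sketch of that idea in PyTorch, using a Deep-Sets-style mean-pooled encoder; the class name, layer sizes, and toy data are illustrative assumptions, not the Dida architecture.

```python
import torch
import torch.nn as nn

class SetMetaFeatureEncoder(nn.Module):
    """Map a set of dataset samples to a fixed-size meta-feature vector.

    Mean pooling over the sample axis makes the output invariant to the
    order in which the samples are presented.
    """

    def __init__(self, in_dim, hidden_dim=64, meta_dim=32):
        super().__init__()
        self.phi = nn.Sequential(                 # applied to each sample independently
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        self.rho = nn.Linear(hidden_dim, meta_dim)  # applied to the pooled representation

    def forward(self, samples):
        # samples: (n_samples, in_dim) -- one labelled dataset treated as a point cloud
        pooled = self.phi(samples).mean(dim=0)      # permutation-invariant pooling
        return self.rho(pooled)                     # meta-feature vector

# Toy usage: a "dataset" of 100 samples with 5 features plus 1 label column.
dataset = torch.randn(100, 6)
meta_features = SetMetaFeatureEncoder(in_dim=6)(dataset)   # shape: (32,)
```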

Discriminative Adversarial Domain Generalization with Meta-learning based Cross-domain Validation [article]

Keyu Chen, Di Zhuang, J. Morris Chang
2020 arXiv   pre-print
representation on multiple "seen" domains, and (ii) meta-learning based cross-domain validation, which simulates train/test domain shift by applying meta-learning techniques in the training process.  ...  In this paper, we propose Discriminative Adversarial Domain Generalization (DADG) with meta-learning-based cross-domain validation.  ...
arXiv:2011.00444v1 fatcat:odyqjjadrbdwvd46qbel4itoxa
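
The second ingredient in this snippet, meta-learning based cross-domain validation, simulates train/test domain shift by holding out one source domain per episode. Here is a hedged, first-order sketch of one such episode; the model, loss function, and domain containers are placeholders, and the outer update on the original parameters is omitted.

```python
import copy
import random
import torch

def cross_domain_validation_step(model, domains, loss_fn, inner_lr=1e-2):
    """One episode of meta-learning based cross-domain validation.

    `domains` is assumed to be a list of at least two (inputs, labels) pairs,
    one per source domain; one of them is held out to simulate domain shift.
    """
    held_out = random.randrange(len(domains))
    meta_train = [d for i, d in enumerate(domains) if i != held_out]
    x_test, y_test = domains[held_out]

    # Inner step: adapt a copy of the model on the remaining source domains
    # (first-order, so no second-order gradients are tracked).
    adapted = copy.deepcopy(model)
    inner_opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
    inner_loss = sum(loss_fn(adapted(x), y) for x, y in meta_train)
    inner_opt.zero_grad()
    inner_loss.backward()
    inner_opt.step()

    # Outer signal: how well the adapted model transfers to the held-out domain.
    meta_test_loss = loss_fn(adapted(x_test), y_test)
    return inner_loss.item(), meta_test_loss.item()
```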

Learning to Learn with Variational Information Bottleneck for Domain Generalization [article]

Yingjun Du, Jun Xu, Huan Xiong, Qiang Qiu, Xiantong Zhen, Cees G. M. Snoek, Ling Shao
2020 arXiv   pre-print
We introduce a probabilistic meta-learning model for domain generalization, in which classifier parameters shared across domains are modeled as distributions.  ...  To deal with domain shift, we learn domain-invariant representations by the proposed principle of meta variational information bottleneck, which we call MetaVIB.  ...  We introduce the IB principle for domain-invariant representation learning via a stochastic deep neural network.  ...
arXiv:2007.07645v1 fatcat:m2l6kqyqwfdm5bdxrn5lajg7ia
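
The MetaVIB snippet learns domain-invariant representations with a variational information bottleneck realized by a stochastic neural network. The sketch below shows a generic VIB-style stochastic encoder and loss; the architecture and the bottleneck weight `beta` are generic assumptions, not the paper's exact model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VIBEncoder(nn.Module):
    """Stochastic encoder: x -> q(z|x) = N(mu, diag(sigma^2))."""

    def __init__(self, in_dim, z_dim=32):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU())
        self.mu = nn.Linear(128, z_dim)
        self.log_var = nn.Linear(128, z_dim)

    def forward(self, x):
        h = self.backbone(x)
        mu, log_var = self.mu(h), self.log_var(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * log_var)  # reparameterization trick
        return z, mu, log_var

def vib_loss(logits, labels, mu, log_var, beta=1e-3):
    """Cross-entropy plus a KL term that bottlenecks the information kept in z."""
    ce = F.cross_entropy(logits, labels)
    kl = -0.5 * torch.mean(torch.sum(1 + log_var - mu.pow(2) - log_var.exp(), dim=1))
    return ce + beta * kl
```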

Learning to Recommend via Meta Parameter Partition [article]

Liang Zhao, Yang Wang, Daxiang Dong, Hao Tian
2019 arXiv   pre-print
The fixed part, capturing user-invariant features, is shared by all users and is learned during the offline meta-learning stage.  ...  The adaptive part, capturing user-specific features, is learned during the online meta-learning stage.  ...  Deep neural networks (DNNs) have the potential to learn expressive features for accurate CTR prediction. CNN-based models [20] are biased toward interactions between neighboring features.  ...
arXiv:1912.04108v1 fatcat:msk7xhsmbrebliak6vp3lwrzyy
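
Below is a hedged sketch of the parameter partition this snippet describes: a fixed part shared by all users (trained offline) and an adaptive part updated per user online. The layer shapes and names are illustrative, not the paper's recommender architecture.

```python
import torch
import torch.nn as nn

class PartitionedRecommender(nn.Module):
    """Parameters split into a shared 'fixed' part and a per-user 'adaptive' part."""

    def __init__(self, in_dim, hidden_dim=64):
        super().__init__()
        # Fixed part: user-invariant features, shared by all users and
        # trained during the offline meta-learning stage.
        self.shared = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        # Adaptive part: user-specific features, updated online per user.
        self.user_head = nn.Linear(hidden_dim, 1)

    def forward(self, x):
        return torch.sigmoid(self.user_head(self.shared(x)))

model = PartitionedRecommender(in_dim=20)
# Online stage: freeze the shared part and adapt only the user-specific head.
for p in model.shared.parameters():
    p.requires_grad = False
online_opt = torch.optim.SGD(model.user_head.parameters(), lr=1e-2)
```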

Cross Domain Adaptation of Crowd Counting with Model-Agnostic Meta-Learning

Xiaoyu Hou, Jihui Xu, Jinming Wu, Huaiyu Xu
2021 Applied Sciences  
To improve domain adaptation in feature extraction, we propose an adaptive domain-invariant feature extraction module.  ...  CNN-based crowd-counting algorithms usually consist of feature extraction, density estimation, and count regression.  ...  Deep convolutional networks first extract feature maps from images, and the density value of each feature pixel is then predicted.  ...
doi:10.3390/app112412037 fatcat:l6lqcpggzbadrhfrhexkyzdlwa
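
The snippet outlines the usual CNN crowd-counting pipeline: feature extraction, per-pixel density estimation, and count regression by integrating the density map. The following toy sketch illustrates that pipeline only; the tiny backbone is a stand-in, not the paper's adaptive domain-invariant module.

```python
import torch
import torch.nn as nn

class CrowdCounter(nn.Module):
    """Toy crowd-counting pipeline: feature maps -> density map -> count."""

    def __init__(self):
        super().__init__()
        # Feature extraction (stand-in for a deep convolutional backbone).
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
        )
        # Density estimation: one density value per feature pixel.
        self.density_head = nn.Conv2d(16, 1, kernel_size=1)

    def forward(self, images):
        density = torch.relu(self.density_head(self.features(images)))
        count = density.sum(dim=(1, 2, 3))   # count regression = integral of the density map
        return density, count

images = torch.randn(2, 3, 128, 128)         # toy batch of RGB crowd images
density_maps, counts = CrowdCounter()(images)
```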

An Optimization-Based Meta-Learning Model for MRI Reconstruction with Diverse Dataset [article]

Wanyu Bian, Yunmei Chen, Xiaojing Ye, Qingchao Zhang
2021 arXiv   pre-print
The standard benchmarks in meta-learning are challenged by learning on diverse task distributions.  ...  In this model, the learnable regularization function contains a task-invariant common feature encoder and a task-specific learner represented by a shallow network.  ...  In Section 2, we introduce some related work on both optimization-based meta-learning and deep unrolled networks for MRI reconstruction.  ...
arXiv:2110.00715v1 fatcat:bp7z3qyajzcofi5nujekpq76vu

A Bit More Bayesian: Domain-Invariant Learning with Uncertainty [article]

Zehao Xiao, Jiayi Shen, Xiantong Zhen, Ling Shao, Cees G. M. Snoek
2021 arXiv   pre-print
Ablation studies validate the synergistic benefits of our Bayesian treatment when jointly learning domain-invariant representations and classifiers for domain generalization.  ...  In this paper, we address both challenges with a probabilistic framework based on variational Bayesian inference, by incorporating uncertainty into neural network weights.  ...
arXiv:2105.04030v3 fatcat:wetgafpoqraqzmx5lchblqzldq
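
The snippet incorporates uncertainty by treating network weights probabilistically via variational Bayesian inference. Below is a minimal mean-field Bayesian linear layer with a reparameterized weight posterior; the prior choice and parameterization are generic assumptions rather than the paper's exact treatment.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesianLinear(nn.Module):
    """Linear layer with a factorized Gaussian posterior over its weights."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(out_dim, in_dim))
        self.w_log_sigma = nn.Parameter(torch.full((out_dim, in_dim), -3.0))
        self.bias = nn.Parameter(torch.zeros(out_dim))

    def forward(self, x):
        # Sample weights with the reparameterization trick on every forward pass,
        # so predictions reflect the uncertainty stored in the posterior.
        sigma = torch.exp(self.w_log_sigma)
        w = self.w_mu + sigma * torch.randn_like(sigma)
        return F.linear(x, w, self.bias)

    def kl_to_standard_normal(self):
        # KL(q(w) || N(0, I)), added to the loss to keep the posterior near the prior.
        sigma_sq = torch.exp(2 * self.w_log_sigma)
        return 0.5 * torch.sum(sigma_sq + self.w_mu.pow(2) - 1.0 - 2 * self.w_log_sigma)
```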

Knowledge as Invariance – History and Perspectives of Knowledge-augmented Machine Learning [article]

Alexander Sagel and Amit Sahu and Stefan Matthes and Holger Pfeifer and Tianming Qiu and Harald Rueß and Hao Shen and Julian Wörmann
2020 arXiv   pre-print
Major weaknesses of present-day deep learning models are, for instance, their lack of adaptability to changes of environment or their inability to perform other kinds of tasks than the ones they were trained for.  ...  Weight sharing in convolutional layers of deep networks without doubt provides a strong prior for the most common deep learning applications [74].  ...
arXiv:2012.11406v1 fatcat:nnbnsrwfr5fbxg4qj3a4cfajk4

Domain Generalization with Optimal Transport and Metric Learning [article]

Fan Zhou, Zhuqing Jiang, Changjian Shui, Boyu Wang, Brahim Chaib-draa
2020 arXiv   pre-print
Previous domain generalization approaches mainly focused on learning invariant features and stacking the learned features from each source domain to generalize to a new target domain while ignoring the  ...  For this, one possible solution is to constrain the label-similarity when extracting the invariant features and to take advantage of the label similarities for class-specific cohesion and separation of  ...  [24] proposed an end-to-end deep domain generalization approach by leveraging deep neural networks for domain-invariant representation learning.  ... 
arXiv:2007.10573v1 fatcat:kqubwtaw5nf3hpvct6s5ey3hru

Meta-Learning Dynamics Forecasting Using Task Inference [article]

Rui Wang, Robin Walters, Rose Yu
2021 arXiv   pre-print
Current deep learning models for dynamics forecasting struggle with generalization.  ...  We propose a model-based meta-learning method called DyAd which can generalize across heterogeneous domains by partitioning them into different tasks.  ...  DyAd: Dynamic Adaptation Network We propose a model-based meta-learning approach for dynamics forecasting.  ... 
arXiv:2102.10271v3 fatcat:vqvloatadrgdneyrcmfgmdnv2y

Revisiting Local Descriptor based Image-to-Class Measure for Few-shot Learning [article]

Wenbin Li, Lei Wang, Jinglin Xu, Jing Huo, Yang Gao, Jiebo Luo
2019 arXiv   pre-print
Instead, we think a local descriptor based image-to-class measure should be taken, inspired by its surprising success in the heydays of local invariant features.  ...  The proposed DN4 not only learns the optimal deep local descriptors for the image-to-class measure, but also utilizes the higher efficiency of such a measure in the case of example scarcity, thanks to  ...  Meta-learning based methods.  ... 
arXiv:1903.12290v2 fatcat:yrft43dj2rbcpbvarsjdpm4uv4
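
DN4's image-to-class measure compares the deep local descriptors of a query image against the pooled descriptors of each class, rather than comparing image-level embeddings. A hedged sketch of such a measure follows, using the sum of top-k cosine similarities; the descriptor shapes and the value of k are illustrative choices.

```python
import torch
import torch.nn.functional as F

def image_to_class_score(query_desc, class_desc, k=3):
    """Local-descriptor-based image-to-class measure.

    query_desc: (n_q, d) deep local descriptors of the query image.
    class_desc: (n_c, d) descriptors pooled from a class's support images.
    For each query descriptor, sum its k largest cosine similarities in the class pool.
    """
    q = F.normalize(query_desc, dim=1)
    c = F.normalize(class_desc, dim=1)
    sims = q @ c.t()                      # (n_q, n_c) cosine similarities
    topk = sims.topk(k, dim=1).values     # k nearest class descriptors per query descriptor
    return topk.sum()

# Toy usage for one class in a few-shot episode.
query = torch.randn(49, 64)               # e.g. a 7x7 feature map flattened into 49 descriptors
support_pool = torch.randn(5 * 49, 64)    # descriptors from 5 support images of the class
score = image_to_class_score(query, support_pool)
```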

Robust Compare Network for Few-Shot Learning

Yixin Yang, Yang Li, Rui Zhang, Jiabao Wang, Zhuang Miao
2020 IEEE Access  
As illustrated in Fig. 2, our method is composed of three key modules: (1) Embedding module: a deep neural network with shift-invariant blocks and a self-attention block to learn invariant visual features  ...  On the other hand, some FSL approaches based on meta-learning follow the key idea of the learning-to-learn mechanism.  ...
doi:10.1109/access.2020.3012720 fatcat:kwgz35ngsrdbbgfg2y46gqfjpm

An Optimization-Based Meta-Learning Model for MRI Reconstruction with Diverse Dataset

Wanyu Bian, Yunmei Chen, Xiaojing Ye, Qingchao Zhang
2021 Journal of Imaging  
We partition these network parameters into two parts: a task-invariant part for the common feature encoder component of the regularization, and a task-specific part to account for the variations in the  ...  In this model, the nonconvex nonsmooth regularization term is parameterized as a structured deep network where the network parameters can be learned from data.  ...  In Section 2, we discuss related work on both optimization-based meta-learning and deep unrolled networks for MRI reconstruction.  ...
doi:10.3390/jimaging7110231 pmid:34821862 pmcid:PMC8621471 fatcat:attame2j5rg2jef3uodbarw7xm
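
The snippet partitions the learned regularization network into a task-invariant common feature encoder and shallow task-specific parts. Below is a minimal sketch of that split used inside one unrolled reconstruction step; the operators, step size, and tensor shapes are placeholders, not the paper's model.

```python
import torch
import torch.nn as nn

class PartitionedRegularizer(nn.Module):
    """Learned regularizer split into task-invariant and task-specific parts."""

    def __init__(self, channels=1, n_tasks=4):
        super().__init__()
        # Task-invariant part: common feature encoder shared across all tasks.
        self.encoder = nn.Sequential(
            nn.Conv2d(channels, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
        )
        # Task-specific part: a shallow head per task (e.g. per contrast or sampling pattern).
        self.heads = nn.ModuleList(nn.Conv2d(16, channels, 1) for _ in range(n_tasks))

    def forward(self, image, task_id):
        return self.heads[task_id](self.encoder(image))

# One unrolled step: x <- x - step * (data-fidelity gradient + regularizer output).
reg = PartitionedRegularizer()
x = torch.randn(1, 1, 64, 64)        # current reconstruction estimate
data_grad = torch.zeros_like(x)      # placeholder for the data-fidelity gradient
x = x - 0.1 * (data_grad + reg(x, task_id=0))
```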

Learning to Warm-Start Bayesian Hyperparameter Optimization [article]

Jungtaek Kim, Saehoon Kim, Seungjin Choi
2018 arXiv   pre-print
To this end, we introduce a Siamese network composed of deep feature and meta-feature extractors, where the deep feature extractor provides a semantic representation of each instance in a dataset and the meta-feature  ...  In this paper, we propose deep metric learning to learn meta-features over datasets such that the similarity over them is effectively measured by the Euclidean distance between their associated meta-features  ...  Feature Extractors Setup: To learn a distance function using a Siamese network, a network composed of a deep feature extractor and a meta-feature extractor is used as each wing of the Siamese network.  ...
arXiv:1710.06219v3 fatcat:nqd3zduti5aehakgnfehbmqghe
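
The snippet learns meta-features with a Siamese network so that the Euclidean distance between meta-features reflects dataset similarity. The following hedged sketch uses a contrastive loss and collapses the deep-feature/meta-feature split into a single encoder; all names, sizes, and the toy data are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MetaFeatureNet(nn.Module):
    """Shared encoder applied to both wings of the Siamese network."""

    def __init__(self, in_dim, meta_dim=16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, meta_dim))

    def forward(self, x):
        return self.net(x)

def contrastive_loss(f_a, f_b, similar, margin=1.0):
    """Pull meta-features of similar datasets together, push dissimilar ones apart."""
    dist = F.pairwise_distance(f_a, f_b)
    return torch.mean(similar * dist.pow(2) +
                      (1 - similar) * F.relu(margin - dist).pow(2))

encoder = MetaFeatureNet(in_dim=10)
a, b = torch.randn(8, 10), torch.randn(8, 10)    # fixed-size summaries of two batches of datasets
similar = torch.randint(0, 2, (8,)).float()      # 1 if the paired datasets are deemed similar
loss = contrastive_loss(encoder(a), encoder(b), similar)
```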

Exploiting Domain-Specific Features to Enhance Domain Generalization [article]

Manh-Ha Bui, Toan Tran, Anh Tuan Tran, Dinh Phung
2021 arXiv   pre-print
Our key insight is to disentangle features in the latent space while jointly learning both domain-invariant and domain-specific features in a unified framework.  ...  The domain-specific representation is optimized through the meta-learning framework to adapt from source domains, targeting a robust generalization on unseen domains.  ...  Distributionally robust neural networks for group shifts: On the importance of regularization for worst-case generalization, 2020.  ... 
arXiv:2110.09410v1 fatcat:koqcioubtvdk7ouacwsu3xi2he
Showing results 1 — 15 out of 12,216 results