
Dual Supervised Learning [article]

Yingce Xia, Tao Qin, Wei Chen, Jiang Bian, Nenghai Yu, Tie-Yan Liu
2017 arXiv   pre-print
For ease of reference, we call the proposed approach dual supervised learning.  ...  Many supervised learning tasks emerge in dual forms, e.g., English-to-French translation vs.  ...  We call this new learning scheme dual supervised learning (abbreviated as DSL).  ... 
arXiv:1707.00415v1 fatcat:tnaomlfynnd4fe5k6lbk5o2blu
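
The duality exploited by DSL can be summarized by the probabilistic constraint P(x)P(y|x) = P(y)P(x|y) for a paired sample (x, y), enforced as a soft regularizer alongside the two supervised losses. Below is a minimal sketch of such a regularizer, assuming PyTorch tensors of per-example log-probabilities; the function name, the marginal estimates, and the weight lambda_dual are illustrative assumptions, not values taken from the paper.

```python
import torch

def dsl_loss(log_p_y_given_x, log_p_x_given_y, log_px, log_py,
             primal_nll, dual_nll, lambda_dual=0.01):
    """Hypothetical sketch of a dual-supervised-learning objective.

    Combines the primal and dual negative log-likelihoods with a soft
    penalty pushing log P(x) + log P(y|x) towards log P(y) + log P(x|y).
    All arguments are per-example tensors of log-probabilities / losses;
    log_px and log_py would come from external marginal models (e.g., LMs).
    """
    duality_gap = (log_px + log_p_y_given_x) - (log_py + log_p_x_given_y)
    duality_penalty = lambda_dual * duality_gap.pow(2).mean()
    return primal_nll.mean() + dual_nll.mean() + duality_penalty
```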

Understanding Self-supervised Learning with Dual Deep Networks [article]

Yuandong Tian and Lantao Yu and Xinlei Chen and Surya Ganguli
2021 arXiv   pre-print
We propose a novel theoretical framework to understand contrastive self-supervised learning (SSL) methods that employ dual pairs of deep ReLU networks (e.g., SimCLR).  ...  HLTM) and prove that the hidden neurons of deep ReLU networks can learn the latent variables in HLTM, despite the fact that the network receives no direct supervision from these unobserved latent variables  ...  An analogy between self-supervised and supervised learning: the dual network scenario.  ... 
arXiv:2010.00578v6 fatcat:7l45kjpsn5bv3cxmm5vxqyzqyi
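
For context, the contrastive SSL methods analyzed here (e.g., SimCLR) train two augmented views of each sample to agree under an InfoNCE-style objective. A minimal sketch of that objective is shown below; the temperature value and the single-direction formulation are simplifying assumptions, not details from this paper.

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.5):
    """Minimal InfoNCE / NT-Xent sketch for two batches of embeddings.

    z1[i] and z2[i] are embeddings of two augmented views of sample i;
    every other pairing in the batch is treated as a negative.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature           # scaled cosine similarities
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)      # positives lie on the diagonal
```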

Dual Learning for Semi-Supervised Natural Language Understanding [article]

Su Zhu, Ruisheng Cao, Kai Yu
2020 arXiv   pre-print
To solve this data sparsity problem, previous work based on semi-supervised learning mainly focuses on exploiting unlabeled sentences.  ...  The framework is composed of dual pseudo-labeling and dual learning methods, which enable an NLU model to make full use of data (labeled and unlabeled) through a closed loop of the primal and dual tasks  ...  Dual semi-supervised NLU: In this section, we describe our dual semi-supervised framework for the NLU task, which contains two methods: dual pseudo-labeling and dual learning.  ... 
arXiv:2004.12299v1 fatcat:ymineuzvhfdfroz7vgasqjzypm
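
The closed loop described here pairs an NLU model (text to semantic frame) with its dual NLG model (frame to text) and lets each produce pseudo-labels for the other on unlabeled data. The following is a schematic, assumption-heavy sketch of that loop; the model interfaces (`nlu`, `nlg`, their `predict` and `train_step` methods) are placeholders of mine, not the paper's API.

```python
# Schematic sketch of dual pseudo-labeling between an NLU model (text -> frame)
# and an NLG model (frame -> text). `nlu` and `nlg` are assumed to expose
# `predict` and `train_step`; this mirrors the closed-loop idea only.

def dual_pseudo_labeling_epoch(nlu, nlg, labeled_pairs,
                               unlabeled_texts, unlabeled_frames):
    # Supervised updates on labeled (text, frame) pairs for both directions.
    for text, frame in labeled_pairs:
        nlu.train_step(text, frame)
        nlg.train_step(frame, text)

    # NLU pseudo-labels unlabeled text, which then supervises NLG.
    for text in unlabeled_texts:
        pseudo_frame = nlu.predict(text)
        nlg.train_step(pseudo_frame, text)

    # NLG pseudo-labels unlabeled frames, which then supervises NLU.
    for frame in unlabeled_frames:
        pseudo_text = nlg.predict(frame)
        nlu.train_step(pseudo_text, frame)
```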

Dual Supervised Learning for Natural Language Understanding and Generation [article]

Shang-Yu Su, Chao-Wei Huang, Yun-Nung Chen
2020 arXiv   pre-print
This paper proposes a new learning framework for language understanding and generation on top of dual supervised learning, providing a way to exploit the duality.  ...  However, such a dual relationship has not been investigated in the literature.  ...  The training strategy is based on standard supervised learning and incorporates the probability duality constraint, the so-called dual supervised learning.  ... 
arXiv:1905.06196v4 fatcat:46t3sbaa6reldkfr3pwsc2rwle

Semi-supervised representation learning via dual autoencoders for domain adaptation

Shuai Yang, Hao Wang, Yuhong Zhang, Peipei Li, Yi Zhu, Xuegang Hu
2019 Knowledge-Based Systems  
To address this problem, we propose a novel Semi-Supervised Representation Learning framework via Dual Autoencoders for domain adaptation, named SSRLDA.  ...  Recently, many deep learning approaches based on autoencoders have achieved significant performance in domain adaptation.  ...  To address the above issues, we propose a novel Semi-Supervised Representation Learning framework via Dual Autoencoders (SSRLDA) for domain adaptation.  ... 
doi:10.1016/j.knosys.2019.105161 fatcat:g4jqs7tgdfeb7oot44aqeqsuxa

Dual-Task Mutual Learning for Semi-Supervised Medical Image Segmentation [article]

Yichi Zhang, Jicong Zhang
2021 arXiv   pre-print
In this paper, we propose a novel dual-task mutual learning framework for semi-supervised medical image segmentation.  ...  Semi-supervised learning has attracted much attention in medical image segmentation by taking advantage of unlabeled data, which is much easier to acquire.  ...  Table 1: Comparison of different supervised loss functions for our dual-task mutual learning framework.  ... 
arXiv:2103.04708v2 fatcat:k3wscvvunjawvlm6jeqauhceia

Learning Dual Retrieval Module for Semi-supervised Relation Extraction [article]

Hongtao Lin, Jun Yan, Meng Qu, Xiang Ren
2019 arXiv   pre-print
Relation extraction is an important task in structuring the content of text data, and becomes especially challenging when learning with weak supervision, where only a limited number of labeled sentences are  ...  In this paper, we leverage a key insight that retrieving sentences expressing a relation is a dual task of predicting the relation label for a given sentence; the two tasks are complementary to each other and  ...  This section introduces our dual learning approach to semi-supervised relation extraction.  ... 
arXiv:1902.07814v2 fatcat:wv4qtt7wbrfipbp4dh3e7hrowm

GearNet: Stepwise Dual Learning for Weakly Supervised Domain Adaptation [article]

Renchunzi Xie, Hongxin Wei, Lei Feng, Bo An
2022 arXiv   pre-print
This interactive learning schema enables implicit label noise canceling and exploits correlations between the source and target domains.  ...  This paper studies the weakly supervised domain adaptation (WSDA) problem, where we only have access to the source domain with noisy labels, from which we need to transfer useful information to the unlabeled  ...  Each model is trained with a supervised learning loss on one domain, and a symmetric Kullback-Leibler divergence loss to mimic the predictions of its dual model on the other domain, where z** is the  ... 
arXiv:2201.06001v2 fatcat:l2xslolx4bdg5ami5hdrmpviam
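
The symmetric KL term mentioned above can be illustrated with a short sketch: each model is fit to its own domain's (noisy) labels while also matching the predictions of its dual model. The code below is an assumption-laden illustration (function names and the mimicry weight are mine), not GearNet's actual implementation.

```python
import torch.nn.functional as F

def symmetric_kl(logits_a, logits_b):
    """Symmetric Kullback-Leibler divergence between two models' predictions.

    Returns KL(p_a || p_b) + KL(p_b || p_a), averaged over the batch.
    F.kl_div expects log-probabilities as input and probabilities as target.
    """
    log_pa = F.log_softmax(logits_a, dim=1)
    log_pb = F.log_softmax(logits_b, dim=1)
    pa, pb = log_pa.exp(), log_pb.exp()
    return (F.kl_div(log_pa, pb, reduction="batchmean")
            + F.kl_div(log_pb, pa, reduction="batchmean"))

def dual_model_loss(logits, dual_logits, labels, mimic_weight=1.0):
    """Supervised loss on one domain plus mimicry of the dual model's outputs."""
    return F.cross_entropy(logits, labels) + mimic_weight * symmetric_kl(logits, dual_logits)
```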

Dual Relation Semi-Supervised Multi-Label Learning

Lichen Wang, Yunyu Liu, Can Qin, Gan Sun, Yun Fu
2020 Proceedings of the AAAI Conference on Artificial Intelligence  
To this end, we proposed a Dual Relation Semi-supervised Multi-label Learning (DRML) approach which jointly explores the feature distribution and the label relation.  ...  A dual-classifier domain adaptation strategy is proposed to align features while generating pseudo labels to improve learning performance.  ...  Conclusion: In this paper, we proposed a Dual Relation Multi-label Learning (DRML) approach for the semi-supervised setting.  ... 
doi:10.1609/aaai.v34i04.6089 fatcat:kn3rlarcyfekxnfttiggqrfr3i

Semi-supervised dual graph regularized dictionary learning [article]

Khanh-Hung Tran, Fred-Maurice Ngole-Mboula, Jean-Luc Starck
2018 arXiv   pre-print
In this paper, we propose a semi-supervised dictionary learning method that uses the information in both labelled and unlabelled data and jointly trains a linear classifier embedded on the sparse codes  ...  Numerical experiments: In this section, we apply our approach, Semi-Supervised Dual-Graph regularized Dictionary Learning (SS-DG-DL), on the MNIST dataset, which contains 70,000 images (28×28) of handwritten digits  ...  Conclusion: We introduced a semi-supervised dictionary learning algorithm which, unlike current state-of-the-art supervised dictionary learning methods, makes use of both labelled and unlabelled data in  ... 
arXiv:1812.04456v1 fatcat:4wppqc73hjc7laf4j7uws3nmiq

Dual-stream Multiple Instance Learning Network for Whole Slide Image Classification with Self-supervised Contrastive Learning [article]

Bin Li, Yin Li, Kevin W. Eliceiri
2021 arXiv   pre-print
Second, since WSIs can produce large or unbalanced bags that hinder the training of MIL models, we propose to use self-supervised contrastive learning to extract good representations for MIL and alleviate  ...  First, we introduce a novel MIL aggregator that models the relations of the instances in a dual-stream architecture with trainable distance measurement.  ...  dual-stream architecture.  ... 
arXiv:2011.08939v3 fatcat:igrzeve6ergnlk6bn35peh4kcy

Dual GNNs: Graph Neural Network Learning with Limited Supervision [article]

Abdullah Alchihabi, Yuhong Guo
2021 arXiv   pre-print
By integrating the two modules in a dual GNN learning framework, we perform joint learning in an end-to-end fashion. This general framework can be applied to many GNN baseline models.  ...  In this paper, we propose a novel Dual GNN learning framework to address this challenging task. The proposed framework has two GNN-based node prediction modules.  ...  Dual GNN learning framework: In this section, we present the proposed Dual GNN learning framework for semi-supervised node classification, which aims to empower a given standard GNN base model to handle  ... 
arXiv:2106.15755v1 fatcat:st4boznnercl5jh6dqoqqinaqy

Dual supervised learning for non-native speech recognition

Kacper Radzikowski, Robert Nowak, Le Wang, Osamu Yoshie
2019 EURASIP Journal on Audio, Speech, and Music Processing  
In this paper, we address this issue by employing dual supervised learning (DSL) and reinforcement learning with policy gradient methodology.  ...  The methodology we used in our experiments is based on the dual supervised learning (DSL) technique [12].  ...  Abbreviations: ASR: automatic speech recognition; DSL: dual supervised learning; LSTM: long short-term memory (network); M_STT: speech recognition model; M_TTS: speech synthesis model; M_L: language  ... 
doi:10.1186/s13636-018-0146-4 fatcat:pnfntaxdjjcidk5t75waucffbu

Weakly supervised learning of indoor geometry by dual warping [article]

Pulak Purkait and Ujwal Bonde and Christopher Zach
2018 arXiv   pre-print
In this work we address the task of 3D prediction, especially for indoor scenes, by leveraging only weak supervision. In the literature, 3D scene prediction is usually solved via a 3D voxel grid.  ...  Note that no direct supervision for the source and target flow is provided; we let the network learn the flow directions for which the loss is minimum.  ...  Thus, we replace the strongly supervised task by a weakly supervised one, which, as a by-product, turns out to be less challenging in terms of problem difficulty (see below).  ... 
arXiv:1808.03609v1 fatcat:ih7l2ohjbrdjpcjmun763yzyia

ADT-SSL: Adaptive Dual-Threshold for Semi-Supervised Learning [article]

Zechen Liang, Yuan-Gen Wang, Wei Lu, Xiaochun Cao
2022 arXiv   pre-print
This paper proposes an Adaptive Dual-Threshold method for Semi-Supervised Learning (ADT-SSL).  ...  Semi-Supervised Learning (SSL) has advanced classification tasks by using both labeled and unlabeled data to train a model jointly.  ...  Motivated by the above limitations, we propose an Adaptive Dual-Threshold method for Semi-Supervised Learning (ADT-SSL).  ... 
arXiv:2205.10571v1 fatcat:aneowgdlmvfufcsdlmdezgnqm4
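
As background for the dual-threshold idea, a common SSL pattern keeps only those unlabeled predictions whose confidence clears a threshold and trains on them as pseudo-labels; a dual-threshold variant treats high- and mid-confidence predictions differently. The sketch below is a generic, hypothetical illustration of that pattern (threshold values and the consistency term are my assumptions), not the ADT-SSL algorithm itself.

```python
import torch
import torch.nn.functional as F

def dual_threshold_unlabeled_loss(weak_logits, strong_logits,
                                  high_thr=0.95, low_thr=0.80):
    """Hypothetical dual-threshold pseudo-labeling loss for unlabeled data.

    Predictions above `high_thr` become hard pseudo-labels; predictions
    between `low_thr` and `high_thr` only receive a softer consistency
    penalty; the rest are ignored.
    """
    probs = F.softmax(weak_logits.detach(), dim=1)
    conf, pseudo = probs.max(dim=1)

    hard_mask = conf >= high_thr
    soft_mask = (conf >= low_thr) & ~hard_mask

    loss = weak_logits.new_zeros(())
    if hard_mask.any():
        loss = loss + F.cross_entropy(strong_logits[hard_mask], pseudo[hard_mask])
    if soft_mask.any():
        loss = loss + F.mse_loss(F.softmax(strong_logits[soft_mask], dim=1),
                                 probs[soft_mask])
    return loss
```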
Showing results 1–15 of 101,619.