Task-Independent Knowledge Makes for Transferable Representations for Generalized Zero-Shot Learning
[article]
2021
arXiv
pre-print
Generalized Zero-Shot Learning (GZSL) targets recognizing new categories by learning transferable image representations. ...
Consequently, the task-specific and task-independent knowledge jointly make for transferable representations of DCEN, which obtains an average 4.1% improvement on four public benchmarks. ...
Thus, a trade-off between task-independent knowledge and task-specific knowledge is critical for building strong transferable representations in GZSL. ...
arXiv:2104.01832v1
fatcat:kh2u4ukuxnh2phlovnswnf52mq
Language Models as Zero-shot Visual Semantic Learners
[article]
2021
arXiv
pre-print
We show that the knowledge encoded in transformer language models can be exploited for tasks requiring visual semantic understanding. The VSEP with contextual representations can distinguish word-level object representations in complicated scenes as a compositional zero-shot learner. ...
Zero-shot Learning with VSEPs Next we test the VSEPs with contextual representations in a zero-shot learning task to explore whether other kinds of knowledge in a language model are learned by a VSEP. ...
arXiv:2107.12021v1
fatcat:7ty373ume5e23drdoczoc6engm
Recent Advances in Zero-shot Recognition
[article]
2017
arXiv
pre-print
One approach to scaling up the recognition is to develop models capable of recognizing unseen categories without any training instances, or zero-shot recognition/learning. ...
We also overview related recognition tasks including one-shot and open set recognition, which can be used as natural extensions of zero-shot recognition when a limited number of class samples becomes available ...
Yanwei Fu is supported by The Program for Professor of Special Appointment (Eastern Scholar) at Shanghai Institutions of Higher Learning. ...
arXiv:1710.04837v1
fatcat:u3mp6dgj2rgqrarjm4dcywegmy
Using Task Descriptions in Lifelong Machine Learning for Improved Performance and Zero-Shot Transfer
[article]
2017
arXiv
pre-print
Given only the descriptor for a new task, the lifelong learner is also able to accurately predict a model for the new task through zero-shot learning using the coupled dictionary, eliminating the need ...
Knowledge transfer between tasks can improve the performance of learned models, but requires an accurate estimate of the inter-task relationships to identify the relevant knowledge to transfer. ...
We would like to thank the anonymous reviewers of the conference version of this paper for their helpful feedback. ...
arXiv:1710.03850v1
fatcat:y42nqcdh4zah5csz5fmp2ylkpe
Non-generative Generalized Zero-shot Learning via Task-correlated Disentanglement and Controllable Samples Synthesis
[article]
2022
arXiv
pre-print
Synthesizing pseudo samples is currently the most effective way to solve the Generalized Zero-Shot Learning (GZSL) problem. ...
In addition, to describe the new scenario in which only limited seen-class samples are available during training, we further formulate a new ZSL task named the 'Few-shot Seen class and Zero-shot Unseen class learning ...
Related Works
Generalized Zero-Shot Learning Domain shift is a basic problem in GZSL. ...
arXiv:2203.05335v3
fatcat:w74w2jzp4nh53nmwgye2efkbim
Generalized Zero-Shot Learning for Action Recognition with Web-Scale Video Data
[article]
2017
arXiv
pre-print
Then, we propose a method for action recognition by deploying generalized zero-shot learning, which transfers the knowledge of web video to detect the anomalous actions in surveillance videos. ...
Our experimental results demonstrate that, under the generalized setting, typical zero-shot learning methods are no longer effective for the dataset we applied. ...
Fig. 1 presents the differences between zero-shot and generalized zero-shot learning tasks. ...
arXiv:1710.07455v1
fatcat:datwl63c5jd2hiylkz7636lra4
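As a rough illustration of the zero-shot vs. generalized zero-shot distinction referenced in the entry above, the sketch below contrasts the candidate label sets used at test time. All class names, embedding sizes, and the nearest-neighbour rule are illustrative assumptions, not the paper's method.

```python
# Minimal sketch: the only difference between ZSL and GZSL here is which
# classes are allowed as candidates at test time.
import numpy as np

rng = np.random.default_rng(0)

seen_classes = ["run", "walk", "jump"]
unseen_classes = ["fight", "steal"]          # hypothetical anomalous actions
all_classes = seen_classes + unseen_classes

# Stand-in semantic embeddings (e.g., attribute or word vectors), one per class.
class_emb = {c: rng.normal(size=8) for c in all_classes}

def predict(video_emb, candidate_classes):
    """Nearest-neighbour prediction in the shared semantic space."""
    scores = {c: -np.linalg.norm(video_emb - class_emb[c]) for c in candidate_classes}
    return max(scores, key=scores.get)

test_video = class_emb["fight"] + 0.1 * rng.normal(size=8)  # a projected test clip

# Conventional ZSL: the model only has to choose among unseen classes.
print("ZSL prediction :", predict(test_video, unseen_classes))

# Generalized ZSL: seen classes stay in the candidate set; with real, imperfect
# projections this is where models tend to drift toward familiar seen classes.
print("GZSL prediction:", predict(test_video, all_classes))
```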
Using Task Descriptions in Lifelong Machine Learning for Improved Performance and Zero-Shot Transfer
2020
The Journal of Artificial Intelligence Research
Given only the descriptor for a new task, the lifelong learner is also able to accurately predict a model for the new task through zero-shot learning using the coupled dictionary, eliminating the need ...
Knowledge transfer between tasks can improve the performance of learned models, but requires an accurate estimate of inter-task relationships to identify the relevant knowledge to transfer. ...
We would like to thank the anonymous reviewers of the conference version of this paper as well as the current version for their helpful feedback. ...
doi:10.1613/jair.1.11304
fatcat:gjbed6fp5jgaxpusimdxutanwi
Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer
[article]
2020
arXiv
pre-print
downstream zero-shot cross-lingual transfer. ...
Our zero-shot cross-lingual transfer experiments, involving three tasks (POS-tagging, NER, NLI) and a set of 10 diverse languages, 1) point to the usefulness of orthoadapters in cross-lingual transfer, ...
zero-shot cross-lingual transfer in NLP. ...
arXiv:2012.06460v1
fatcat:s4cb63glnvgz7hqwjt7j3wlbq4
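For readers unfamiliar with adapters, here is a minimal sketch of a generic bottleneck adapter module of the kind stacked for cross-lingual transfer. The hidden size, bottleneck width, and stacking order are assumptions; the orthoadapters in the entry above additionally impose an orthogonality constraint that this sketch omits.

```python
# Generic residual bottleneck adapter (sketch, not the paper's orthoadapters).
import torch
import torch.nn as nn

class Adapter(nn.Module):
    def __init__(self, hidden_size: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)
        self.act = nn.GELU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual bottleneck: the frozen transformer output passes through
        # a small trainable projection and is added back.
        return x + self.up(self.act(self.down(x)))

# Usage: one adapter per language and per task; at inference, the target-language
# adapter is combined with the task adapter trained on the source language.
hidden = torch.randn(2, 16, 768)              # (batch, seq_len, hidden_size)
lang_adapter, task_adapter = Adapter(768), Adapter(768)
out = task_adapter(lang_adapter(hidden))
print(out.shape)                              # torch.Size([2, 16, 768])
```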
A Survey on Visual Transfer Learning using Knowledge Graphs
[article]
2022
arXiv
pre-print
Transfer learning is the area of machine learning that tries to prevent these errors. ...
This survey focuses on visual transfer learning approaches using KGs. ...
Zero-Shot Learning Datasets without Auxiliary Knowledge We introduce image datasets that have been applied mainly for zero-shot learning or few-shot learning tasks. ...
arXiv:2201.11794v1
fatcat:tapql5h4j5dvrnxjkaxek2cquu
Convolutional Prototype Learning for Zero-Shot Recognition
[article]
2020
arXiv
pre-print
In this paper, we propose a simple yet effective convolutional prototype learning (CPL) framework for zero-shot recognition. ...
By assuming distribution consistency at the task level, our CPL is capable of transferring knowledge smoothly to recognize unseen samples. Furthermore, inside each task, discriminative visual prototypes are ...
CPL: algorithm Generic zero-shot learning models usually make a distribution consistency assumption between the training and test sets (i.e., the independent and identically distributed assumption), thus guaranteeing ...
arXiv:1910.09728v3
fatcat:vebdwnncrvd75fzwae76u5wsr4
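The prototype idea in the entry above can be illustrated with a generic nearest-prototype baseline. The synthetic data, the linear attribute-to-prototype map, and all dimensions below are assumptions; this is not the authors' CPL model.

```python
# Sketch of prototype-style zero-shot recognition: each class is represented by
# a prototype predicted from its attribute vector, and a test sample is assigned
# to the nearest unseen-class prototype.
import numpy as np

rng = np.random.default_rng(1)
d_vis, d_attr, n_seen, n_unseen = 32, 4, 8, 3

# Hidden ground-truth mapping from class attributes to visual space.
W_true = rng.normal(size=(d_attr, d_vis))
seen_attrs = rng.normal(size=(n_seen, d_attr))
unseen_attrs = rng.normal(size=(n_unseen, d_attr))

# Training set: noisy visual features clustered around each seen-class prototype.
X = np.vstack([seen_attrs[y] @ W_true + 0.1 * rng.normal(size=d_vis)
               for y in range(n_seen) for _ in range(50)])
Y = np.repeat(np.arange(n_seen), 50)

# Learn the attributes -> prototype map by least squares on seen-class means.
seen_means = np.vstack([X[Y == y].mean(axis=0) for y in range(n_seen)])
W_hat, *_ = np.linalg.lstsq(seen_attrs, seen_means, rcond=None)

# Zero-shot inference: prototypes for unseen classes come from attributes only.
unseen_protos = unseen_attrs @ W_hat
x_test = unseen_attrs[2] @ W_true + 0.1 * rng.normal(size=d_vis)
pred = int(np.argmin(np.linalg.norm(unseen_protos - x_test, axis=1)))
print("predicted unseen class:", pred)        # with this low noise, typically 2
```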
Abstraction: A Framework for Knowledge Transfer Between Domains
2021
Zenodo
Second, it suggests a practical algorithm for zero-shot simulation to reality transfer. We demonstrate this framework on two challenging tasks: drone racing and high-speed navigation in the wild. ...
In this extended abstract, we propose a theoretical framework for knowledge transfer between domains, e.g. from simulation to the real world. ...
We present a general approach for zero-shot transfer of sensorimotor policies from simulation (left) to real world (right). ...
doi:10.5281/zenodo.6367974
fatcat:4bfz3y7uvrfsjncfujwxdticxq
Zero-Shot and Few-Shot Classification of Biomedical Articles in Context of the COVID-19 Pandemic
[article]
2022
arXiv
pre-print
In this work, we hypothesise that rich semantic information available in MeSH has potential to improve BioBERT representations and make them more suitable for zero-shot/few-shot tasks. ...
Zero-shot classification is an adequate response for timely labeling of the stream of papers with MeSH categories. ...
Such pretraining allows them to learn rich semantic representation of the text, and perform knowledge transfer on other lower-resourced tasks. ...
arXiv:2201.03017v2
fatcat:w7vdufiuyrfblli4triyk4mtgq
Abstraction: A Framework for Knowledge Transfer Between Domains
2021
Zenodo
Second, it suggests a practical algorithm for zero-shot simulation to reality transfer. We demonstrate this framework on two challenging tasks: drone racing and high-speed navigation in the wild. ...
In this extended abstract, we propose a theoretical framework for knowledge transfer between domains, e.g. from simulation to the real world. ...
We present a general approach for zero-shot transfer of sensorimotor policies from simulation (left) to real world (right). ...
doi:10.5281/zenodo.5900625
fatcat:6galxdadunf3jgbxbs6a7vuhr4
A survey on visual transfer learning using knowledge graphs
2022
Semantic Web Journal
Transfer learning is the area of machine learning that tries to prevent these errors. ...
This survey focuses on visual transfer learning approaches using KGs, as we believe that KGs are well suited to store and represent any kind of auxiliary knowledge. ...
Acknowledgements This publication was created as part of the research project "KI Delta Learning" (project number: 19A19013D) funded by the Federal Ministry for Economic Affairs and Energy (BMWi) on the ...
doi:10.3233/sw-212959
fatcat:f4s43if3nbcxxfvrbtpdrrs2ry
Linguistically-Enriched and Context-Aware Zero-shot Slot Filling
[article]
2021
arXiv
pre-print
Step two fine-tunes these rich representations and produces slot-independent tags for each word. ...
We propose a new zero-shot slot filling neural model, LEONA, which works in three steps. ...
Furthermore, these representations are purely character based and are robust for words unseen during training, which makes them suitable for the task of zero-shot slot filling. ...
arXiv:2101.06514v1
fatcat:6xvgetmcwnburorej27h3kmv5q
Showing results 1 — 15 out of 15,966 results