A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2022; you can also visit the original URL.
The file type is application/pdf.
Omni-Training: Bridging Pre-Training and Meta-Training for Few-Shot Learning
[article] · 2022 · arXiv pre-print
Few-shot learning aims to quickly adapt a deep model from only a few examples. While both pre-training and meta-training can produce deep models capable of few-shot generalization, we find that pre-training and meta-training focus on cross-domain transferability and cross-task transferability, respectively, which restricts their data efficiency under the entangled settings of domain shift and task shift. We thus propose the Omni-Training framework to seamlessly bridge pre-training and meta-training for few-shot learning.
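The contrast the abstract draws between the two regimes can be sketched in code: pre-training optimizes one loss over the whole dataset, while meta-training optimizes over sampled N-way K-shot episodes, and a bridge between them can blend the two objectives. This is a minimal illustrative sketch, not the paper's actual Omni-Training algorithm; the function names, the dataset layout, and the `alpha` trade-off weight are all assumptions for illustration.

```python
import random

def sample_episode(dataset, n_way=5, k_shot=1, q_queries=2):
    """Sample an N-way K-shot episode, as used in meta-training.
    `dataset` is assumed to map class label -> list of examples."""
    classes = random.sample(sorted(dataset), n_way)
    support, query = [], []
    for label in classes:
        examples = random.sample(dataset[label], k_shot + q_queries)
        support += [(x, label) for x in examples[:k_shot]]   # adapt on these
        query += [(x, label) for x in examples[k_shot:]]     # evaluate on these
    return support, query

def joint_loss(pretrain_loss, meta_loss, alpha=0.5):
    """Blend a whole-dataset pre-training loss with an episodic
    meta-training loss; `alpha` is a hypothetical trade-off weight,
    not a quantity defined in the paper's abstract."""
    return alpha * pretrain_loss + (1 - alpha) * meta_loss
```

For example, a 3-way 1-shot episode with 2 queries per class yields a support set of 3 examples and a query set of 6, and the joint objective simply interpolates the two scalar losses.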
arXiv:2110.07510v3
fatcat:wdeptqxzdfeyxn3uvv5a5mdfhu