9 Hits in 2.1 sec

AdaShare: Learning What To Share For Efficient Deep Multi-Task Learning [article]

Ximeng Sun, Rameswar Panda, Rogerio Feris, Kate Saenko
2020 arXiv   pre-print
Specifically, our main idea is to learn the sharing pattern through a task-specific policy that selectively chooses which layers to execute for a given task in the multi-task network.  ...  Unlike existing methods, we propose an adaptive sharing approach, called AdaShare, that decides what to share across which tasks to achieve the best recognition accuracy, while taking resource efficiency  ...  Moreover, we will extend AdaShare for finding a fine-grained channel sharing pattern instead of layer-wise policy across tasks, for more efficient deep multi-task learning.  ... 
arXiv:1911.12423v2 fatcat:dlqfekp3ebe3zecsqgnddct2za
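A minimal sketch of the layer-selection idea in the snippet above: a learned, per-(task, layer) execute-or-skip decision over a shared stack of layers. Module names, sizes, and the use of a Gumbel-Softmax relaxation are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LayerSelectMTL(nn.Module):
    def __init__(self, num_layers=4, num_tasks=2, dim=64):
        super().__init__()
        self.blocks = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, dim), nn.ReLU()) for _ in range(num_layers)]
        )
        # One execute/skip logit pair per (task, layer), learned jointly with the weights.
        self.policy_logits = nn.Parameter(torch.zeros(num_tasks, num_layers, 2))
        self.heads = nn.ModuleList([nn.Linear(dim, 10) for _ in range(num_tasks)])

    def forward(self, x, task_id):
        for l, block in enumerate(self.blocks):
            # Differentiable binary decision: execute this shared layer for this task, or skip it.
            gate = F.gumbel_softmax(self.policy_logits[task_id, l], tau=1.0, hard=True)[0]
            x = gate * block(x) + (1.0 - gate) * x
        return self.heads[task_id](x)

model = LayerSelectMTL()
out = model(torch.randn(8, 64), task_id=0)  # predictions for task 0
```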

AutoMTL: A Programming Framework for Automated Multi-Task Learning [article]

Lijun Zhang, Xiao Liu, Hui Guan
2021 arXiv   pre-print
Multi-task learning (MTL) jointly learns a set of tasks.  ...  AutoMTL takes as inputs an arbitrary backbone convolutional neural network and a set of tasks to learn, then automatically produces a multi-task model that achieves high accuracy and has a small memory footprint  ...  An effective approach to address the problem is multi-task learning (MTL) where a set of tasks are learned jointly to allow some parameter sharing among tasks.  ...
arXiv:2110.13076v1 fatcat:riabbk35lvhdrnd2dzxd6534jq
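A toy illustration of the kind of per-layer sharing decision such an automated framework searches over: for every backbone layer, each task chooses between the shared operator and its own task-specific copy. The class names and the softmax relaxation are assumptions for illustration only.

```python
import copy
import torch
import torch.nn as nn

class ShareOrSplitLayer(nn.Module):
    def __init__(self, shared_op, num_tasks):
        super().__init__()
        self.shared_op = shared_op
        self.task_ops = nn.ModuleList([copy.deepcopy(shared_op) for _ in range(num_tasks)])
        self.alpha = nn.Parameter(torch.zeros(num_tasks, 2))  # [shared, task-specific] logits

    def forward(self, x, task_id):
        w = torch.softmax(self.alpha[task_id], dim=-1)
        return w[0] * self.shared_op(x) + w[1] * self.task_ops[task_id](x)

# Wrap each backbone layer so the share-or-split choice can be learned per task.
backbone = [nn.Conv2d(3, 16, 3, padding=1), nn.Conv2d(16, 16, 3, padding=1)]
layers = nn.ModuleList([ShareOrSplitLayer(op, num_tasks=2) for op in backbone])
x = torch.randn(1, 3, 32, 32)
for layer in layers:
    x = layer(x, task_id=1)
```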

Learning to Branch for Multi-Task Learning [article]

Pengsheng Guo, Chen-Yu Lee, Daniel Ulbricht
2020 arXiv   pre-print
In this work, we present an automated multi-task learning algorithm that learns where to share or branch within a network, designing an effective network topology that is directly optimized for multiple  ...  Training multiple tasks jointly in one deep network yields reduced latency during inference and better performance over the single-task counterpart by sharing certain layers of a network.  ...  AdaShare (Sun et al., 2019) learns the sharing pattern through a task-specific policy that selectively chooses which layers to execute for a given task in a multi-task setting.  ... 
arXiv:2006.01895v2 fatcat:uskaljzrbfgbrhz7ryu3h6wtam
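A rough sketch of the "where to share or branch" idea in the snippet above: at a candidate split point, each task learns a soft assignment over parallel branches, so tasks that settle on the same branch keep sharing weights. Shapes, names, and the softmax relaxation are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SoftBranchPoint(nn.Module):
    def __init__(self, num_branches, num_tasks, dim=64):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, dim), nn.ReLU()) for _ in range(num_branches)]
        )
        # Learned task-to-branch assignment logits.
        self.assign = nn.Parameter(torch.zeros(num_tasks, num_branches))

    def forward(self, x, task_id):
        weights = torch.softmax(self.assign[task_id], dim=-1)
        outs = torch.stack([branch(x) for branch in self.branches], dim=0)
        # Mixture over branches for this task; a hard argmax would give a discrete topology.
        return torch.einsum("b,bnd->nd", weights, outs)

split = SoftBranchPoint(num_branches=3, num_tasks=4)
y = split(torch.randn(8, 64), task_id=2)  # task 2's mixture over the 3 branches
```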

A Tree-Structured Multi-Task Model Recommender [article]

Lijun Zhang, Xiao Liu, Hui Guan
2022 arXiv   pre-print
The major challenge is to determine where to branch out for each task given a backbone model to optimize for both task accuracy and computation efficiency.  ...  Tree-structured multi-task architectures have been employed to jointly tackle multiple vision tasks in the context of multi-task learning (MTL).  ...  It also promotes the use of multi-task learning to improve task performance and computation efficiency.  ...
arXiv:2203.05092v1 fatcat:cgeitkkpnrg3nio4bcv4ujcpxq
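A small sketch of the kind of tree-structured model such a recommender would output: tasks share the backbone up to a per-task branch depth, then receive their own copies of the remaining layers. The configuration format and helper names are illustrative assumptions.

```python
import copy
import torch
import torch.nn as nn

def build_tree_model(backbone_layers, branch_depth):
    """branch_depth[t] = index of the first backbone layer that task t no longer shares."""
    shared = nn.ModuleList(backbone_layers)
    task_branches = nn.ModuleList()
    for t in range(len(branch_depth)):
        private = [copy.deepcopy(l) for l in backbone_layers[branch_depth[t]:]]
        task_branches.append(nn.Sequential(*private))
    return shared, task_branches

def forward_task(x, t, shared, task_branches, branch_depth):
    for layer in list(shared)[: branch_depth[t]]:  # shared trunk
        x = layer(x)
    return task_branches[t](x)                     # task-private branch

layers = [nn.Sequential(nn.Linear(32, 32), nn.ReLU()) for _ in range(4)]
shared, branches = build_tree_model(layers, branch_depth=[2, 2, 3])
out = forward_task(torch.randn(5, 32), t=0, shared=shared,
                   task_branches=branches, branch_depth=[2, 2, 3])
```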

Multi-Task Learning with Deep Neural Networks: A Survey [article]

Michael Crawshaw
2020 arXiv   pre-print
Multi-task learning (MTL) is a subfield of machine learning in which multiple tasks are simultaneously learned by a shared model.  ...  In this survey, we give an overview of multi-task learning methods for deep neural networks, with the aim of summarizing both the well-established and most recent directions within the field.  ...  ., 2017) is one of the earliest methods for learned parameter sharing in multi-task deep learning.  ... 
arXiv:2009.09796v1 fatcat:d676uupucvgrbgnvsijqcexcqi
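The definition quoted above is usually instantiated as hard parameter sharing: one shared encoder, one lightweight head per task, trained on a summed loss. Layer sizes and the unweighted loss sum below are illustrative assumptions, not a recipe from the survey.

```python
import torch
import torch.nn as nn

class HardSharingMTL(nn.Module):
    def __init__(self, in_dim=128, hidden=64, task_out_dims=(10, 1)):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.heads = nn.ModuleList([nn.Linear(hidden, d) for d in task_out_dims])

    def forward(self, x):
        z = self.encoder(x)                       # shared representation
        return [head(z) for head in self.heads]   # one output per task

model = HardSharingMTL()
outs = model(torch.randn(16, 128))
# Typical training step: sum (optionally weighted) per-task losses.
loss = nn.functional.cross_entropy(outs[0], torch.randint(0, 10, (16,))) \
       + nn.functional.mse_loss(outs[1], torch.randn(16, 1))
loss.backward()
```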

Channel Exchanging Networks for Multimodal and Multitask Dense Image Prediction [article]

Yikai Wang, Wenbing Huang, Fuchun Sun, Fengxiang He, Dacheng Tao
2021 arXiv   pre-print
Despite the fruitful progress, existing methods for both problems are still brittle to the same challenge -- it remains dilemmatic to integrate the common information across modalities (resp. tasks) meanwhile  ...  Extensive experiments on semantic segmentation via RGB-D data and image translation through multi-domain input verify the effectiveness of our CEN compared to current state-of-the-art methods.  ...  Saenko, “Adashare: Learning what to share for efficient deep multi-task learning,” in NeurIPS, 2020. 11  ... 
arXiv:2112.02252v1 fatcat:ul4gs5dajjc5lecol6psabn4pu
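A hedged sketch of the channel-exchanging idea behind CEN: channels of one modality whose BatchNorm scaling factor has shrunk below a threshold are replaced by the corresponding channels of the other modality. The threshold value, shapes, and function names are assumptions for illustration, not the paper's exact procedure.

```python
import torch
import torch.nn as nn

def exchange_channels(feat_a, feat_b, bn_a, bn_b, threshold=1e-2):
    """feat_*: (N, C, H, W) features; bn_*: the BatchNorm2d layers that produced them."""
    gamma_a = bn_a.weight.detach().abs()
    gamma_b = bn_b.weight.detach().abs()
    swap_a = (gamma_a < threshold).view(1, -1, 1, 1)  # channels modality A gives up
    swap_b = (gamma_b < threshold).view(1, -1, 1, 1)  # channels modality B gives up
    new_a = torch.where(swap_a, feat_b, feat_a)
    new_b = torch.where(swap_b, feat_a, feat_b)
    return new_a, new_b

bn_a, bn_b = nn.BatchNorm2d(8), nn.BatchNorm2d(8)
fa, fb = bn_a(torch.randn(2, 8, 16, 16)), bn_b(torch.randn(2, 8, 16, 16))
fa, fb = exchange_channels(fa, fb, bn_a, bn_b)
```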

Measuring and Harnessing Transference in Multi-Task Learning [article]

Christopher Fifty, Ehsan Amid, Zhe Zhao, Tianhe Yu, Rohan Anil, Chelsea Finn
2021 arXiv   pre-print
Multi-task learning can leverage information learned by one task to benefit the training of other tasks.  ...  We find these methods can lead to significant improvement over prior work on three supervised multi-task learning benchmarks and one multi-task reinforcement learning paradigm.  ...  Adashare: Learning what to share for efficient deep multi-task learning. arXiv preprint arXiv:1911.12423, 2019. Suteu, M. and Guo, Y.  ... 
arXiv:2010.15413v3 fatcat:mskg3nftpneb5lpm5o2fygumpm
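A rough sketch of measuring transference as described above: take one gradient step on a single task's loss and check how another task's loss changes on the same batch. The 1 - ratio formulation, the step size, and the toy losses are assumptions for illustration, not necessarily the paper's exact definition.

```python
import copy
import torch
import torch.nn as nn

def transference(model, loss_fn_i, loss_fn_j, batch, lr=0.1):
    # Loss of task j before the lookahead update.
    loss_j_before = loss_fn_j(model, batch).item()

    # Lookahead copy updated with one SGD step on task i only.
    lookahead = copy.deepcopy(model)
    opt = torch.optim.SGD(lookahead.parameters(), lr=lr)
    loss_fn_i(lookahead, batch).backward()
    opt.step()

    loss_j_after = loss_fn_j(lookahead, batch).item()
    return 1.0 - loss_j_after / loss_j_before  # > 0: task i's update helped task j

model = nn.Linear(4, 2)
batch = (torch.randn(32, 4), torch.randint(0, 2, (32,)), torch.randn(32, 2))
li = lambda m, b: nn.functional.cross_entropy(m(b[0]), b[1])  # toy task i loss
lj = lambda m, b: nn.functional.mse_loss(m(b[0]), b[2])       # toy task j loss
print(transference(model, li, lj, batch))
```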

Which Tasks Should Be Learned Together in Multi-task Learning? [article]

Trevor Standley, Amir R. Zamir, Dawn Chen, Leonidas Guibas, Jitendra Malik, Silvio Savarese
2020 arXiv   pre-print
when employing multi-task learning?  ...  Many computer vision applications require solving multiple tasks in real-time. A neural network can be trained to solve multiple tasks simultaneously using multi-task learning.  ...  Toyota Research Institute ("TRI") provided funds to assist the authors with their research but this article solely reflects the opinions and conclusions of its authors and not TRI or any other Toyota entity  ... 
arXiv:1905.07553v4 fatcat:mxpy5myr3nfmjd5ybdqls7bjsu
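A tiny sketch of the selection problem this line of work implies: given an estimated score for training each subset of tasks together, pick the partition of tasks into groups that maximizes the total score within a budget on the number of networks. The brute-force search and the scores below are made-up illustrative inputs, not the paper's method or numbers.

```python
from itertools import chain, combinations

tasks = ["depth", "normals", "segmentation"]
# Assumed (hypothetical) scores for co-training each subset of tasks.
score = {
    ("depth",): 0.70, ("normals",): 0.72, ("segmentation",): 0.65,
    ("depth", "normals"): 1.50, ("depth", "segmentation"): 1.30,
    ("normals", "segmentation"): 1.25, ("depth", "normals", "segmentation"): 2.00,
}

def subsets(items):
    return chain.from_iterable(combinations(items, r) for r in range(1, len(items) + 1))

def best_grouping(budget):
    groups = list(subsets(tasks))          # every non-empty subset of tasks
    best = None
    for combo in subsets(groups):          # every set of candidate groups
        if len(combo) > budget:
            continue
        covered = list(chain.from_iterable(combo))
        if sorted(covered) != sorted(tasks):   # require a disjoint cover (a partition)
            continue
        total = sum(score[g] for g in combo)
        if best is None or total > best[0]:
            best = (total, combo)
    return best

print(best_grouping(budget=2))  # best 2-network grouping under the assumed scores
```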

Task2Sim: Towards Effective Pre-training and Transfer from Synthetic Data [article]

Samarth Mishra, Rameswar Panda, Cheng Perng Phoo, Chun-Fu Chen, Leonid Karlinsky, Kate Saenko, Venkatesh Saligrama, Rogerio S. Feris
2022 arXiv   pre-print
Task2Sim learns this mapping by training to find the set of best parameters on a set of "seen" tasks.  ...  It is thus better to tailor synthetic pre-training data to a specific downstream task, for best performance.  ...  Acknowledgements We would like to thank the developers of TDW: Seth Alter, Abhishek Bhandwaldar and Jeremy Schwartz, for their assistance with the platform and its use.  ...
arXiv:2112.00054v3 fatcat:rg7fihxdtrg45pnbdsdzijn7da
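A loose sketch of the "downstream task to simulation parameters" mapping described above: a small network reads a task representation and scores discrete settings of each simulation parameter, fit on "seen" tasks for which good settings are already known. All names, dimensions, parameter knobs, and the supervised training signal are assumptions for illustration only.

```python
import torch
import torch.nn as nn

PARAM_CHOICES = {"lighting": 3, "pose_variation": 4, "blur": 2}  # hypothetical knobs

class Task2ParamPredictor(nn.Module):
    def __init__(self, task_dim=32):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(task_dim, 64), nn.ReLU())
        self.heads = nn.ModuleDict(
            {name: nn.Linear(64, n) for name, n in PARAM_CHOICES.items()}
        )

    def forward(self, task_embedding):
        h = self.trunk(task_embedding)
        return {name: head(h) for name, head in self.heads.items()}

predictor = Task2ParamPredictor()
task_emb = torch.randn(1, 32)                      # representation of a downstream task
logits = predictor(task_emb)
chosen = {k: int(v.argmax(dim=-1)) for k, v in logits.items()}
print(chosen)  # per-parameter setting to use when generating pre-training data
```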