8,161 Hits in 8.2 sec

A Survey on Self-supervised Pre-training for Sequential Transfer Learning in Neural Networks [article]

Huanru Henry Mao
2020 arXiv   pre-print
In this review, we survey self-supervised learning methods and their applications within the sequential transfer learning framework.  ...  It involves first pre-training a model on a large amount of unlabeled data, then adapting the model to target tasks of interest.  ...  Acknowledgments Thanks to Professor Garrison W. Cottrell for providing comments, advice and editorial assistance. Thanks to Bodhisattwa Prasad Majumder for providing proofreading assistance.  ... 
arXiv:2007.00800v1 fatcat:jgjl2l7wqfaq5do4vre5fryuoe
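The pretrain-then-adapt workflow this survey covers can be sketched with a deliberately tiny stand-in: "pre-training" here is just gathering bigram statistics from unlabeled text, and "adaptation" reuses those statistics as features for a trivial target-task classifier. All function names and the nearest-feature classifier are illustrative, not from the paper.

```python
from collections import Counter

def pretrain(unlabeled_corpus):
    """Self-supervised 'pre-training': learn bigram counts from raw text
    (a stand-in for learning general-purpose representations)."""
    bigrams = Counter()
    for sentence in unlabeled_corpus:
        toks = sentence.lower().split()
        bigrams.update(zip(toks, toks[1:]))
    return bigrams

def adapt(pretrained, labeled_examples):
    """Adaptation phase: reuse the pre-trained statistics as a feature
    for a trivial nearest-feature classifier on the labeled target task."""
    def featurize(text):
        toks = text.lower().split()
        return sum(pretrained[b] for b in zip(toks, toks[1:]))
    table = [(featurize(text), label) for text, label in labeled_examples]
    def predict(text):
        f = featurize(text)
        return min(table, key=lambda pair: abs(pair[0] - f))[1]
    return predict
```

The point of the sketch is the two-phase structure: the expensive pass over unlabeled data happens once, and the labeled target task only consumes its output.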

AdapterFusion: Non-Destructive Task Composition for Transfer Learning [article]

Jonas Pfeiffer, Aishwarya Kamath, Andreas Rücklé, Kyunghyun Cho, Iryna Gurevych
2021 arXiv   pre-print
Sequential fine-tuning and multi-task learning are methods aiming to incorporate knowledge from multiple tasks; however, they suffer from catastrophic forgetting and difficulties in dataset balancing.  ...  Our code and adapters are available at AdapterHub.ml.  ...  This work was partly supported by Samsung Advanced Institute of Technology (Next Generation Deep Learning: from pattern recognition to AI) and Samsung Research (Improving Deep Learning using Latent Structure  ... 
arXiv:2005.00247v3 fatcat:rhjexrlidzcjtck5xjqcmaxmxe
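The adapters that AdapterFusion composes are small bottleneck modules inserted into a frozen transformer; only the adapter weights are trained per task, which is why knowledge from one task is not destroyed when another is learned. A minimal sketch of one such bottleneck block (dimensions and weight initialization are illustrative, not the paper's configuration):

```python
import numpy as np

def adapter(h, W_down, W_up):
    """Bottleneck adapter: down-project the hidden state, apply a
    non-linearity, up-project, and add a residual connection.
    Only W_down and W_up are trained; the transformer stays frozen."""
    z = np.maximum(0.0, h @ W_down)   # down-projection + ReLU
    return h + z @ W_up               # up-projection + residual

rng = np.random.default_rng(0)
d, r = 8, 2                           # hidden size, bottleneck size (illustrative)
W_down = rng.normal(size=(d, r))
W_up = rng.normal(size=(r, d))
h = rng.normal(size=(d,))
out = adapter(h, W_down, W_up)
```

The residual connection means an untrained (zero-weight) adapter is an identity map, so inserting adapters cannot hurt the frozen model's starting behavior.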

Linking emotions to behaviors through deep transfer learning [article]

Haoqi Li, Brian Baucom, Panayiotis Georgiou
2019 arXiv   pre-print
In this work, we employ deep transfer learning to analyze their inferential capacity and contextual importance.  ...  Human behavior refers to the way humans act and interact. Understanding human behavior is a cornerstone of observational practice, especially in psychotherapy.  ...  input them to the adaptive max-pooling along the time axis to eliminate the sequential information.  ... 
arXiv:1910.03641v1 fatcat:i4cg5p2kubak3iv6j4v4iw7vhi
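The adaptive max-pooling along the time axis that the snippet mentions maps a variable-length sequence of frame features to a fixed-size summary; with an output length of 1 it keeps only the per-dimension maxima and discards all ordering. A small sketch of the operation (the function name and binning scheme are illustrative):

```python
import numpy as np

def adaptive_max_pool_time(x, out_len=1):
    """Adaptive max-pooling over the time axis of a (T, D) array:
    split the T frames into out_len roughly equal bins and take the
    max in each, so any sequence length yields an (out_len, D) output."""
    T, _ = x.shape
    edges = [round(i * T / out_len) for i in range(out_len + 1)]
    return np.stack([x[edges[i]:edges[i + 1]].max(axis=0)
                     for i in range(out_len)])
```

With `out_len=1` this reduces to a global max over time, which is exactly the "eliminate the sequential information" behavior described.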

Neural Transfer Learning with Transformers for Social Science Text Analysis [article]

Sandra Wankmüller
2021 arXiv   pre-print
., 2017) and are used in a transfer learning setting have contributed to this development.  ...  Additionally, three Transformer-based models for transfer learning, BERT (Devlin et al., 2019), RoBERTa (Liu et al., 2019), and the Longformer (Beltagy et al., 2020), are compared to conventional machine  ...  [Figure 4: Pretraining and Adaptation Phase in Sequential Transfer Learning.]  ...
arXiv:2102.02111v1 fatcat:5ulwuvuwlncdhc6uiwaghymmym

RBN: enhancement in language attribute prediction using global representation of natural language transfer learning technology like Google BERT

Chiranjib Sur
2019 SN Applied Sciences  
Different origin sentences will ensure that while human generated sentences are complex for high accuracy and prone to errors, effective machine generated sentence attribute detection provision must be  ...  Transfer learning can replace the long and costly data collection, labeling and training session by effective and the most efficient representations.  ...  some activity like 'saw' .  ... 
doi:10.1007/s42452-019-1765-9 fatcat:brz7ttgkerhbnd6sawrclilyou

Linking emotions to behaviors through deep transfer learning

Haoqi Li, Brian Baucom, Panayiotis Georgiou
2020 PeerJ Computer Science  
In this work, we employ deep transfer learning to analyze their inferential capacity and contextual importance.  ...  Human behavior refers to the way humans act and interact. Understanding human behavior is a cornerstone of observational practice, especially in psychotherapy.  ...  ACKNOWLEDGEMENTS The authors would like to thank Yi Zhou, Sandeep Nallan Chakravarthula and professor Shrikanth Narayanan for helpful discussions and comments.  ... 
doi:10.7717/peerj-cs.246 pmid:33816898 pmcid:PMC7924597 fatcat:6g5wakxejbajdnriqpzbsuwc2q

Decision support from financial disclosures with deep neural networks and transfer learning

Mathias Kraus, Stefan Feuerriegel
2017 Decision Support Systems  
While humans are usually able to correctly interpret the content, the same is rarely true of computerized decision support systems, which struggle with the complexity and ambiguity of natural language.  ...  We additionally experiment with transfer learning, in which we pre-train the network on a different corpus with a length of 139.1 million words.  ...  Classification: direction of abnormal returns With regard to deep learning, both RNNs (with and without transfer learning) fail to improve performance beyond traditional machine learning models.  ... 
doi:10.1016/j.dss.2017.10.001 fatcat:e6kpya35gzcwnmp2mehgq33nfe

Augmenting semantic lexicons using word embeddings and transfer learning [article]

Thayer Alshaabi, Colin M. Van Oort, Mikaela Irene Fudolig, Michael V. Arnold, Christopher M. Danforth, Peter Sheridan Dodds
2021 arXiv   pre-print
Our first model establishes a baseline employing a simple and shallow neural network initialized with pre-trained word embeddings using a non-contextual approach.  ...  Our second model improves upon our baseline, featuring a deep Transformer-based network that brings to bear word definitions to estimate their lexical polarity.  ...  We thank Anne Marie Stupinski and Julia Zimmerman for their insightful discussion and suggestions.  ... 
arXiv:2109.09010v2 fatcat:bjc6dvilgzeirgtr4hvdox3xby

Text Style Transfer: A Review and Experimental Evaluation [article]

Zhiqiang Hu, Roy Ka-Wei Lee, Charu C. Aggarwal, Aston Zhang
2021 arXiv   pre-print
This article aims to provide a comprehensive review of recent research efforts on text style transfer.  ...  More concretely, we create a taxonomy to organize the TST models and provide a comprehensive summary of the state of the art.  ...  [27] calculated the cosine similarity between original sentence embeddings and transferred sentence embeddings.  ... 
arXiv:2010.12742v2 fatcat:gmkjxf7f7jhivbo6mayaxjsk7q
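The content-preservation metric mentioned in the snippet, cosine similarity between the original and transferred sentence embeddings, is simple to state directly; values near 1 indicate the style-transferred sentence kept the original content. A self-contained version over plain vectors:

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two embedding vectors:
    dot(u, v) / (|u| * |v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)
```

In a TST evaluation the inputs would be sentence embeddings from some encoder; the vectors here are placeholders.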

Do Transformer Modifications Transfer Across Implementations and Applications? [article]

Sharan Narang, Hyung Won Chung, Yi Tay, William Fedus, Thibault Fevry, Michael Matena, Karishma Malkan, Noah Fiedel, Noam Shazeer, Zhenzhong Lan, Yanqi Zhou, Wei Li (+4 others)
2021 arXiv   pre-print
The research community has proposed copious modifications to the Transformer architecture since it was introduced over three years ago, relatively few of which have seen widespread adoption.  ...  We conjecture that performance improvements may strongly depend on implementation details and correspondingly make some recommendations for improving the generality of experimental results.  ...  Activations We consider various activation functions to replace the ReLU in the feedforward network block.  ... 
arXiv:2102.11972v2 fatcat:w6y6mkrw7vavnkkrcmzgz7roee
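Among the modifications this paper evaluates are replacements for the ReLU in the transformer feedforward block. The commonly tried alternatives are easy to write down; the GELU below is the standard tanh approximation, and Swish/SiLU is x times a sigmoid (these are the well-known definitions, not results from the paper):

```python
import math

def relu(x):
    """Baseline feedforward activation: max(0, x)."""
    return max(0.0, x)

def gelu(x):
    """GELU, tanh approximation (as used in BERT-style models)."""
    return 0.5 * x * (1.0 + math.tanh(
        math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))

def swish(x, beta=1.0):
    """Swish / SiLU: x * sigmoid(beta * x)."""
    return x / (1.0 + math.exp(-beta * x))
```

All three agree for large positive inputs and differ mainly in how smoothly they gate small and negative values, which is where the empirical differences the paper probes come from.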

Lex Rosetta: Transfer of Predictive Models Across Languages, Jurisdictions, and Legal Domains [article]

Jaromir Savelka, Hannes Westermann, Karim Benyekhlef, Charlotte S. Alexander, Jayla C. Grant, David Restrepo Amariles, Rajaa El Hamdani, Sébastien Meeùs, Michał Araszkiewicz, Kevin D. Ashley, Alexandra Ashley, Karl Branting (+6 others)
2021 arXiv   pre-print
In this paper, we examine the use of multi-lingual sentence embeddings to transfer predictive models for functional segmentation of adjudicatory decisions across jurisdictions, legal systems (common and  ...  To investigate transfer between different contexts we developed an annotation scheme for functional segmentation of adjudicatory decisions.  ...
arXiv:2112.07882v1 fatcat:jpnu4ldsnva65kya4ut7ch2fgm

Transfer Learning in Sentiment Classification with Deep Neural Networks [chapter]

Andrea Pagliarani, Gianluca Moro, Roberto Pasolini, Giacomo Domeniconi
2019 Primate Life Histories, Sex Roles, and Adaptability  
Deep neural networks have recently reached the state-of-the-art in many NLP tasks, including in-domain sentiment classification, but few of them involve transfer learning and cross-domain sentiment solutions  ...  Socher et al. introduced the Recursive Neural Tensor Networks to foster single sentence sentiment classification [11].  ...
doi:10.1007/978-3-030-15640-4_1 dblp:conf/ic3k/PagliaraniMPD17 fatcat:gih2lwn36ze4jmuu64idncgfl4

Deep Transfer Learning Beyond: Transformer Language Models in Information Systems Research [article]

Ross Gruetzemacher, David Paradice
2021 arXiv   pre-print
Recent progress in natural language processing involving transformer language models (TLMs) offers a potential avenue for AI-driven business and societal transformation that is beyond the scope of what  ...  This is possible because these techniques make it easier to develop very powerful custom systems and their performance is superior to existing methods for a wide range of tasks and applications.  ...  and beyond.  ... 
arXiv:2110.08975v2 fatcat:bw6rzrz2zvdyrgraxoxdraf4d4

Improving Botnet Detection with Recurrent Neural Network and Transfer Learning [article]

Jeeyung Kim, Alex Sim, Jinoh Kim, Kesheng Wu, Jaegyoon Hahm
2021 arXiv   pre-print
Additionally, we devise a transfer learning framework to learn from a well-curated source data set and transfer the knowledge to a target problem domain not seen before.  ...  Another common shortcoming of ML-based approaches is the need to retrain neural networks in order to detect the evolving botnets; however, the training process is time-consuming and requires significant  ...  Du et al. regard every feature as sentence and embed the features. With embedded features, classifier can be trained to detect malicious botnets [19] .  ... 
arXiv:2104.12602v1 fatcat:cxo37mdyavhxllftslowvfktrq

Knowledge Transfer Among Cross-Functional Teams As A Continual Improvement Process

Sergio Mauricio Pérez López, Luis Rodrigo Valencia Pérez, Juan Manuel Peña Aguilar, Adelina Morita Alexander
2016 Zenodo  
Adaptation will ensure that the transfer process is carried out smoothly by preventing "stickiness".  ...  The concept of adapting knowledge to new environments will be highlighted, as it is essential for companies to translate and modify information so that such information can fit the context of receiving  ...  According to [2] , knowledge is embedded in three basic elements of an organization: people, tools and tasks and the various sub-networks formed by combining these basic elements.  ... 
doi:10.5281/zenodo.1125175 fatcat:kpydzczporannexta35pyh5sni