339,880 Hits in 6.3 sec

Bio-JOIE

Junheng Hao, Chelsea J.-T. Ju, Muhao Chen, Yizhou Sun, Carlo Zaniolo, Wei Wang
2020 Proceedings of the 11th ACM International Conference on Bioinformatics, Computational Biology and Health Informatics  
In this paper, we propose the transferred multi-relational embedding model Bio-JOIE to capture the knowledge of gene ontology and PPI networks, which demonstrates superb capability in modeling the SARS-CoV  ...  On top of that, the transfer model learns a non-linear transformation to transfer the knowledge of PPIs and gene ontology annotations across their embedding spaces.  ...  This allows the relational knowledge to transfer across and complement the learning and inference on both domains.  ... 
doi:10.1145/3388440.3412477 dblp:conf/bcb/HaoJCSZ020 fatcat:jqvjndnnjzf25mukvij4ozj4vq
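The Bio-JOIE snippet above mentions a transfer model that "learns a non-linear transformation to transfer the knowledge of PPIs and gene ontology annotations across their embedding spaces." The general idea can be sketched as a single tanh layer mapping one embedding space into another, fit by gradient descent; all dimensions, embeddings, and names below are made up for illustration, and this is not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (hypothetical sizes): 4 proteins embedded in an 8-d PPI space,
# and their annotated GO terms embedded in a separate 6-d ontology space.
protein_emb = rng.normal(size=(4, 8))
go_emb = rng.normal(scale=0.5, size=(4, 6))

# Non-linear transfer function f(v) = tanh(W v + b), mapping PPI space -> GO space.
W = rng.normal(scale=0.1, size=(6, 8))
b = np.zeros(6)

def mse_loss():
    preds = np.tanh(protein_emb @ W.T + b)
    return float(np.mean((preds - go_emb) ** 2))

initial_loss = mse_loss()

# Fit the transfer mapping with a few hundred steps of plain gradient descent.
lr = 0.2
for _ in range(300):
    preds = np.tanh(protein_emb @ W.T + b)
    err = preds - go_emb
    grad_pre = 2 * err * (1 - preds ** 2) / preds.size  # backprop through tanh
    W -= lr * grad_pre.T @ protein_emb
    b -= lr * grad_pre.sum(axis=0)

final_loss = mse_loss()
```

After training, transferred protein embeddings sit near their GO-term anchors, so relational structure learned in one space can inform inference in the other.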

Weakly-supervised Domain Adaption for Aspect Extraction via Multi-level Interaction Transfer [article]

Tao Liang, Wenya Wang, Fengmao Lv
2020 arXiv   pre-print
Specifically, the aspect category information is used to construct pivot knowledge for transfer, with the assumption that the interactions between sentence-level aspect category and token-level aspect terms  ...  To address this limitation, some previous works propose domain adaptation strategies to transfer knowledge from a sufficiently labeled source domain to unlabeled target domains.  ...  Domain Adaptation: Domain adaptation is an essential problem in transfer learning.  ... 
arXiv:2006.09235v1 fatcat:t6diemitzfe2fneaq5lmixbci4

Bio-JOIE: Joint Representation Learning of Biological Knowledge Bases [article]

Junheng Hao, Chelsea Jui-Ting Ju, Muhao Chen, Yizhou Sun, Carlo Zaniolo, Wei Wang
2020 bioRxiv   pre-print
In this paper, we propose the transferred multi-relational embedding model Bio-JOIE to capture the knowledge of gene ontology and PPI networks, which demonstrates superb capability in modeling the SARS-CoV  ...  On top of that, the transfer model learns a non-linear transformation to transfer the knowledge of PPIs and gene ontology annotations across their embedding spaces.  ...  This allows the relational knowledge to transfer across and complement the learning and inference on both domains.  ... 
doi:10.1101/2020.06.15.153692 fatcat:lh4cm424cffcxnzgkxs3y5sbvy

Transferable Interactive Memory Network for Domain Adaptation in Fine-Grained Opinion Extraction

Wenya Wang, Sinno Jialin Pan
2019 Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence and the Thirty-First Innovative Applications of Artificial Intelligence Conference
Very few attempts have applied unsupervised domain adaptation methods to transfer fine-grained knowledge (at the word level) from some labeled source domain(s) to any unlabeled target domain.  ...  ., common opinion terms or syntactic relations between aspect and opinion words. In this work, we propose an interactive memory network that consists of local and global memory units.  ...  We directly use the model trained on the labeled source domain to make predictions in the target domain. • TIMN: The proposed Transferable Interactive Memory Network, which learns a shared space through memory  ... 
doi:10.1609/aaai.v33i01.33017192 fatcat:six4ap235jgdxkfw4p4leb4sbe

Transfer Meets Hybrid: A Synthetic Approach for Cross-Domain Collaborative Filtering with Text [article]

Guangneng Hu, Yu Zhang, Qiang Yang
2019 arXiv   pre-print
Another thread is to transfer knowledge from other source domains, such as improving the movie recommendation with the knowledge from the book domain, leading to transfer learning methods.  ...  We propose a novel neural model to smoothly enable Transfer Meeting Hybrid (TMH) methods for cross-domain recommendation with unstructured text in an end-to-end manner.  ...  ., j_s) with which the user u has interacted in the source domain, TNet learns a transfer vector c_ui ∈ R^d to capture the relations between the target item i and source items given the user u.  ... 
arXiv:1901.07199v1 fatcat:ti7l7rv2vzca7cauwh4iidaceq
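The snippet above describes a transfer vector c_ui ∈ R^d summarizing the source-domain items a user has interacted with, conditioned on a target item. One common way to realize such a vector is target-conditioned attention over the source-item embeddings; this is a hedged sketch of that general pattern, not necessarily TMH's exact formulation, and all embeddings and sizes here are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 8

# Hypothetical embeddings: one target-domain item i, and the s = 5 source-domain
# items (j_1, ..., j_s) the user u has interacted with, all in R^d.
target_item = rng.normal(size=d)
source_items = rng.normal(size=(5, d))

def transfer_vector(target, sources):
    """Attention-weighted summary of source items, conditioned on the target item."""
    scores = sources @ target                 # relevance of each source item to the target
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    return weights @ sources                  # c_ui in R^d

c_ui = transfer_vector(target_item, source_items)
```

Because c_ui is a convex combination of the source-item embeddings, it stays inside their coordinate-wise range and can be concatenated with the target-item embedding in a downstream scoring layer.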

DDTCDR: Deep Dual Transfer Cross Domain Recommendation [article]

Pan Li, Alexander Tuzhilin
2019 arXiv   pre-print
To address these concerns, in this paper we propose a novel approach to cross-domain recommendations based on the mechanism of dual learning that transfers information between two related domains in an  ...  and also classical transfer learning approaches.  ...  To address these problems, researchers propose to use cross domain recommendation [3] through transfer learning [21] approaches that learn user preferences in the source domain and transfer them to  ... 
arXiv:1910.05189v1 fatcat:y5mqqv3gebgqbgakxk4qzubgmq

Propagation-aware Social Recommendation by Transfer Learning [article]

Haodong Chang, Yabo Chu
2021 arXiv   pre-print
The underlying assumption is that the knowledge in social user-user connections can be shared and transferred to the domain of user-item interactions, thereby helping to learn user preferences.  ...  In this paper, we propose a novel Propagation-aware Transfer Learning Network (PTLN) based on the propagation of social relations.  ...  Transfer learning is also applied in social-aware recommendation to learn user preference from social connections and then transfer it to the item domain, leading to more fine-grained user preference and thus  ... 
arXiv:2107.04846v1 fatcat:v5dkffhuzfepfjh42glc7w7lfi

Syntactically-Meaningful and Transferable Recursive Neural Networks for Aspect and Opinion Extraction

Wenya Wang, Sinno Jialin Pan
2019 Computational Linguistics  
Furthermore, we construct transferable recursive neural networks to automatically learn the domain-invariant fine-grained interactions among aspect words and opinion words.  ...  The transferability is built on an auxiliary task and a conditional domain adversarial network to reduce domain distribution difference in the hidden spaces effectively in word level through syntactic  ...  target domain in each transfer experiment.  ... 
doi:10.1162/coli_a_00362 fatcat:topw3vnee5ao7aevhc6sd7axdq

MSIT_SRIB at MEDIQA 2019: Knowledge Directed Multi-task Framework for Natural Language Inference in Clinical Domain

Sahil Chopra, Ankita Gupta, Anupama Kaushik
2019 Proceedings of the 18th BioNLP Workshop and Shared Task  
Bio-MTDNN utilizes a "transfer learning"-based paradigm in which not only the source and target domains are different but also the source and target tasks are varied, although related.  ...  In this paper, we present the Biomedical Multi-Task Deep Neural Network (Bio-MTDNN) on the NLI task of the MediQA 2019 challenge (Ben Abacha et al., 2019).  ...  MT-DNN: In the second scenario of transfer learning, we augment the target task T_T with various related NLU tasks and train the model to perform on all of them.  ... 
doi:10.18653/v1/w19-5052 dblp:conf/bionlp/ChopraGK19 fatcat:6t7qrxx2m5e2znpk3sib4ykp4a

A Review on Learning Planning Action Models for Socio-Communicative HRI [article]

Ankuj Arora and Humbert Fiorino and Damien Pellier and Sylvie Pesty
2018 arXiv   pre-print
Thus, this entire interaction can be treated as a sequence of actions propelling the interaction from its initial to goal state, also known as a plan in the domain of AI planning.  ...  In the same domain, this action sequence that stems from plan execution can be represented as a trace.  ...  LAWS LAWS (Learn Action models with transferring knowledge from a related source domain via Web search) [31] makes use of action-models already created beforehand in other related domains, which are  ... 
arXiv:1810.09245v1 fatcat:5ga2klfbcfaife7wylepgsxtzq

Deep transfer via second-order Markov logic

Jesse Davis, Pedro Domingos
2009 Proceedings of the 26th Annual International Conference on Machine Learning - ICML '09  
In shallow transfer, test instances are from the same domain, but have a different distribution.  ...  In deep transfer, test instances are from a different domain entirely (i.e., described by different predicates).  ...  The views and conclusions contained in this document are those of the authors and should not be interpreted as necessarily representing the official policies, either expressed or implied, of ARO, DARPA  ... 
doi:10.1145/1553374.1553402 dblp:conf/icml/DavisD09 fatcat:47bijot2g5amtajp4wsb5q3vz4

Domain is a moving target for relational learning

Jeffrey S. Katz, Bradley R. Sturz, Anthony A. Wright
2010 Behavioural Processes  
The domain for relational learning was manipulated by varying the training set size for pigeons that had learned the same/different (S/D) concept.  ...  On the other hand, relational learning involves learning the relationship between/among stimuli. Relational learning is the basis of novel-stimulus transfer and hence abstract-concept learning.  ...  We wish to thank Kent Bodily, Michelle Hernández, and Jackie Rivera for assisting in the research.  ... 
doi:10.1016/j.beproc.2009.12.006 pmid:20006686 pmcid:PMC2824002 fatcat:ejlhstwtk5f7dfl66f34dkpjb4

Transferable End-to-End Aspect-based Sentiment Analysis with Selective Adversarial Learning

Zheng Li, Xin Li, Ying Wei, Lidong Bing, Yu Zhang, Qiang Yang
2019 Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)  
Dual adversarial neural transfer for low-resource named entity recognition. In ACL, pages 3461-3471.  ...  However, such formulation hinders the effectiveness of supervised methods due to the lack of annotated sequence data in many domains.  ...  learning in an unlabeled target domain.  ... 
doi:10.18653/v1/d19-1466 dblp:conf/emnlp/LiLWBZY19 fatcat:i2g3iouz3nc5zgzrvoa2k6yb5y

Transferable End-to-End Aspect-based Sentiment Analysis with Selective Adversarial Learning [article]

Zheng Li, Xin Li, Ying Wei, Lidong Bing, Yu Zhang, Qiang Yang
2019 arXiv   pre-print
However, such formulation hinders the effectiveness of supervised methods due to the lack of annotated sequence data in many domains.  ...  To resolve it, we propose a novel Selective Adversarial Learning (SAL) method to align the inferred correlation vectors that automatically capture their latent relations.  ...  learning in an unlabeled target domain.  ... 
arXiv:1910.14192v1 fatcat:egxfn4abive27oqws4idx7vqbm

Fast and Scalable Expansion of Natural Language Understanding Functionality for Intelligent Agents [article]

Anuj Goyal, Angeliki Metallinou, Spyros Matsoukas
2018 arXiv   pre-print
We propose efficient deep neural network architectures that maximally re-use available resources through transfer learning.  ...  Fast expansion of natural language functionality of intelligent virtual agents is critical for achieving engaging and informative interactions.  ...  Transfer learning refers to transferring the knowledge gained while performing a task in a source domain D_s to benefit a related task in a target domain D_t.  ... 
arXiv:1805.01542v1 fatcat:lek7j373j5a6fjz6im245at74u
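The snippet's definition of transfer learning (knowledge from a task in source domain D_s benefiting a related task in target domain D_t) can be made concrete with a toy warm-start experiment: fit a model on abundant source data, then use its weights to initialize training on scarce target data, versus training from scratch. All data, dimensions, and tasks below are synthetic assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Both tasks share a latent structure: labels are linear in the same 3 latent
# features, with slightly different heads in the source vs target domain.
proj = rng.normal(size=(10, 3))
h_src = np.array([1.0, -2.0, 0.5])
h_tgt = np.array([1.1, -1.8, 0.6])        # related but distinct target task

Xs = rng.normal(size=(500, 10))           # abundant source-domain data
ys = Xs @ (proj @ h_src)
Xt = rng.normal(size=(20, 10))            # scarce target-domain data
yt = Xt @ (proj @ h_tgt)

# "Pre-train" on the source task (closed-form least squares).
w_src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

def train_on_target(w0, steps=10, lr=0.05):
    """A few gradient-descent steps on the target-domain squared loss."""
    w = w0.copy()
    for _ in range(steps):
        grad = 2 * Xt.T @ (Xt @ w - yt) / len(Xt)
        w -= lr * grad
    return float(np.mean((Xt @ w - yt) ** 2))

warm_loss = train_on_target(w_src)          # transfer: initialize from source weights
cold_loss = train_on_target(np.zeros(10))   # no transfer: start from scratch
```

With the same training budget, the warm-started model ends up with lower target loss because the source optimum already lies close to the target optimum; that gap is exactly what "transferring the knowledge gained in D_s" buys.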
Showing results 1 — 15 out of 339,880 results