Neural Network based Deep Transfer Learning for Cross-domain Dependency Parsing
[article]
2019
arXiv
pre-print
In addition, to adapt three different domains, we utilize neural network based deep transfer learning, which transfers the pre-trained partial network in the source domain to be part of the deep neural network ...
Considering the importance of context, we utilize a self-attention mechanism for the representation vectors to capture the meaning of words. ...
Inspired by the recent success of transfer learning in many natural language processing problems, we utilize neural network based deep transfer learning for cross-domain dependency parsing. ...
arXiv:1908.02895v1
fatcat:4cb2dmbgubfdhlcybqh22olvwi
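The snippet above mentions applying self-attention to word representation vectors; a minimal NumPy sketch of scaled dot-product self-attention (all shapes, weights, and names here are illustrative assumptions, not the paper's code):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of word vectors.

    X: (seq_len, d_model) word representation vectors.
    Wq, Wk, Wv: (d_model, d_k) projection matrices.
    Returns context-aware vectors of shape (seq_len, d_k).
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # pairwise word affinities
    weights = softmax(scores, axis=-1)       # each row sums to 1
    return weights @ V                       # weighted mix over all words

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))                  # 5 words, 8-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (5, 4)
```

Each output row mixes information from every word in the sentence, which is how attention captures context beyond a fixed window.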
Graph Domain Adversarial Transfer Network for Cross-Domain Sentiment Classification
2021
IEEE Access
INDEX TERMS Adversarial transfer learning, cross-domain sentiment classification, gradient reversal layer, projection mechanism. ...
Therefore, from a new perspective, this paper proposes the Graph Domain Adversarial Transfer Network (GDATN) based on the idea of adversarial learning, which uses the labeled source domain data to predict ...
Therefore, models based on deep neural networks are widely used to extract feature representations for cross-domain sentiment classification. Glorot et al. ...
doi:10.1109/access.2021.3061139
fatcat:fzyscc7ytvdyros34ombxjljpi
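The index terms above include a gradient reversal layer (GRL), the core trick in adversarial transfer models like GDATN; a minimal framework-free sketch (the class name and lambda value are assumptions for illustration, not the paper's implementation):

```python
import numpy as np

class GradientReversal:
    """Gradient reversal layer as used in adversarial domain adaptation.

    The forward pass is the identity; the backward pass multiplies incoming
    gradients by -lambda, so the feature extractor is trained to *confuse*
    the domain classifier while the classifier itself still learns normally.
    """
    def __init__(self, lam=1.0):
        self.lam = lam

    def forward(self, x):
        return x  # identity: features pass through unchanged

    def backward(self, grad_output):
        return -self.lam * grad_output  # flip and scale the gradient

grl = GradientReversal(lam=0.5)
x = np.array([1.0, -2.0, 3.0])
y = grl.forward(x)                  # identical to x
g = grl.backward(np.ones_like(x))   # [-0.5, -0.5, -0.5]
```

In an autograd framework this would be registered as a custom backward function; the two-line behavior (identity forward, negated gradient backward) is all the layer does.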
Deep Pivot-Based Modeling for Cross-language Cross-domain Transfer with Minimal Guidance
2018
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
While cross-domain and cross-language transfer have long been prominent topics in NLP research, their combination has hardly been explored. ...
In this work we consider this problem, and propose a framework that builds on pivot-based learning, structure-aware Deep Neural Networks (particularly LSTMs and CNNs) and bilingual word embeddings, with ...
We also thank Ivan Vulić for guiding us in the world of multilingual word embeddings. ...
doi:10.18653/v1/d18-1022
dblp:conf/emnlp/ZiserR18
fatcat:4qrhn5jynncjnk6rvt36kyxxtm
Graph Adaptive Semantic Transfer for Cross-domain Sentiment Classification
[article]
2022
arXiv
pre-print
Cross-domain sentiment classification (CDSC) aims to use the transferable semantics learned from the source domain to predict the sentiment of reviews in the unlabeled target domain. ...
Specifically, we present Graph Adaptive Semantic Transfer (GAST) model, an adaptive syntactic graph embedding method that is able to learn domain-invariant semantics from both word sequences and syntactic ...
Meanwhile, the graph-based models outperform the sequential model LSTM and some transferable models (e.g., SFA and DANN) by a large margin, proving that the graphical syntactic structure is important for cross-domain ...
arXiv:2205.08772v1
fatcat:kgzhmx5y3bbxjnaragigbsd3tq
Sentiment Classification for Financial Texts Based on Deep Learning
2021
Computational Intelligence and Neuroscience
The proposed method is a cross-domain transfer-learning-based method. ...
The proposed method in this paper uses the reviews of Amazon Books, DVDs, electronics, and kitchen appliances as the source domain for cross-domain learning, and the classification accuracy rates can reach ...
not suitable for the application in our paper. (2) The second category is feature representation learning methods. The authors in [14, 15] propose that cross-domain representation learning can be performed ...
doi:10.1155/2021/9524705
pmid:34671395
pmcid:PMC8523278
fatcat:ssgbkjdgenesjkhc337gwx62c4
A Convolution-LSTM-Based Deep Neural Network for Cross-Domain MOOC Forum Post Classification
2017
Information
First, we learn the feature representation for each word by considering the local contextual feature via the convolution operation. ...
In this paper, considering the biases among different courses, we propose a transfer learning framework based on a convolutional neural network and a long short-term memory model, called ConvL, to automatically ...
Cross-Domain Transfer Learning: Transfer learning is a method of resolving a lack of labeled data in cross-domain classification. ...
doi:10.3390/info8030092
fatcat:4mc53e4t4bf4xg4s6jjvru5apu
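The first snippet above describes learning each word's local contextual feature via a convolution; a minimal NumPy sketch of a single 1-D convolution filter over word-embedding windows (the trigram width, dimensions, and pooling step are illustrative assumptions):

```python
import numpy as np

def conv1d_words(E, W, b):
    """Convolve one filter over k-word embedding windows.

    E: (seq_len, d) word embeddings; W: (k, d) filter over k-word windows;
    b: scalar bias. Returns one local-context feature per valid window,
    shape (seq_len - k + 1,), after a ReLU non-linearity.
    """
    k = W.shape[0]
    n = E.shape[0] - k + 1
    feats = np.array([np.sum(E[i:i + k] * W) + b for i in range(n)])
    return np.maximum(feats, 0.0)  # ReLU

rng = np.random.default_rng(1)
E = rng.normal(size=(7, 4))        # 7 words, 4-dim embeddings
W = rng.normal(size=(3, 4))        # one trigram filter
feats = conv1d_words(E, W, 0.0)
pooled = feats.max()               # max-over-time pooling -> one feature
print(feats.shape)                 # (5,)
```

A real model applies many such filters and feeds the pooled features to the LSTM half of the architecture; this sketch only shows the window-wise convolution itself.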
Syntactically-Meaningful and Transferable Recursive Neural Networks for Aspect and Opinion Extraction
2019
Computational Linguistics
The conditional domain adversarial network helps to learn domain-invariant hidden representation for each word conditioned on the syntactic structure. ...
In this paper, we explore the constructions of recursive neural networks based on the dependency tree of each sentence for associating syntactic structure with feature learning. ...
At the same time, a conditional domain adversarial network is incorporated to learn domain-invariant word features based on their inherent syntactic structure. ...
doi:10.1162/coli_a_00362
fatcat:topw3vnee5ao7aevhc6sd7axdq
Cross Domain Sentiment Classification Techniques: A Review
2019
International Journal of Computer Applications
In this paper we present literature review of methods and techniques employed for cross domain sentiment analysis. ...
The social media corpus can span many different domains. It is difficult to get annotated data of all domains that can be used to train a learning model. ...
Feature Based Techniques: In the feature representation and transfer method [27], the main task is feature representation. ...
doi:10.5120/ijca2019918338
fatcat:uubvhfmyg5bmrdjioqis72zlhi
Cross-Domain Text Sentiment Analysis Based on CNN_FT Method
2019
Information
Transfer learning is one of the popular methods for solving the problem that models built on the source domain cannot be directly applied to the target domain in cross-domain sentiment classification ...
This paper proposes a transfer learning method based on the multi-layer convolutional neural network (CNN). ...
Related Works
Transfer Learning: Transfer learning mainly addresses the distribution differences in cross-domain problems. ...
doi:10.3390/info10050162
fatcat:wal7czvtzjdjxgywtyaopnqcka
Cross-domain Aspect Category Transfer and Detection via Traceable Heterogeneous Graph Representation Learning
2019
Proceedings of the 28th ACM International Conference on Information and Knowledge Management - CIKM '19
In this study, we propose a novel problem, cross-domain aspect category transfer and detection, which faces three challenges: various feature spaces, different data distributions, and diverse output spaces ...
To address these problems, we propose an innovative solution, Traceable Heterogeneous Graph Representation Learning (THGRL). ...
Conclusion: In this paper, we propose a traceable heterogeneous graph representation learning model (THGRL) for cross-domain aspect category transfer and detection. ...
doi:10.1145/3357384.3357989
dblp:conf/cikm/JiangWZSLL19
fatcat:35qh5tblhjfqfpm3lxfv3ile24
Transfer learning for speech and language processing
2015
2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA)
Transfer learning is closely related to multi-task learning (cross-lingual vs. multilingual), and is traditionally studied in the name of 'model adaptation'. ...
Transfer learning is a vital technique that generalizes models trained for one setting or task to other settings or tasks. ...
Deep learning provides an elegant way for cross-domain transfer learning, with its great power in learning high-level representations shared by multiple modalities [54] . ...
doi:10.1109/apsipa.2015.7415532
dblp:conf/apsipa/WangZ15
fatcat:oby5enn52batdhoewb4n3ufo4y
Transfer Learning for Speech and Language Processing
[article]
2015
arXiv
pre-print
Transfer learning is closely related to multi-task learning (cross-lingual vs. multilingual), and is traditionally studied in the name of 'model adaptation'. ...
Transfer learning is a vital technique that generalizes models trained for one setting or task to other settings or tasks. ...
Deep learning provides an elegant way for cross-domain transfer learning, with its great power in learning high-level representations shared by multiple modalities [54] . ...
arXiv:1511.06066v1
fatcat:vzl3rb5oqvauxk3cva6t5r7jzy
Identifying Transferable Information Across Domains for Cross-domain Sentiment Classification
2018
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
In this paper, we propose that words that do not change their polarity and significance represent the transferable (usable) information across domains for cross-domain sentiment classification. ...
We present a novel approach based on the χ² test and cosine similarity between context vectors of words to identify polarity-preserving significant words across domains. ...
Section 7 concludes the paper. ...
Related Work: The most significant efforts in learning transferable knowledge for cross-domain text classification are Structural Correspondence Learning (SCL) ...
doi:10.18653/v1/p18-1089
dblp:conf/acl/BhattacharyyaDS18
fatcat:hpwymndgcnagvnjkd4etyikrge
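The snippet above combines a χ² significance test with cosine similarity of context vectors; a minimal NumPy sketch of both ingredients (the contingency counts, vectors, and thresholds are made-up illustrative values, and the selection rule is a simplification of the paper's method):

```python
import numpy as np

def chi2_stat(a, b, c, d):
    """Chi-square statistic for a 2x2 contingency table, e.g.
    a = positive docs containing the word, b = positive docs without it,
    c = negative docs containing the word, d = negative docs without it."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# A word counts as transferable if it is significantly associated with
# sentiment in BOTH domains and its context vectors across domains agree
# (i.e., its polarity is preserved).
stat_src = chi2_stat(45, 55, 10, 90)   # word vs. sentiment, source domain
stat_tgt = chi2_stat(40, 60, 12, 88)   # same word, target domain
ctx_src = np.array([0.9, 0.1, 0.3])    # context vector in the source domain
ctx_tgt = np.array([0.8, 0.2, 0.25])   # context vector in the target domain
sim = cosine(ctx_src, ctx_tgt)
transferable = min(stat_src, stat_tgt) > 3.84 and sim > 0.9
# 3.84 is the chi-square critical value at p = 0.05 with 1 degree of freedom
```

For real data, `scipy.stats.chi2_contingency` applies the same test with continuity correction; the closed form above is the uncorrected 2x2 case.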
Simultaneous Learning of Pivots and Representations for Cross-Domain Sentiment Classification
2020
Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence and the Twenty-Eighth Innovative Applications of Artificial Intelligence Conference
Our model consists of two networks: a Pivot Selector that learns to detect transferable n-gram pivots from contexts, and a Transferable Transformer that learns to generate domain-invariant representations ...
Towards learning the pivots and representations simultaneously, we propose a new Transferable Pivot Transformer (TPT). ...
In this paper, we incorporate pivots into representation learning and propose the TPT model for cross-domain sentiment classification. ...
doi:10.1609/aaai.v34i05.6336
fatcat:3wbotv24w5gs5oeucrowtbspju
Low-Resource Cross-Domain Product Review Sentiment Classification Based on a CNN with an Auxiliary Large-Scale Corpus
2017
Algorithms
Glorot et al. [19] proposed a deep learning method to learn an effective representation for domain adaptation based on stacked denoising auto-encoders. ...
By introducing domain labels and sentiment labels for loss functions based on KL divergence, the model could learn a more accurate domain-specialized and sentiment-specialized representation. ...
The labeled data of the target domain is crucial for learning an effective feature representation for cross-domain sentiment classification. ...
doi:10.3390/a10030081
fatcat:5hlcodvvpreapi6utyoegcxfk4
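The second snippet above mentions loss functions based on KL divergence over domain and sentiment labels; a minimal sketch of the divergence term (the distributions and the trade-off weight are illustrative assumptions, not the paper's values):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete probability distributions.
    eps guards against log(0) for zero-probability entries."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    return float(np.sum(p * np.log(p / q)))

# Domain loss pushes the predicted domain distribution toward the true
# domain label; the sentiment loss does the same with sentiment labels.
# A joint objective might look like L = L_sentiment + alpha * L_domain,
# where alpha is an assumed trade-off weight.
pred_domain = [0.7, 0.3]     # model's guess: source vs. target
true_domain = [1.0, 0.0]     # this example actually comes from the source
loss_domain = kl_divergence(true_domain, pred_domain)
```

With a one-hot target, KL reduces to the cross-entropy of the predicted probability for the true class, which is why it doubles as a classification loss here.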