3,562 Hits in 3.9 sec

Sequential Recommendation with Self-Attentive Multi-Adversarial Network [article]

Ruiyang Ren, Zhaoyang Liu, Yaliang Li, Wayne Xin Zhao, Hui Wang, Bolin Ding, Ji-Rong Wen
2020 pre-print
In this paper, we present a Multi-Factor Generative Adversarial Network (MFGAN) for explicitly modeling the effect of context information on sequential recommendation.  ...  Existing neural sequential recommenders typically adopt a generative approach trained with Maximum Likelihood Estimation (MLE).  ...  In this paper, we have proposed a Multi-Factor Generative Adversarial Network (MFGAN) for sequential recommendation.  ... 
doi:10.1145/3397271.3401111 arXiv:2005.10602v1 fatcat:2blluqq6sve2jfnbazx5x5wpvq

MTPR: A Multi-Task Learning Based POI Recommendation Considering Temporal Check-Ins and Geographical Locations

Bin Xia, Yuxuan Bai, Junjie Yin, Qi Li, Lijie Xu
2020 Applied Sciences  
In this paper, we propose a multi-task learning-based POI recommender system which exploits the structure of generative adversarial networks (GANs), simultaneously considering temporal check-ins and  ...  The multi-task learning strategy is capable of combining the information of temporal check-ins and geographical locations to improve the performance of personalized POI recommendation.  ...  Vaswani et al. proposed a generic transformer based solely on a self-attention mechanism and also introduced a new form of attention named multi-head attention, which was inspired by the multiple kernels  ... 
doi:10.3390/app10196664 fatcat:7inzkhfcr5afppwphjcvkup5vu

SDM: Sequential Deep Matching Model for Online Large-scale Recommender System [article]

Fuyu Lv, Taiwei Jin, Changlong Yu, Fei Sun, Quan Lin, Keping Yang, Wilfred Ng
2019 arXiv   pre-print
We propose to encode behavior sequences with two corresponding components: a multi-head self-attention module to capture multiple types of interests and a long-short term gated fusion module to incorporate  ...  Successive items are recommended after matching the sequential user behavior vector against item embedding vectors.  ...  Sequential Deep Matching with Multi-head Attention is our multi-head self-attention enhanced model. • PSDMMA.  ... 
arXiv:1909.00385v2 fatcat:dyrojo7dvzbqjfp2bmfxfwmplu

Adversarial Neural Trip Recommendation [article]

Linlang Jiang, Jingbo Zhou, Tong Xu, Yanyan Li, Hao Chen, Jizhou Huang, Hui Xiong
2021 arXiv   pre-print
To that end, we propose an Adversarial Neural Trip Recommendation (ANT) framework to tackle the above challenges.  ...  Another novelty of ANT lies in an adversarial learning strategy integrated with reinforcement learning to guide the trip generator to produce high-quality trips.  ...  Concretely, the encoder takes advantage of multi-head self-attention to capture correlations among POIs.  ... 
arXiv:2109.11731v1 fatcat:vc7ne66l5bahrpbsrv3m3ns4c4

Self-Supervised Graph Co-Training for Session-based Recommendation [article]

Xin Xia, Hongzhi Yin, Junliang Yu, Yingxia Shao, Lizhen Cui
2021 arXiv   pre-print
In this paper, for informative session-based data augmentation, we combine self-supervised learning with co-training, and then develop a framework to enhance session-based recommendation.  ...  Compared with other recommendation paradigms, session-based recommendation suffers more from the problem of data sparsity due to the very limited short-term interactions.  ...  it with the sequential behavior to generate the recommendations. • STAMP [20] : adopts attention layers to replace all RNN encoders in the previous work and employs the self-attention mechanism [34]  ... 
arXiv:2108.10560v1 fatcat:ueds2mnk6be3pd5x3eujuyb6qe

A Survey on Deep Learning Based Point-Of-Interest (POI) Recommendations [article]

Md. Ashraful Islam, Mir Mahathir Mohammad, Sarkar Snigdha Sarathi Das, Mohammed Eunus Ali
2020 arXiv   pre-print
A POI recommendation technique essentially exploits users' historical check-ins and other multi-modal information such as POI attributes and friendship networks, to recommend the next set of POIs suitable  ...  Location-based Social Networks (LBSNs) enable users to socialize with friends and acquaintances by sharing their check-ins, opinions, photos, and reviews.  ...  [48] proposed GeoSAN, a geography-aware sequential recommender based on the self-attention network, which uses a geography-aware self-attention network and a geography encoder.  ... 
arXiv:2011.10187v1 fatcat:3uampnqerfdvnpuzrxcrsjviwq

Deep Learning for Sequential Recommendation: Algorithms, Influential Factors, and Evaluations [article]

Hui Fang, Danning Zhang, Yiheng Shu, Guibing Guo
2020 arXiv   pre-print
In the field of sequential recommendation, deep learning (DL)-based methods have received a lot of attention in the past few years and surpassed traditional models such as Markov chain-based and factorization-based  ...  In this view, this survey focuses on DL-based sequential recommender systems by taking the aforementioned issues into consideration.  ...  [140] designed a multi-order attention network which is instantiated with two k-layer residual networks to model individual-level and union-level item dependencies respectively. (2) Self-attention mechanisms  ... 
arXiv:1905.01997v3 fatcat:i7hvdiqjpnaupcq2osrblttb4u

Sentiment Enhanced Multi-modal Hashtag Recommendation for Micro-Videos

Chao Yang, Xiaochan Wang, Bin Jiang
2020 IEEE Access  
by an attention neural network.  ...  Specifically, the multi-modal content features and the multi-modal sentiment features are modeled by a content common space learning branch based on self-attention and a sentiment common space learning  ...  shared common space with the same length; 2) self-attentive content common space learning network which captures the sequential information of modalities with three parallel Bi-LSTMs; and 3) hashtag prediction  ... 
doi:10.1109/access.2020.2989473 fatcat:n4uxa34lj5hdzdbqflv5sjdqpm

Survey for Trust-aware Recommender Systems: A Deep Learning Perspective [article]

Manqing Dong, Feng Yuan, Lina Yao, Xianzhi Wang, Xiwei Xu, Liming Zhu
2020 arXiv   pre-print
A significant remaining challenge for existing recommender systems is that users may not trust them, due to either a lack of explanation or inaccurate recommendation results.  ...  This survey provides a systemic summary of three categories of trust-aware recommender systems: social-aware recommender systems that leverage users' social relationships; robust recommender systems that  ...  [44] propose a recurrent-network-based model with attention for temporal recommendation (see Figure 2 ).  ... 
arXiv:2004.03774v2 fatcat:q7mehir7hbbzpemw3q5fkby5ty

A Survey on Reinforcement Learning for Recommender Systems [article]

Yuanguo Lin, Yong Liu, Fan Lin, Pengcheng Wu, Wenhua Zeng, Chunyan Miao
2021 arXiv   pre-print
Nevertheless, various challenges arise when applying RL in recommender systems.  ...  Recently, Reinforcement Learning (RL) based recommender systems have become an emerging research topic.  ...  Finally, the profile reviser is trained with the recommendation model based on attention networks to provide better recommendation results.  ... 
arXiv:2109.10665v1 fatcat:whrqgxcb4fa53omquvpy6nitjm

HyperTeNet: Hypergraph and Transformer-based Neural Network for Personalized List Continuation [article]

Vijaikumar M, Deepesh Hada, Shirish Shevade
2021 arXiv   pre-print
We use graph convolutions to learn the multi-hop relationship among the entities of the same type and leverage a self-attention-based hypergraph neural network to learn the ternary relationships among  ...  In this work, we propose HyperTeNet -- a self-attention hypergraph and Transformer-based neural network architecture for the personalized list continuation task to address the challenges mentioned above  ...  SASRec [1] : Self-Attention based Sequential Recommendation (SASRec) uses attention mechanisms to capture long-term sequences by attentively selecting a few actions for next item recommendation.  ... 
arXiv:2110.01467v2 fatcat:todwosunifbjlb5z5lm2tkttia

2021 Index IEEE Transactions on Multimedia Vol. 23

2021 IEEE transactions on multimedia  
IPTV Channel Zapping Recommendation With Attention Mechanism; TMM 2021 365-377.  ...  Recommender systems: Adversarial Learning for Personalized Tag Recommendation; TMM 2021 2361-2371.  ...  Low-Rank Pairwise Alignment Bilinear Network for Few-Shot Fine-Grained Image Classification; TMM 2021 1666-1680.  ... 
doi:10.1109/tmm.2022.3141947 fatcat:lil2nf3vd5ehbfgtslulu7y3lq

An Attentive Survey of Attention Models [article]

Sneha Chaudhari, Varun Mithal, Gungor Polatkan, Rohan Ramanath
2021 arXiv   pre-print
The Attention Model has now become an important concept in neural networks that has been researched within diverse application domains.  ...  We also describe how attention has been used to improve the interpretability of neural networks. Finally, we discuss some future research directions in attention.  ...  Self-Attention Generative Adversarial Networks (SAGANs) [Zhang et al. 2019a ] introduce a self-attention mechanism into convolutional GANs by calculating the response at a position as a weighted sum of the  ... 
arXiv:1904.02874v3 fatcat:fyqgqn7sxzdy3efib3rrqexs74
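Several snippets in these results describe self-attention the same way: each position's response is a softmax-weighted sum over all positions. As a minimal illustrative sketch (pure Python, not taken from any of the listed papers), scaled dot-product self-attention with queries, keys, and values all equal to the input looks like this:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(X):
    """Scaled dot-product self-attention with Q = K = V = X.

    For each position q, compute similarity scores against every
    position k, normalize them with softmax, and return the weighted
    sum of all input vectors (a convex combination of positions).
    """
    d = len(X[0])
    out = []
    for q in X:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in X]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, X))
                    for j in range(d)])
    return out

# Three 2-dimensional positions; output has the same shape as the input.
out = self_attention([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
print(len(out), len(out[0]))  # 3 2
```

Multi-head attention, mentioned throughout these entries, runs several such maps in parallel on learned projections of the input and concatenates the results.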

New Ideas and Trends in Deep Multimodal Content Understanding: A Review [article]

Wei Chen and Weiping Wang and Li Liu and Michael S. Lew
2020 arXiv   pre-print
where monomodal image classifiers such as VGG, ResNet and Inception modules are central topics, this paper will examine recent multimodal deep models and structures, including auto-encoders, generative adversarial  ...  Furthermore, adversarial learning is also combined with a self-attention mechanism to obtain attended regions and unattended regions.  ...  Self-attention has been used in different ways. For example, Gao et al. [194] combine the attentive vectors from self-attention with co-attention using element-wise product.  ... 
arXiv:2010.08189v1 fatcat:2l7molbcn5hf3oyhe3l52tdwra

Improved Relativistic Cycle-consistent GAN with Dilated Residual Network and Multi-Attention for Speech Enhancement

Yutian Wang, Guochen Yu, Jingling Wang, Hui Wang, Qin Zhang
2020 IEEE Access  
The proposed multi-attention uses the attention mechanism in two different ways: attention gates in U-net [31] encoding-decoding layers (AU gate) and self-attention [32] in dilated residual networks [  ...  in the reconstruction stage and alleviate the loss of speech structure. 5) CRGAN with a multi-attention mechanism (MA-CRGAN): a multi-attention mechanism composed of the combination of attention U-net gates  ... 
doi:10.1109/access.2020.3029417 fatcat:62lbvfebbvegraumhbdmtrdc64
Showing results 1 — 15 out of 3,562 results