59,432 Hits in 2.7 sec

Contrastive Learning for Sequential Recommendation [article]

Xu Xie, Fei Sun, Zhaoyang Liu, Shiwen Wu, Jinyang Gao, Bolin Ding, Bin Cui
2021 arXiv   pre-print
To tackle that, inspired by recent advances of contrastive learning techniques in computer vision, we propose a novel multi-task model called Contrastive Learning for Sequential Recommendation (CL4SRec)  ...  Sequential recommendation methods play a crucial role in modern recommender systems because of their ability to capture a user's dynamic interest from her/his historical interactions.  ...  Specifically, we propose a novel model called Contrastive Pre-training for Sequential Recommendation (CP4Rec).  ... 
arXiv:2010.14395v2 fatcat:2pissecqs5dopo5nderaunptyq
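The CL4SRec-style approach above learns by contrasting two augmented views of the same user's interaction sequence. A minimal sketch of that idea, with an item-mask augmentation, a toy mean-pooling encoder standing in for the Transformer, and an InfoNCE loss (the function names and embedding sizes here are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def mask_items(seq, mask_ratio=0.3, mask_id=0):
    """One augmentation: randomly replace items with a reserved [mask] id."""
    seq = seq.copy()
    n_mask = max(1, int(len(seq) * mask_ratio))
    idx = rng.choice(len(seq), size=n_mask, replace=False)
    seq[idx] = mask_id
    return seq

def encode(seq, item_emb):
    """Toy sequence encoder: mean-pool item embeddings (stand-in for a Transformer)."""
    return item_emb[seq].mean(axis=0)

def info_nce(z1, z2, temperature=0.5):
    """InfoNCE: the two views of each sequence are positives; other sequences in
    the batch serve as negatives."""
    z = np.concatenate([z1, z2], axis=0)                # (2B, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)    # cosine-similarity space
    sim = z @ z.T / temperature                         # (2B, 2B)
    np.fill_diagonal(sim, -np.inf)                      # exclude self-similarity
    B = len(z1)
    pos = np.concatenate([np.arange(B, 2 * B), np.arange(B)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * B), pos].mean()

# Two toy user histories over a 10-item vocabulary (id 0 reserved for [mask]).
item_emb = rng.normal(size=(10, 8))
batch = [np.array([1, 2, 3, 4, 5]), np.array([6, 7, 8, 9, 1])]
z1 = np.stack([encode(mask_items(s), item_emb) for s in batch])
z2 = np.stack([encode(mask_items(s), item_emb) for s in batch])
loss = info_nce(z1, z2)
print(float(loss))
```

In a full model this contrastive loss is optimized jointly with the next-item prediction objective, which is what makes CL4SRec a multi-task framework.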

Parameter-Efficient Transfer from Sequential Behaviors for User Modeling and Recommendation [article]

Fajie Yuan, Xiangnan He, Alexandros Karatzoglou, Liguang Zhang
2020 arXiv   pre-print
Fine-tuning a large pre-trained network and adapting it to downstream tasks is an effective way to solve such tasks.  ...  However, fine-tuning is parameter inefficient considering that an entire model needs to be re-trained for every new task.  ...  This subsection offers several insightful findings: (1) By contrasting PeterRecal and PeterRecon in Table 6, we can draw the conclusion that better pre-training models for sequential recommendation may  ... 
arXiv:2001.04253v4 fatcat:ulvc7rflw5gjpnfnd6cvey34ae

Contrastive Learning for Representation Degeneration Problem in Sequential Recommendation [article]

Ruihong Qiu, Zi Huang, Hongzhi Yin, Zijian Wang
2021 arXiv   pre-print
Recent advancements of sequential deep learning models such as Transformer and BERT have significantly facilitated sequential recommendation.  ...  Specifically, in light of the uniformity property of contrastive learning, a contrastive regularization is designed for DuoRec to reshape the distribution of sequence representations.  ...  • S³Rec_MIP [55] applied masked contrastive pre-training as well.  ... 
arXiv:2110.05730v1 fatcat:6vwxwzylgbhsbc7op7sooskzcm

Contrastive Self-supervised Sequential Recommendation with Robust Augmentation [article]

Zhiwei Liu, Yongjun Chen, Jia Li, Philip S. Yu, Julian McAuley, Caiming Xiong
2021 arXiv   pre-print
To this end, we propose a novel framework, Contrastive Self-supervised Learning for sequential Recommendation (CoSeRec).  ...  It is challenging to devise a contrastive SSL framework for a sequential recommendation, due to its discrete nature, correlations among items, and skewness of length distributions.  ...  To this end, we propose a new framework, Contrastive Self-Supervised learning for Sequential Recommendation (CoSeRec).  ... 
arXiv:2108.06479v1 fatcat:tamq3iuqbjezlnwcnnxzfvre6q

Memory Augmented Multi-Instance Contrastive Predictive Coding for Sequential Recommendation [article]

Ruihong Qiu, Zi Huang, Hongzhi Yin
2021 arXiv   pre-print
Most existing sequential recommender models consider the next item prediction task as the training signal.  ...  The sequential recommendation aims to recommend items, such as products, songs and places, to users based on the sequential patterns of their historical records.  ...  after the contrastive pre-training.  ... 
arXiv:2109.00368v3 fatcat:vzzv2acs6vhahimvq6ildgh2w4

Hyperbolic Hypergraphs for Sequential Recommendation [article]

Yicong Li, Hongxu Chen, Xiangguo Sun, Zhenchao Sun, Lin Li, Lizhen Cui, Philip S. Yu, Guandong Xu
2021 arXiv   pre-print
(H2SeqRec) with pre-training phase.  ...  Specifically, we design three self-supervised tasks to obtain the pre-training item embeddings to feed or fuse into the following recommendation architecture (with two ways to use the pre-trained embeddings  ...  Hyperbolic Hypergraph representation learning method for Sequential Recommendation (H²SeqRec) with pre-training phase.  ... 
arXiv:2108.08134v1 fatcat:zifmaxbgovdffazhmeyop447y4

UPRec: User-Aware Pre-training for Recommender Systems [article]

Chaojun Xiao, Ruobing Xie, Yuan Yao, Zhiyuan Liu, Maosong Sun, Xu Zhang, Leyu Lin
2021 arXiv   pre-print
In this paper, we propose a method to enhance pre-trained models with heterogeneous user information, called User-aware Pre-training for Recommendation (UPRec).  ...  Existing sequential recommendation methods rely on large amounts of training data and usually suffer from the data sparsity problem.  ...  [20] further propose to utilize contrastive pre-training framework for sequential recommendation.  ... 
arXiv:2102.10989v1 fatcat:fsur7dod6vcurlauxqxtlkbosi

Pre-training of Context-aware Item Representation for Next Basket Recommendation [article]

Jingxuan Yang, Jun Xu, Jianzhuo Tong, Sheng Gao, Jun Guo, Jirong Wen
2019 arXiv   pre-print
Inspired by the pre-trained representations of BERT in natural language processing, we propose to conduct context-aware item representation for next basket recommendation, called Item Encoder Representations  ...  In the online recommendation phase, the pre-trained model is further fine-tuned with an additional output layer.  ...  To adapt for next basket recommendation, the original two pre-training tasks in BERT are modified.  ... 
arXiv:1904.12604v1 fatcat:au5t5hyhnfae7nvcocndobtmge

A Generic Network Compression Framework for Sequential Recommender Systems [article]

Yang Sun, Fajie Yuan, Min Yang, Guoao Wei, Zhou Zhao, Duo Liu
2020 arXiv   pre-print
Sequential recommender systems (SRS) have become the key technology in capturing user's dynamic interests and generating high-quality recommendations.  ...  To resolve the issues, we propose a compressed sequential recommendation framework, termed as CpRec, where two generic model shrinking techniques are employed.  ...  (4) RQ4: Since sequential recommender models can also be applied for the pre-training and fine-tuning-based transfer learning task [41], does CpRec work as well as the non-compressed model for such a task  ... 
arXiv:2004.13139v5 fatcat:7hiwmkmwpngm7nq7uyrbsfvyom

Improving Sequential Recommendation Consistency with Self-Supervised Imitation [article]

Xu Yuan, Hongshen Chen, Yonghao Song, Xiaofang Zhao, Zhuoye Ding, Zhen He, Bo Long
2021 arXiv   pre-print
Precisely, we extract the consistency knowledge by utilizing three self-supervised pre-training tasks, where temporal consistency and persona consistency capture user-interaction dynamics in terms of the  ...  As a result, the sequential recommender is prone to make inconsistent predictions.  ...  Consistency-enhanced pre-training models serve as teachers, and a sequential recommendation model is treated as a student.  ... 
arXiv:2106.14031v2 fatcat:ze3mhsukpzfw5pk6oru34dpzu4

Self-supervised Learning for Large-scale Item Recommendations [article]

Tiansheng Yao, Xinyang Yi, Derek Zhiyuan Cheng, Felix Yu, Ting Chen, Aditya Menon, Lichan Hong, Ed H. Chi, Steve Tjoa, Jieqi Kang, Evan Ettinger
2021 arXiv   pre-print
We evaluate our framework using two real-world datasets with 500M and 1B training examples respectively.  ...  To model the input space with large-vocab categorical features, a typical recommender model learns a joint embedding space through neural networks for both queries and items from user feedback data.  ...  In recommender systems, a line of research has been recently studied for utilizing self-supervised learning for sequential recommendation.  ... 
arXiv:2007.12865v4 fatcat:euu7phtharckdbwki3cfceqmq4

Category-Aware Location Embedding for Point-of-Interest Recommendation

Hossein A. Rahmani, Mohammad Aliannejadi, Rasoul Mirzaei Zadeh, Mitra Baratchi, Mohsen Afsharchi, Fabio Crestani
2019 Proceedings of the 2019 ACM SIGIR International Conference on Theory of Information Retrieval - ICTIR '19  
With the recent advances of neural models, much work has sought to leverage neural networks to learn neural embeddings in a pre-training phase that achieve an improved representation of POIs and consequently  ...  a better recommendation.  ...  This model takes our pre-trained POI embeddings and learns user embeddings to produce a top-k recommendation list.  ... 
doi:10.1145/3341981.3344240 dblp:conf/ictir/RahmaniAZBAC19 fatcat:4fsj5ughivblbgvvtzseu7uupi

StackRec: Efficient Training of Very Deep Sequential Recommender Models by Layer Stacking [article]

Jiachun Wang, Fajie Yuan, Jian Chen, Qingyao Wu, Chengmin Li, Min Yang, Yang Sun, Guoxiao Zhang
2020 arXiv   pre-print
Deep learning has brought great progress for the sequential recommendation (SR) tasks.  ...  Enlightened by this, we propose progressively stacking such pre-trained residual layers/blocks so as to yield a deeper but easier-to-train SR model.  ...  • ML20: It was provided by MovieLens, which is widely adopted for both non-sequential and sequential recommendations [10, 18, 19, 21].  ... 
arXiv:2012.07598v1 fatcat:3kvpwbcasvdfxbqh4eiokhwp2e

Knowledge Transfer via Pre-training for Recommendation: A Review and Prospect

Zheni Zeng, Chaojun Xiao, Yuan Yao, Ruobing Xie, Zhiyuan Liu, Fen Lin, Leyu Lin, Maosong Sun
2021 Frontiers in Big Data  
In this survey, we first provide a review of recommender systems with pre-training. In addition, we show the benefits of pre-training to recommender systems through experiments.  ...  Finally, we discuss several promising directions for future research of recommender systems with pre-training. The source code of our experiments will be available to facilitate future research.  ...  In contrast, model transfer achieves much better performance for deep BERT4Rec model.  ... 
doi:10.3389/fdata.2021.602071 pmid:33817631 pmcid:PMC8013982 fatcat:oz2da4xwz5ad3meqciozon2teq

Deep Learning for Sequential Recommendation: Algorithms, Influential Factors, and Evaluations [article]

Hui Fang, Danning Zhang, Yiheng Shu, Guibing Guo
2020 arXiv   pre-print
However, there is little systematic study on DL-based methods, especially regarding how to design an effective DL model for sequential recommendation.  ...  In this view, this survey focuses on DL-based sequential recommender systems by taking the aforementioned issues into consideration.  ...  However, in the sequential recommendation, it is rather challenging to pre-train an embedding model (e.g., word2vec) as the item information and the dependency relationships among items are constantly  ... 
arXiv:1905.01997v3 fatcat:i7hvdiqjpnaupcq2osrblttb4u
Showing results 1 — 15 out of 59,432 results