2 Hits in 0.97 sec

UserBERT: Contrastive User Model Pre-training [article]

Chuhan Wu, Fangzhao Wu, Yang Yu, Tao Qi, Yongfeng Huang, Xing Xie
2021 arXiv   pre-print
In this paper, we propose a contrastive user model pre-training method named UserBERT.  ...  Two self-supervision tasks are incorporated in UserBERT for user model pre-training on unlabeled user behavior data to empower user modeling.  ...  In our approach, we pre-train the user models in two contrastive self-supervision tasks.  ...
arXiv:2109.01274v1 fatcat:64ijwcctergzjotk3f4ch2hxca
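
The snippet above describes pre-training a user model with two contrastive self-supervision tasks on unlabeled behavior data. As a rough illustration of the general idea only (not the authors' implementation), the sketch below pre-trains a toy behavior encoder with an in-batch InfoNCE objective, treating two behavior segments from the same user as a positive pair; all module and parameter names are hypothetical.

```python
# Minimal sketch of contrastive user-model pre-training in the spirit of
# UserBERT (Wu et al., 2021). All names below are hypothetical illustrations,
# not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyUserEncoder(nn.Module):
    """Encodes a sequence of behavior IDs into a single user vector."""
    def __init__(self, num_behaviors=10_000, dim=128):
        super().__init__()
        self.embed = nn.Embedding(num_behaviors, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, behavior_ids):                 # (batch, seq_len)
        h = self.encoder(self.embed(behavior_ids))   # (batch, seq_len, dim)
        return h.mean(dim=1)                         # simple pooling -> (batch, dim)

def info_nce_loss(anchor, positive, temperature=0.1):
    """In-batch contrastive loss: two behavior segments from the same user form
    a positive pair; segments from other users in the batch act as negatives."""
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    logits = anchor @ positive.t() / temperature     # (batch, batch) similarity matrix
    labels = torch.arange(anchor.size(0))            # diagonal entries are the positives
    return F.cross_entropy(logits, labels)

# Usage: split each user's behavior log into two disjoint segments.
encoder = ToyUserEncoder()
seg_a = torch.randint(0, 10_000, (8, 20))   # 8 users, 20 behaviors each
seg_b = torch.randint(0, 10_000, (8, 20))
loss = info_nce_loss(encoder(seg_a), encoder(seg_b))
loss.backward()
```

This only shows the shape of one contrastive objective; the paper's actual encoder, pooling, and second self-supervision task are defined in the article itself.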

UserBERT: Modeling Long- and Short-Term User Preferences via Self-Supervision [article]

Tianyu Li, Ali Cevahir, Derek Cho, Hao Gong, DuyKhuong Nguyen, Bjorn Stenger
2022 arXiv   pre-print
This paper extends the BERT model to e-commerce user data for pre-training representations in a self-supervised manner.  ...  We propose methods for the tokenization of different types of user behavior sequences, the generation of input representation vectors, and a novel pretext task to enable the pre-trained model to learn  ...  Figure 5: Effect of pretraining on UserBERT (ROC AUC vs. number of training epochs for different models on two user attribute prediction tasks).  ...
arXiv:2202.07605v1 fatcat:eoom2j4c25emrpa4qnfmvka2gi
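
The second result pre-trains on tokenized e-commerce behavior sequences with a self-supervised pretext task. As an illustration of that general pre-training pattern, the sketch below applies a standard BERT-style masked-behavior objective to behavior tokens; it is a stand-in for the paper's own (different) pretext task, and every name and constant in it is hypothetical.

```python
# Hedged sketch of a BERT-style masked-behavior pretext task over tokenized
# user behavior sequences. This is a generic stand-in, not the paper's
# specific pretext task; all names below are assumptions for illustration.
import torch
import torch.nn as nn

VOCAB_SIZE = 5_000      # assumed size of the behavior-token vocabulary
MASK_ID = 0             # assumed ID reserved for the [MASK] token

class MaskedBehaviorModel(nn.Module):
    def __init__(self, dim=128):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(dim, VOCAB_SIZE)    # predicts the original token per position

    def forward(self, token_ids):                 # (batch, seq_len)
        return self.head(self.encoder(self.embed(token_ids)))

def mask_tokens(token_ids, mask_prob=0.15):
    """Randomly replace a fraction of behavior tokens with [MASK]; the model
    is trained to recover the originals (loss on masked positions only)."""
    labels = token_ids.clone()
    masked = torch.rand_like(token_ids, dtype=torch.float) < mask_prob
    token_ids = token_ids.masked_fill(masked, MASK_ID)
    labels[~masked] = -100                        # ignore unmasked positions in the loss
    return token_ids, labels

model = MaskedBehaviorModel()
tokens = torch.randint(1, VOCAB_SIZE, (8, 30))    # 8 users, 30 tokenized behaviors each
inputs, labels = mask_tokens(tokens)
logits = model(inputs)                            # (8, 30, VOCAB_SIZE)
loss = nn.functional.cross_entropy(
    logits.view(-1, VOCAB_SIZE), labels.view(-1), ignore_index=-100)
loss.backward()
```

The pre-trained encoder would then be fine-tuned on downstream user attribute prediction tasks, which is the setting evaluated in the paper's ROC AUC comparison.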