UserBERT: Contrastive User Model Pre-training
[article] · arXiv pre-print · 2021
In this paper, we propose a contrastive user model pre-training method named UserBERT. ...
Two self-supervision tasks are incorporated in UserBERT for user model pre-training on unlabeled user behavior data to empower user modeling. ...
In our approach, we pre-train the user models in two contrastive self-supervision tasks. ...
arXiv:2109.01274v1
fatcat:64ijwcctergzjotk3f4ch2hxca
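The snippet above describes contrastive self-supervision over unlabeled user behavior data. The following is a minimal, hypothetical PyTorch sketch of that general idea: two views of each user's behavior sequence are encoded and pulled together with an in-batch InfoNCE loss. The encoder architecture, the masking-based augmentation, and all hyperparameters here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of contrastive user-model pre-training in the spirit of
# the abstract above (arXiv:2109.01274). All names and choices are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BehaviorEncoder(nn.Module):
    """Encodes a sequence of behavior IDs into a single user vector."""
    def __init__(self, vocab_size=10000, dim=128, n_layers=2, n_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, behavior_ids):
        h = self.encoder(self.embed(behavior_ids))   # (batch, seq, dim)
        return h.mean(dim=1)                          # mean-pool into a user vector

def augment(behavior_ids, drop_prob=0.2):
    """Randomly mask behaviors (assumed mask id 0) to create a second view of the user."""
    mask = torch.rand_like(behavior_ids, dtype=torch.float) < drop_prob
    return behavior_ids.masked_fill(mask, 0)

def info_nce_loss(z1, z2, temperature=0.1):
    """In-batch contrastive loss: matching views are positives, other users are negatives."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature                # (batch, batch) similarities
    targets = torch.arange(z1.size(0))
    return F.cross_entropy(logits, targets)

# Usage: one pre-training step on a toy batch of unlabeled behavior sequences.
encoder = BehaviorEncoder()
behaviors = torch.randint(1, 10000, (8, 50))          # 8 users, 50 behaviors each
loss = info_nce_loss(encoder(behaviors), encoder(augment(behaviors)))
loss.backward()
```

The in-batch negative scheme is a common default for contrastive pre-training; the paper's actual two self-supervision tasks may differ.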
UserBERT: Modeling Long- and Short-Term User Preferences via Self-Supervision
[article] · arXiv pre-print · 2022
This paper extends the BERT model to e-commerce user data for pre-training representations in a self-supervised manner. ...
We propose methods for the tokenization of different types of user behavior sequences, the generation of input representation vectors, and a novel pretext task to enable the pre-trained model to learn ...
Figure 5: Effect of pre-training on UserBERT (ROC AUC vs. number of training epochs for different models on two user attribute prediction tasks). ...
arXiv:2202.07605v1
fatcat:eoom2j4c25emrpa4qnfmvka2gi
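The abstract above mentions tokenizing heterogeneous user behavior sequences, generating input representation vectors, and a pretext task for self-supervised pre-training. The sketch below illustrates one plausible reading, assuming summed token, behavior-type, and position embeddings and a masked-behavior prediction objective; the vocabularies, the behavior types, and the masking scheme are assumptions for illustration and are not taken from the paper.

```python
# Hypothetical sketch of behavior tokenization plus a masked-behavior pretext
# task, loosely inspired by the abstract above (arXiv:2202.07605).
import torch
import torch.nn as nn
import torch.nn.functional as F

class UserInputEmbedding(nn.Module):
    """Builds input vectors from behavior-token, behavior-type, and position embeddings."""
    def __init__(self, vocab_size=20000, n_types=4, max_len=128, dim=128):
        super().__init__()
        self.token = nn.Embedding(vocab_size, dim)   # e.g. item / query / category IDs
        self.type = nn.Embedding(n_types, dim)       # e.g. view / cart / purchase / search
        self.pos = nn.Embedding(max_len, dim)

    def forward(self, token_ids, type_ids):
        positions = torch.arange(token_ids.size(1), device=token_ids.device)
        return self.token(token_ids) + self.type(type_ids) + self.pos(positions)

def masked_behavior_step(embed, encoder, head, token_ids, type_ids, mask_prob=0.15):
    """One pretext-task step: mask random behaviors and predict their original IDs."""
    mask = torch.rand(token_ids.shape) < mask_prob
    corrupted = token_ids.masked_fill(mask, 0)       # 0 = [MASK] token, by assumption
    hidden = encoder(embed(corrupted, type_ids))     # (batch, seq, dim)
    logits = head(hidden[mask])                      # predict only at masked positions
    return F.cross_entropy(logits, token_ids[mask])

# Usage with a small Transformer encoder over a toy batch.
embed = UserInputEmbedding()
layer = nn.TransformerEncoderLayer(d_model=128, nhead=4, batch_first=True)
encoder = nn.TransformerEncoder(layer, num_layers=2)
head = nn.Linear(128, 20000)
tokens = torch.randint(1, 20000, (8, 64))
types = torch.randint(0, 4, (8, 64))
loss = masked_behavior_step(embed, encoder, head, tokens, types)
loss.backward()
```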