In this paper, we propose a contrastive user model pre-training method named UserBERT. Two contrastive self-supervision tasks are incorporated in UserBERT to pre-train user models on unlabeled user behavior data and empower user modeling. (arXiv:2109.01274v1)
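The snippet does not spell out the two tasks, but the general shape of contrastive pre-training over user behavior sequences can be sketched. Below is a minimal illustration assuming an InfoNCE-style loss over two augmented views of each user's behavior sequence; the encoder, vocabulary size, pooling, and augmentation are placeholders, not the paper's actual design.

```python
# Illustrative sketch of contrastive user-model pre-training in the spirit
# of UserBERT (arXiv:2109.01274). The InfoNCE objective and all sizes here
# are assumptions for the example, not the paper's exact formulation.
import torch
import torch.nn.functional as F

def info_nce(anchor, positive, temperature=0.1):
    """InfoNCE loss: each anchor should match its own positive within the batch."""
    anchor = F.normalize(anchor, dim=-1)          # (B, D)
    positive = F.normalize(positive, dim=-1)      # (B, D)
    logits = anchor @ positive.t() / temperature  # (B, B) similarity matrix
    targets = torch.arange(anchor.size(0), device=anchor.device)
    return F.cross_entropy(logits, targets)

# Shared Transformer encoder over sequences of behavior-token embeddings.
embed = torch.nn.Embedding(1000, 64)  # assumed behavior-ID vocabulary size
encoder = torch.nn.TransformerEncoder(
    torch.nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True),
    num_layers=2,
)

# Two views of the same users' behavior sequences (e.g., random crops or
# masked variants); here random IDs stand in for real behavior logs.
view_a = torch.randint(0, 1000, (8, 20))   # batch of 8 users, 20 behaviors each
view_b = torch.randint(0, 1000, (8, 20))
z_a = encoder(embed(view_a)).mean(dim=1)   # mean-pool to one user vector
z_b = encoder(embed(view_b)).mean(dim=1)
loss = info_nce(z_a, z_b)
loss.backward()
```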
This paper extends the BERT model to e-commerce user data, pre-training representations in a self-supervised manner. The authors propose methods for tokenizing different types of user behavior sequences, generating input representation vectors, and a novel pretext task to enable the pre-trained model to learn ... [Figure 5: Effect of pre-training on UserBERT; ROC AUC vs. number of training epochs for different models on two user attribute prediction tasks.] (arXiv:2202.07605v1)
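As a rough illustration of the tokenization and input-representation steps described above, the sketch below flattens (action, item) behavior pairs into a single token vocabulary and adds BERT-style positional embeddings. The vocabulary layout, action names, and sizes are assumptions made for the example, not the paper's actual scheme.

```python
# Hypothetical tokenization of heterogeneous e-commerce behaviors, loosely
# following the idea described in arXiv:2202.07605. Field names and the
# flat-vocabulary layout are illustrative assumptions.
import torch

# Each behavior is an (action_type, item_id) pair; both are discretized
# into one flat vocabulary of size len(ACTIONS) * NUM_ITEMS.
ACTIONS = {"view": 0, "cart": 1, "buy": 2}
NUM_ITEMS = 500

def tokenize(behaviors):
    """Map (action, item_id) pairs to flat token IDs: action * NUM_ITEMS + item."""
    return torch.tensor([ACTIONS[a] * NUM_ITEMS + i for a, i in behaviors])

token_emb = torch.nn.Embedding(len(ACTIONS) * NUM_ITEMS, 32)
pos_emb = torch.nn.Embedding(128, 32)  # positional embeddings, BERT-style

seq = tokenize([("view", 42), ("cart", 42), ("buy", 42), ("view", 7)])
positions = torch.arange(seq.size(0))
inputs = token_emb(seq) + pos_emb(positions)  # (seq_len, 32) input vectors
print(inputs.shape)  # torch.Size([4, 32])
```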