Attention over Self-attention: Intention-aware Re-ranking with Dynamic Transformer Encoders for Recommendation

Zhuoyi Lin, Sheng Zang, Rundong Wang, Zhu Sun, J. Senthilnath, Chi Xu, Chee-Keong Kwoh
2022, arXiv pre-print
Re-ranking models refine the item recommendation lists generated by a prior global ranking model and have demonstrated their effectiveness in improving recommendation quality. However, most existing re-ranking solutions learn only from implicit feedback with a shared prediction model, regrettably ignoring inter-item relationships under diverse user intentions. In this paper, we propose a novel Intention-aware Re-ranking Model with Dynamic Transformer Encoder (RAISE), which aims to perform user-specific prediction for each individual user based on her intentions. Specifically, we first propose to mine latent user intentions from text reviews with an intention discovering module (IDM). By differentiating the importance of review information with a co-attention network, the latent user intention can be explicitly modeled for each user-item pair. We then introduce a dynamic transformer encoder (DTE) to capture user-specific inter-item relationships among item candidates by seamlessly accommodating the latent user intentions learned via IDM. By constructing RAISE upon existing recommendation engines, one can thus not only achieve more personalized recommendations but also obtain the corresponding explanations. Empirical studies on four public datasets show the superiority of the proposed RAISE, with relative improvements of up to 13.95%, 9.60%, and 13.03% in Precision@5, MAP@5, and NDCG@5, respectively.
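The abstract only names the components, so the sketch below is a minimal PyTorch illustration of one plausible reading: a co-attention module (standing in for IDM) distills user and item review embeddings into an intention vector, which then conditions a self-attention encoder over the candidate items (standing in for DTE). The module names, the FiLM-style scale-and-shift conditioning, the max-pooled co-attention, and all hyperparameters are assumptions for illustration, not the paper's actual architecture.

```python
# Hypothetical sketch of intention-conditioned re-ranking; component names
# (IntentionDiscovery, DynamicEncoderLayer) and the conditioning scheme are
# assumptions, since the abstract does not specify the architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F


class IntentionDiscovery(nn.Module):
    """Co-attention between user and item review embeddings -> intention vector."""

    def __init__(self, d: int):
        super().__init__()
        self.affinity = nn.Parameter(torch.randn(d, d) * d ** -0.5)

    def forward(self, user_reviews: torch.Tensor, item_reviews: torch.Tensor) -> torch.Tensor:
        # user_reviews: (B, Lu, d), item_reviews: (B, Li, d)
        # Affinity matrix scores every user-review / item-review pair.
        aff = user_reviews @ self.affinity @ item_reviews.transpose(1, 2)  # (B, Lu, Li)
        u_attn = F.softmax(aff.max(dim=2).values, dim=1)                   # weight per user review
        i_attn = F.softmax(aff.max(dim=1).values, dim=1)                   # weight per item review
        u_vec = (u_attn.unsqueeze(-1) * user_reviews).sum(dim=1)           # (B, d)
        i_vec = (i_attn.unsqueeze(-1) * item_reviews).sum(dim=1)           # (B, d)
        return u_vec + i_vec                                               # intention, (B, d)


class DynamicEncoderLayer(nn.Module):
    """Self-attention over item candidates, conditioned on the intention vector."""

    def __init__(self, d: int, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d, heads, batch_first=True)
        self.film = nn.Linear(d, 2 * d)  # FiLM-style scale/shift from the intention
        self.norm = nn.LayerNorm(d)

    def forward(self, items: torch.Tensor, intention: torch.Tensor) -> torch.Tensor:
        # items: (B, N, d) candidate item embeddings; intention: (B, d)
        scale, shift = self.film(intention).chunk(2, dim=-1)
        x = items * (1 + scale.unsqueeze(1)) + shift.unsqueeze(1)          # per-user conditioning
        out, _ = self.attn(x, x, x)                                        # inter-item relationships
        return self.norm(items + out)


if __name__ == "__main__":
    B, Lu, Li, N, d = 2, 5, 7, 10, 32
    idm = IntentionDiscovery(d)
    dte = DynamicEncoderLayer(d)
    intention = idm(torch.randn(B, Lu, d), torch.randn(B, Li, d))
    scores = dte(torch.randn(B, N, d), intention).mean(dim=-1)             # (B, N) re-ranking scores
    print(scores.shape)  # torch.Size([2, 10])
```

Scoring candidates with a mean over the conditioned representations is likewise only a placeholder for whatever prediction head the full model uses; the point of the sketch is that the same encoder weights yield user-specific attention once the inputs are modulated by the mined intention.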
arXiv:2201.05333v2