A Multi-task Learning Approach for Improving Product Title Compression with User Search Log Data
[article]
2018
arXiv
pre-print
This paper proposes a novel multi-task learning approach for improving product title compression with user search log data. ...
It is a challenging and practical research problem to obtain effective compression of lengthy product titles for E-commerce. ...
Ding Liu with Alibaba Group for their valuable discussions and the anonymous reviewers for their helpful comments. ...
arXiv:1801.01725v1
fatcat:upfviymp6rhnxepfna7quuwa2m
A Multi-Task Learning Approach for Improving Product Title Compression with User Search Log Data
2018
Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence and the Thirtieth Innovative Applications of Artificial Intelligence Conference (AAAI-18)
This paper proposes a novel multi-task learning approach for improving product title compression with user search log data. ...
It is a challenging and practical research problem to obtain effective compression of lengthy product titles for E-commerce. ...
Ding Liu with Alibaba Group for their valuable discussions and the anonymous reviewers for their helpful comments. ...
doi:10.1609/aaai.v32i1.11264
fatcat:fxpyisq32jbcljmw54j47272nu
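The two records above describe the Agree-MTL approach only at the abstract level. As a rough illustration of the general multi-task idea (a shared title encoder with a compression head and an auxiliary head supervised by user search logs, trained with a joint loss), here is a minimal PyTorch sketch. The module names, the keep/drop formulation, and the loss weight are assumptions for illustration, not the paper's actual architecture.

```python
# Minimal multi-task sketch: a shared title encoder with two task heads,
# one for title compression (keep/drop per token) and one predicting
# whether a token also appears in user search queries, as a weak auxiliary
# signal from search logs. Names and dimensions are illustrative only.
import torch
import torch.nn as nn

class MultiTaskTitleModel(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.compress_head = nn.Linear(2 * hid_dim, 2)  # task 1: keep/drop
        self.query_head = nn.Linear(2 * hid_dim, 2)     # task 2: in-query or not

    def forward(self, title_ids):
        h, _ = self.encoder(self.embed(title_ids))       # (B, T, 2H)
        return self.compress_head(h), self.query_head(h)

def joint_loss(model, title_ids, keep_labels, query_labels, aux_weight=0.5):
    """Sum the compression loss and the search-log auxiliary loss."""
    ce = nn.CrossEntropyLoss()
    compress_logits, query_logits = model(title_ids)
    loss_c = ce(compress_logits.reshape(-1, 2), keep_labels.reshape(-1))
    loss_q = ce(query_logits.reshape(-1, 2), query_labels.reshape(-1))
    return loss_c + aux_weight * loss_q
```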
Multi-Modal Generative Adversarial Network for Short Product Title Generation in Mobile E-Commerce
2019
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL-HLT 2019)
In this paper, we propose a Multi-Modal Generative Adversarial Network (MM-GAN) for short product title generation in E-Commerce, which innovatively incorporates image information and attribute tags from ...
MM-GAN poses short title generation as a reinforcement learning process, where the generated titles are evaluated by the discriminator in a human-like view. ...
(c) Agreement-based MTL (Agree-MTL), a multi-task learning approach to improve product title compression with user search log data. ...
doi:10.18653/v1/n19-2009
dblp:conf/naacl/ZhangZLWPGY19
fatcat:ecjx3ohbkjfrlfxtipuagk6c7q
Multi-Modal Generative Adversarial Network for Short Product Title Generation in Mobile E-Commerce
[article]
2019
arXiv
pre-print
In this paper, we propose a Multi-Modal Generative Adversarial Network (MM-GAN) for short product title generation in E-Commerce, which innovatively incorporates image information and attribute tags from ...
MM-GAN poses short title generation as a reinforcement learning process, where the generated titles are evaluated by the discriminator in a human-like view. ...
(c) Agreement-based MTL (Agree-MTL), a multi-task learning approach to improve product title compression with user search log data. ...
arXiv:1904.01735v1
fatcat:cncharlchjaozcosktftf2d7aa
Product Title Refinement via Multi-Modal Generative Adversarial Learning
[article]
2018
arXiv
pre-print
In this paper, we propose a Multi-Modal Generative Adversarial Network (MM-GAN) for short product title generation, which innovatively incorporates image information, attribute tags from the product and ...
MM-GAN treats short title generation as a reinforcement learning process, where the generated titles are evaluated by the discriminator in a human-like view. ...
(c) Agreement-based MTL (Agree-MTL) [16], a multi-task learning approach to improve product title compression with user search log data. ...
arXiv:1811.04498v1
fatcat:prwxsoe7g5c3tebvkkx633hdvq
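The three MM-GAN records above all describe posing short-title generation as a reinforcement learning process in which the discriminator scores the generated titles. A minimal policy-gradient sketch of that idea (REINFORCE with the discriminator output used as the reward) is given below; the generator and discriminator interfaces are assumed for illustration and are not the paper's.

```python
# Sketch of the "discriminator as reward" idea: the generator samples a
# short title, the discriminator scores how human-like it is, and that
# score is used as a REINFORCE reward. Both networks' interfaces are
# assumptions for illustration.
import torch

def reinforce_step(generator, discriminator, product_features, optimizer):
    # generator.sample is assumed to return token ids plus per-step log-probs.
    titles, log_probs = generator.sample(product_features)       # (B, T), (B, T)
    with torch.no_grad():
        reward = discriminator(product_features, titles)          # (B,) in [0, 1]
    baseline = reward.mean()                                       # variance reduction
    # REINFORCE: raise log-probs of titles the discriminator scores highly.
    loss = -((reward - baseline).unsqueeze(1) * log_probs).sum(dim=1).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return reward.mean().item()
```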
User Multi-Interest Modeling for Behavioral Cognition
[article]
2022
arXiv
pre-print
With the help of a novel attention module that can learn multiple interests of a user, the second sub-module achieves almost lossless dimensionality reduction. ...
Representation modeling based on user behavior sequences is an important direction in user cognition. In this study, we propose a novel framework called Multi-Interest User Representation Model. ...
For each user, the reviewed product titles constitute a sequence of review behaviors. ...
arXiv:2110.11337v3
fatcat:siuxbxijvbaylmcoyb5nyqada4
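The record above mentions an attention module that learns multiple interests per user but does not spell it out. One common realization is attention pooling with K learned interest queries over the behavior sequence; the sketch below shows that generic form, not the paper's exact module.

```python
# Generic multi-interest attention pooling: K learned query vectors attend
# over a user's behavior-item embeddings, producing K interest vectors.
# A sketch of the general technique, not the paper's exact design.
import torch
import torch.nn as nn

class MultiInterestPooling(nn.Module):
    def __init__(self, dim, num_interests=4):
        super().__init__()
        self.interest_queries = nn.Parameter(torch.randn(num_interests, dim))

    def forward(self, item_embs):                  # (B, T, D) behavior sequence
        scores = torch.einsum("kd,btd->bkt", self.interest_queries, item_embs)
        attn = torch.softmax(scores / item_embs.size(-1) ** 0.5, dim=-1)
        return torch.einsum("bkt,btd->bkd", attn, item_embs)  # (B, K, D)
```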
Interest-oriented Universal User Representation via Contrastive Learning
[article]
2021
arXiv
pre-print
It provides a unified framework that allows for long-term or short-term interest representation learning in a data-driven manner. Moreover, a novel multi-interest extraction module is presented. ...
Universal user representation has received much interest recently, since it frees us from the cumbersome work of training a specific model for each downstream application. ...
For each user, the reviewed product titles make up a review behavior sequence. ...
arXiv:2109.08865v2
fatcat:ej5tisxhfjhslmqw4z5nuztfoq
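The record above summarizes the contrastive framework without its objective. The usual building block for this kind of interest representation learning is an InfoNCE-style loss that pulls two views of the same user's behavior sequence together and pushes different users apart; the sketch below assumes that standard form, not the paper's exact loss.

```python
# InfoNCE-style contrastive loss over user representations: two augmented
# views of the same behavior sequence form a positive pair, other users in
# the batch are negatives. Illustrative sketch, not the paper's objective.
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    """z1, z2: (B, D) user embeddings from two views of the same users."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature           # (B, B) similarity matrix
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)
```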
M6-Rec: Generative Pretrained Language Models are Open-Ended Recommender Systems
[article]
2022
arXiv
pre-print
settings' data and can minimize the carbon footprint by avoiding training a separate model from scratch for every task. ...
The mainstream approach so far is to develop individual algorithms for each domain and each task. ...
We explore two use cases here: (i) generating search queries for recommending to a user, and (ii) generating new product titles based on user behaviors which tell us what types of products may be popular ...
arXiv:2205.08084v2
fatcat:p645he3l7zht3dlrxngo5qcecq
Scaling Law for Recommendation Models: Towards General-purpose User Representations
[article]
2022
arXiv
pre-print
We demonstrate that the scaling law is present in user representation learning areas, where the training error scales as a power-law with the amount of computation. ...
Here we explore the possibility of general-purpose user representation learning by training a universal user encoder at large scales. ...
We would also like to thank the NAVER Smart Machine Learning (NSML) platform team (Sung et al., 2017; Kim et al., 2018) for their critical work on the software and hardware infrastructure on which all ...
arXiv:2111.11294v3
fatcat:ywcd5hvfbfa3djdj5wmapdexym
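The record above states that training error scales as a power law with compute, i.e. roughly L(C) = a * C^(-b). A small sketch of fitting that form in log space follows; the (compute, loss) values are made up for illustration.

```python
# Fitting a power law L(C) = a * C**(-b) to (compute, loss) measurements,
# the functional form implied by "training error scales as a power-law
# with the amount of computation". The data points below are hypothetical.
import numpy as np

compute = np.array([1e15, 1e16, 1e17, 1e18, 1e19])   # hypothetical FLOPs
loss    = np.array([3.1, 2.6, 2.2, 1.9, 1.7])        # hypothetical eval loss

# Fit log L = log a - b * log C by least squares in log space.
slope, log_a = np.polyfit(np.log(compute), np.log(loss), deg=1)
a, b = np.exp(log_a), -slope
print(f"fitted power law: L(C) ~= {a:.2f} * C^(-{b:.3f})")
```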
AutoADR: Automatic Model Design for Ad Relevance
[article]
2020
arXiv
pre-print
Specifically, AutoADR leverages a one-shot neural architecture search algorithm to find a tailored network architecture for Ad Relevance. ...
We add the model designed by AutoADR as a sub-model into the production Ad Relevance model. ...
Before applying this architecture to the production model, we use a large-scale real-world dataset collected from the Microsoft Bing search log to retrain it for further improvement. ...
arXiv:2010.07075v1
fatcat:jhvzasllj5dujggc5to64rir2m
Pre-trained Language Model for Web-scale Retrieval in Baidu Search
[article]
2021
arXiv
pre-print
(e.g., ERNIE) can largely improve the usability and applicability of our search engine. ...
In particular, we developed an ERNIE-based retrieval model, which is equipped with 1) expressive Transformer-based semantic encoders, and 2) a comprehensive multi-stage training paradigm. ...
Specifically, we collect one month of (i.e., tens of billions of) user search logs for post-pretraining. ...
arXiv:2106.03373v4
fatcat:bkaz3q5dlrcr7nuxdrcrmgu3ti
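The record above describes Transformer-based semantic encoders trained on user search logs, but not their retrieval form. Web-scale retrieval models of this kind are typically dual encoders that embed the query and the document separately and rank by similarity; the sketch below shows that generic setup with placeholder encoders, not Baidu's ERNIE-based model.

```python
# Generic dual-encoder retrieval sketch: a query encoder and a document
# encoder map text to vectors, and relevance is their cosine similarity.
# The encoders here are placeholders, not the ERNIE-based model itself.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualEncoder(nn.Module):
    def __init__(self, query_encoder, doc_encoder):
        super().__init__()
        self.query_encoder = query_encoder   # any text encoder mapping ids -> (B, D)
        self.doc_encoder = doc_encoder

    def score(self, query_ids, doc_ids):
        q = F.normalize(self.query_encoder(query_ids), dim=-1)
        d = F.normalize(self.doc_encoder(doc_ids), dim=-1)
        return (q * d).sum(-1)               # cosine similarity per pair

    def in_batch_loss(self, query_ids, doc_ids, temperature=0.05):
        # In-batch negatives: each query's positive doc sits at the same index.
        q = F.normalize(self.query_encoder(query_ids), dim=-1)
        d = F.normalize(self.doc_encoder(doc_ids), dim=-1)
        logits = q @ d.t() / temperature
        targets = torch.arange(q.size(0), device=logits.device)
        return F.cross_entropy(logits, targets)
```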
Probing Product Description Generation via Posterior Distillation
[article]
2021
arXiv
pre-print
In product description generation (PDG), the aspects users care about are critical for the recommendation system, which can not only improve the user experience but also obtain more clicks. ...
Finally, we apply a Transformer-based decoding phase with a copy mechanism to automatically generate the product description. ...
Acknowledgements The authors would like to thank Hengyi Cai from Institute of Computing Technology, Chinese Academy of Sciences, and the anonymous reviewers for their constructive comments and suggestions ...
arXiv:2103.01594v1
fatcat:3x73nvv2fbho3pweeoru3zk54q
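The record above names a Transformer decoder with a copy mechanism. The standard pointer-generator formulation mixes the decoder's vocabulary distribution with attention over source tokens; the sketch below shows only that mixing step, with all input tensors assumed to come from an existing encoder-decoder.

```python
# Pointer-generator style copy step: mix the decoder's vocabulary
# distribution with attention over source tokens, so words from the
# product title or attributes can be copied verbatim. Inputs are assumed
# to come from an existing encoder-decoder; this is only the mixing step.
import torch

def copy_mixture(p_vocab, copy_attn, src_ids, p_gen):
    """
    p_vocab:   (B, V) decoder softmax over the vocabulary
    copy_attn: (B, S) attention over source positions
    src_ids:   (B, S) vocabulary ids of the source tokens
    p_gen:     (B, 1) probability of generating vs. copying
    """
    p_copy = torch.zeros_like(p_vocab)
    p_copy.scatter_add_(1, src_ids, copy_attn)   # project attention onto the vocab
    return p_gen * p_vocab + (1.0 - p_gen) * p_copy
```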
ACE-BERT: Adversarial Cross-modal Enhanced BERT for E-commerce Retrieval
[article]
2021
arXiv
pre-print
product including title and image in a common subspace. ...
These multiple modalities are significant for a retrieval system when presenting attractive products to customers. ...
An example is shown in Fig. 1(a). When a user is looking for "red dress", a product (in the dotted bounding boxes) with a title containing "red dress" is presented to the user. ...
arXiv:2112.07209v1
fatcat:fa3fvsvgojeopginqgdijxt3aq
e-CLIP: Large-Scale Vision-Language Representation Learning in E-commerce
[article]
2022
arXiv
pre-print
As a backbone for online shopping platforms and inspired by the recent success in representation learning research, we propose a contrastive learning framework that aligns language and visual models using ...
Understanding vision and language representations of product content is vital for search and recommendation applications in e-commerce. ...
accuracy via multi-task learning, and usage as a transferable backbone model for a new downstream task [1, 2, 47]. ...
arXiv:2207.00208v1
fatcat:wotusc7h2rd2lhzvel3hwq7yqm
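The record above describes aligning language and visual models with contrastive learning. The standard CLIP-style objective is a symmetric InfoNCE over matched image-title pairs in a batch; the sketch below gives that generic objective, not e-CLIP's full training recipe.

```python
# CLIP-style symmetric contrastive loss: matched (image, title) pairs on
# the diagonal of the similarity matrix are positives, everything else in
# the batch is a negative. A generic sketch, not e-CLIP's exact recipe.
import torch
import torch.nn.functional as F

def clip_style_loss(image_emb, text_emb, temperature=0.07):
    """image_emb, text_emb: (B, D) embeddings of matched product images and titles."""
    image_emb = F.normalize(image_emb, dim=-1)
    text_emb = F.normalize(text_emb, dim=-1)
    logits = image_emb @ text_emb.t() / temperature
    targets = torch.arange(logits.size(0), device=logits.device)
    # Symmetric: image-to-text and text-to-image cross-entropy.
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))
```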
Learning Fast Matching Models from Weak Annotations
[article]
2019
arXiv
pre-print
and weakly annotated search log data. ...
According to our experiments, compared with the baseline that directly learns from relevance labels, training by the proposed framework outperforms it by a large margin, and improves data efficiency substantially ...
A brief description of the product is usually displayed as the title of the LP, called the LP title. ...
arXiv:1901.10710v3
fatcat:dcjarvjsnzfkdky5xesgebludu
Showing results 1 — 15 out of 4,861 results