
VarCLR: Variable Semantic Representation Pre-training via Contrastive Learning

Qibin Chen, Jeremy Lacomis, Edward J. Schwartz, Graham Neubig, Bogdan Vasilescu, Claire Le Goues
2021, arXiv preprint
We propose VarCLR, a new approach for learning semantic representations of variable names that effectively captures variable similarity in this stricter sense. [...] Finally, we contribute a release of all data, code, and pre-trained models, aiming to provide a drop-in replacement for variable representations used in either existing or future program analyses that [...] We present a novel method based on contrastive learning for pre-training variable representations. [...]
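The abstract names contrastive learning as the pre-training method but gives no details in this excerpt. As an illustration only, the core idea of contrastive pre-training is often expressed with an InfoNCE-style loss: embeddings of paired similar variable names (e.g. `cnt` / `count`) are pulled together while other names in the batch serve as negatives. The sketch below is a generic NumPy rendition of that loss, not the paper's actual implementation; the pairing of `cnt` with `count` and all function names here are hypothetical.

```python
import numpy as np

def l2_normalize(x):
    """Normalize each row to unit length so dot products are cosine similarities."""
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def info_nce_loss(anchors, positives, temperature=0.07):
    """Generic InfoNCE contrastive loss (illustrative, not VarCLR's exact code).

    anchors, positives: (N, D) L2-normalized embedding matrices where row i of
    `positives` is the known-similar variable for row i of `anchors`; all other
    rows in the batch act as in-batch negatives.
    """
    # Scaled cosine-similarity logits; the diagonal holds the positive pairs.
    logits = anchors @ positives.T / temperature            # shape (N, N)
    # Row-wise log-softmax with max subtraction for numerical stability.
    logits = logits - logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Cross-entropy against the diagonal (correct-pair) entries.
    return -np.mean(np.diag(log_probs))

# Toy check: identical pairs should score a much lower loss than random pairs.
rng = np.random.default_rng(0)
a = l2_normalize(rng.normal(size=(8, 16)))
loss_aligned = info_nce_loss(a, a)
loss_random = info_nce_loss(a, l2_normalize(rng.normal(size=(8, 16))))
print(loss_aligned, loss_random)
```

In this toy run the aligned-pair loss is far below the random-pair loss, which is the signal a contrastive pre-trainer optimizes.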
arXiv:2112.02650v1