A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Low-Bit Quantization for Attributed Network Representation Learning
2019
Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Attributed network embedding plays an important role in transferring network data into compact vectors for effective network analysis. Existing attributed network embedding models are designed either in continuous Euclidean spaces, which introduce data redundancy, or in binary coding spaces, which incur significant loss of representation accuracy. To this end, we present a new Low-Bit Quantization for Attributed Network Representation Learning model (LQANR for short) that can learn compact node …
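To illustrate the core idea the abstract describes, the sketch below rounds a continuous embedding vector to a small symmetric set of low-bit levels (e.g. {-1, 0, 1} for a 2-bit budget). This is a hypothetical, minimal illustration of low-bit quantization in general, not the LQANR model itself, which learns its quantized representations jointly during training; the function name and scaling scheme are assumptions.

```python
import numpy as np

def quantize_low_bit(embedding, bitwidth=2):
    """Illustrative low-bit quantization of a continuous embedding.

    Maps each entry to the nearest level in a symmetric grid of
    2**bitwidth - 1 integer levels, scaled by the largest magnitude.
    Hypothetical helper; not the paper's learned quantizer.
    """
    levels = 2 ** bitwidth - 1        # e.g. bitwidth=2 -> 3 levels: {-1, 0, 1}
    k = (levels - 1) // 2             # number of positive levels
    scale = float(np.abs(embedding).max()) or 1.0
    quantized = np.round(np.asarray(embedding) / scale * k).astype(int)
    return quantized, scale

# A 4-dimensional continuous node embedding collapses to ternary codes:
q, s = quantize_low_bit([0.9, -0.2, 0.05, -0.7], bitwidth=2)
# q -> [1, 0, 0, -1], with the scale s needed to approximately reconstruct
```

Storing the integer codes plus one scale per vector is what makes such representations far more compact than full-precision floats, while retaining more information than a purely binary code.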
doi:10.24963/ijcai.2019/562
dblp:conf/ijcai/YangP0Z019
fatcat:q4jwh3xq2fbd7fm2h7jnq3kcyi