Learning to Embed Semantic Correspondence for Natural Language Understanding
2018
Proceedings of the 22nd Conference on Computational Natural Language Learning
While learning embedding models has yielded fruitful results in several NLP subfields, most notably Word2Vec, embedding semantic correspondence has been relatively underexplored, especially in the context of natural language understanding (NLU), a task that typically extracts structured semantic knowledge from text. An NLU embedding model can facilitate analyzing and understanding relationships between unstructured texts and their corresponding structured semantic knowledge, essential for both …
doi:10.18653/v1/k18-1013
dblp:conf/conll/JungLK18
fatcat:uektak5jgzf3bj765chodczu3a