
Large-scale learning of word relatedness with constraints

Guy Halawi, Gideon Dror, Evgeniy Gabrilovich, Yehuda Koren
2012 Proceedings of the 18th ACM SIGKDD international conference on Knowledge discovery and data mining - KDD '12  
We propose a large-scale data mining approach to learning word-word relatedness, where known pairs of related words impose constraints on the learning process.  ...  Prior work on computing semantic relatedness of words focused on representing their meaning in isolation, effectively disregarding inter-word affinities.  ...  Importantly, explicit modeling of word relatedness allows us to learn a signal that is complementary to that learned from large-scale world knowledge repositories, which were used in the previous state  ... 
doi:10.1145/2339530.2339751 dblp:conf/kdd/HalawiDGK12 fatcat:k6qgovvuczfahcpfawalnjtqgi

A Lexical Resource-Constrained Topic Model for Word Relatedness

Yongjing Yin, Jiali Zeng, Hongji Wang, Keqing Wu, Bin Luo, Jinsong Su
2019 IEEE Access  
Our model is an extension of probabilistic latent semantic analysis, which automatically learns word-level distributed representations for word relatedness measurement.  ...  The experimental results in different languages demonstrate the effectiveness of our model in topic extraction and word relatedness measurement.  ...  similarity as [31] did, deriving the distributional vectors of words from a large corpus to measure the relatedness of words based on the similarity of their context.  ... 
doi:10.1109/access.2019.2909104 fatcat:vkfuxqyjy5hx3hume4k3pzlxt4

Deep learning of knowledge graph embeddings for semantic parsing of Twitter dialogs

Larry Heck, Hongzhao Huang
2014 2014 IEEE Global Conference on Signal and Information Processing (GlobalSIP)  
A deep neural network approach known as Deep Structured Semantic Modeling (DSSM) is used to scale the approach to learn neural embeddings for all of the concepts (pages) of Wikipedia.  ...  This paper presents a novel method to learn neural knowledge graph embeddings. The embeddings are used to compute semantic relatedness in a coherence-based semantic parser.  ...  Concept linking is completed with no constraints on the domain or topic of conversation. This task is often referred to as large-scale open domain entity linking [19].  ... 
doi:10.1109/globalsip.2014.7032187 dblp:conf/globalsip/HeckH14 fatcat:hrospi7f5ncmngv4p4bzuflhe4

Training and Evaluating Multimodal Word Embeddings with Large-scale Web Annotated Images [article]

Junhua Mao, Jiajing Xu, Yushi Jing, Alan Yuille
2016 arXiv   pre-print
This dataset is more than 200 times larger than MS COCO, the standard large-scale image dataset with sentence descriptions.  ...  More specifically, we introduce a large-scale dataset with 300 million sentences describing over 40 million images crawled and downloaded from publicly available Pins (i.e. an image with sentence descriptions  ...  We appreciate the comments and suggestions from anonymous reviewers of NIPS 2016.  ... 
arXiv:1611.08321v1 fatcat:z3pxvpbxgbenvjocgklfsic4qa

Rapid L2 Word Learning through High Constraint Sentence Context: An Event-Related Potential Study

Baoguo Chen, Tengfei Ma, Lijuan Liang, Huanhuan Liu
2017 Frontiers in Psychology  
For each novel word, there were four high constraint sentences with the critical word at the end of the sentence.  ...  on L2 contextual word learning.  ...  ACKNOWLEDGMENTS This work was supported by funding from Beijing Education Science Planning of 13th Five-Year (CADA17077, The mechanism of second language word learning for Chinese-English bilinguals) for  ... 
doi:10.3389/fpsyg.2017.02285 pmid:29375420 pmcid:PMC5770742 fatcat:vzibivf6zbbapkq6ihkhfs66ji

Effects of Contextual Constraint on the Processing of Polysemous Words in Japanese EFL Reading

2014 Kanto Koshin'etsu Eigo Kyoiku Gakkaishi  
This study aims to examine the effect of contextual constraint on the processing of polysemous words in Japanese EFL reading.  ...  Taken together, these findings suggest that the role of dominant meanings in the processing of unfamiliar subordinate meanings of polysemous words differs according to the strength of the contextual constraint  ...  Overview of This Study As mentioned above, a large number of L2 studies have identified factors affecting the interpretation of polysemous words.  ... 
doi:10.20806/katejournal.28.0_55 fatcat:kp7yymryojf7nawfqxqr5o3s34

Semantic Sort: A Supervised Approach to Personalized Semantic Relatedness [article]

Ran El-Yaniv, David Yanay
2013 arXiv   pre-print
We present the results of an extensive range of experiments from small to large scale, indicating that the proposed method is effective and competitive with the state-of-the-art.  ...  We present an efficient algorithm for learning such semantic models from a training sample of relatedness preferences.  ...  Each of these datasets consists of a list of word pairs, along with their numerical relatedness score.  ... 
arXiv:1311.2252v1 fatcat:irbunw2stfc6lm3s7fdtsja3uu

Large Scale Semi-Supervised Object Detection Using Visual and Semantic Knowledge Transfer

Yuxing Tang, Josiah Wang, Boyang Gao, Emmanuel Dellandrea, Robert Gaizauskas, Liming Chen
2016 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)  
Deep CNN-based object detection systems have achieved remarkable success on several large-scale object detection benchmarks.  ...  However, training such detectors requires a large number of labeled bounding boxes, which are more difficult to obtain than image-level annotations.  ...  [14] propose a Large Scale Detection through Adaptation (LSDA) algorithm that learns the difference between the CNN parameters of the image classifier and object detector of a "fully labeled" category  ... 
doi:10.1109/cvpr.2016.233 dblp:conf/cvpr/TangWGDGC16 fatcat:bbz6v5uyw5d6zbssua5ki6lbwi

A Preliminary Evaluation of the Impact of Syntactic Structure in Semantic Textual Similarity and Semantic Relatedness Tasks

Ngoc Phuoc An Vo, Octavian Popescu
2015 Proceedings of the 2015 Conference of the North American Chapter of the Association for Computational Linguistics: Student Research Workshop  
The closely related tasks of evaluating Semantic Textual Similarity and Semantic Relatedness have received special attention in the NLP community.  ...  Many different approaches have been proposed, implemented and evaluated at different levels, such as lexical similarity, word/string/POS tags overlapping, semantic modeling (LSA, LDA), etc.  ...  for computing the semantic similarity/relatedness scores between given sentence pairs.  ... 
doi:10.3115/v1/n15-2009 dblp:conf/naacl/VoP15 fatcat:dzew5q6hbjdxdgwcou43terrlm

Learning to Discover Subsumptions between Software Engineering Concepts in Wikipedia

Xiang Dong, Kai Chen, Jiangang Zhu, Beijun Shen
2016 Proceedings of the 28th International Conference on Software Engineering and Knowledge Engineering  
Wikipedia contains large-scale concepts and rich semantic information. A number of knowledge base construction projects such as WikiTaxonomy, DBpedia, and YAGO have acquired data from Wikipedia.  ...  And secondly, we design a machine learning based algorithm with some novel features to calculate the semantic relevancy between concepts.  ...  The experimental results demonstrate the large-scale and high accuracy of our dataset.  ... 
doi:10.18293/seke2016-021 dblp:conf/seke/DongCZS16 fatcat:f226amaa35eafndsk4yksktmhy

Expanding Taxonomies with Implicit Edge Semantics

Emaad Manzoor, Rui Li, Dhananjay Shrouty, Jure Leskovec
2020 Proceedings of The Web Conference 2020  
Arborist learns latent representations of the edge semantics along with embeddings of the taxonomy nodes to measure taxonomic relatedness between node pairs.  ...  Curated taxonomies enhance the performance of machine-learning systems via high-quality structured knowledge. However, manually curating a large and rapidly-evolving taxonomy is infeasible.  ...  Arborist learns to measure the taxonomic relatedness s(q, v) of a query-node pair with feature-vectors e q and e v .  ... 
doi:10.1145/3366423.3380271 dblp:conf/www/ManzoorLSL20 fatcat:omax2qru7bgofctkapferot7gi

Learning Word Relatedness over Time [article]

Guy D. Rosin, Eytan Adar, Kira Radinsky
2017 arXiv   pre-print
The model supports the task of identifying, given two words, when they relate to each other.  ...  However, many corpora today reflect significant longitudinal collections ranging from 20 years of the Web to hundreds of years of digitized newspapers and books.  ...  In our work, we focus on learning relatedness of words over time. We evaluate the technique using a large scale analysis showing its prediction accuracy.  ... 
arXiv:1707.08081v2 fatcat:drzwy42ybvcv5iwbnoupmly2ve

Leveraging Deep Neural Networks and Knowledge Graphs for Entity Disambiguation [article]

Hongzhao Huang and Larry Heck and Heng Ji
2015 arXiv   pre-print
The DSRM is directly trained on large-scale KGs and it maps heterogeneous types of knowledge of an entity from KGs to numerical feature vectors in a latent space such that the distance between two semantically-related  ...  Compared with the state-of-the-art relatedness approach proposed by (Milne and Witten, 2008a), the DSRM obtains 19.4% and 24.5% reductions in entity disambiguation errors on two publicly available datasets  ...  This work sheds light on exploring and modeling large-scale KGs with deep learning techniques for entity disambiguation and other NLP tasks.  ... 
arXiv:1504.07678v1 fatcat:ykfpeyk3ujg7vghjeg6krabe2i

Entity Hierarchy Embedding

Zhiting Hu, Poyao Huang, Yuntian Deng, Yingkai Gao, Eric Xing
2015 Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)  
We propose a principled framework of embedding entities that integrates hierarchical information from large-scale knowledge bases.  ...  The novel embedding model associates each category node of the hierarchy with a distance metric.  ...  For instance, word and phrase embeddings are largely induced from plain text.  ... 
doi:10.3115/v1/p15-1125 dblp:conf/acl/HuHDGX15 fatcat:3oicakbx3na2boqeomzjjdgjlu

Neural Sequence Learning Models for Word Sense Disambiguation

Alessandro Raganato, Claudio Delli Bovi, Roberto Navigli
2017 Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing  
doi:10.18653/v1/d17-1120 dblp:conf/emnlp/RaganatoBN17 fatcat:ybd4odlt3zbkbnwlomhcvnbkrm
Showing results 1 — 15 out of 10,941 results