
280 Birds with One Stone: Inducing Multilingual Taxonomies from Wikipedia using Character-level Classification [article]

Amit Gupta and Rémi Lebret and Hamza Harkous and Karl Aberer
2017 arXiv   pre-print
We propose a simple, yet effective, approach towards inducing multilingual taxonomies from Wikipedia.  ...  Given an English taxonomy, our approach leverages the interlanguage links of Wikipedia followed by character-level classifiers to induce high-precision, high-coverage taxonomies in other languages.  ...  We now describe our approach for inducing multilingual taxonomies from the WCN.  ... 
arXiv:1704.07624v2

MENTA

Gerard de Melo, Gerhard Weikum
2010 Proceedings of the 19th ACM international conference on Information and knowledge management - CIKM '10  
This results in MENTA (Multilingual Entity Taxonomy), a resource that describes 5.4 million entities and is presumably the largest multilingual lexical knowledge base currently available.  ...  So far, however, the multilingual nature of Wikipedia has largely been neglected.  ...  By aggregating from multiple editions of Wikipedia, we are able to construct MENTA - Multilingual Entity Taxonomy - a large-scale taxonomic knowledge base that covers a significantly greater range of entities  ... 
doi:10.1145/1871437.1871577 dblp:conf/cikm/MeloW10

Taxonomic data integration from multilingual Wikipedia editions

Gerard de Melo, Gerhard Weikum
2013 Knowledge and Information Systems  
This results in MENTA (Multilingual Entity Taxonomy), a resource that describes 5.4 million entities and is one of the largest multilingual lexical knowledge bases currently available.  ...  This paper investigates how entities from all editions of Wikipedia as well as WordNet can be integrated into a single coherent taxonomic class hierarchy.  ...  Algorithm 4.1 captures the steps taken to induce the taxonomy. Input.  ... 
doi:10.1007/s10115-012-0597-3

Two Is Bigger (and Better) Than One: the Wikipedia Bitaxonomy Project

Tiziano Flati, Daniele Vannella, Tommaso Pasini, Roberto Navigli
2014 Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)  
We present WiBi, an approach to the automatic creation of a bitaxonomy for Wikipedia, that is, an integrated taxonomy of Wikipedia pages and categories.  ...  We leverage the information available in either one of the taxonomies to reinforce the creation of the other taxonomy.  ...  Phase 1: Inducing the Page Taxonomy The goal of the first phase is to induce a taxonomy of Wikipedia pages.  ... 
doi:10.3115/v1/p14-1089 dblp:conf/acl/FlatiVPN14

Unsupervised learning of an extensive and usable taxonomy for DBpedia

Marco Fossati, Dimitris Kontokostas, Jens Lehmann
2015 Proceedings of the 11th International Conference on Semantic Systems - SEMANTICS '15  
We present an unsupervised approach that automatically learns a taxonomy from the Wikipedia category system and extensively assigns types to DBpedia entities, through the combination of several interdisciplinary  ...  The DBpedia project transforms Wikipedia content into RDF and currently plays a crucial role in the Web of Data as a central multilingual interlinking hub.  ...  MENTA [5] is a massive lexical knowledge base, with data coming from 271 languages.  ... 
doi:10.1145/2814864.2814881 dblp:conf/i-semantics/FossatiKL15

Collaboratively built semi-structured content and Artificial Intelligence: The story so far

Eduard Hovy, Roberto Navigli, Simone Paolo Ponzetto
2013 Artificial Intelligence  
These were explored for the mapping task by de Melo and Weikum in the construction of MENTA [114], a multilingual taxonomy which integrates WordNet and Wikipedia (Section 7).  ...  The same authors later presented an extension, named MENTA [114], which consists of a large-scale taxonomy of named entities and their classes also built from WordNet and Wikipedia.  ... 
doi:10.1016/j.artint.2012.10.002

BabelNet: The automatic construction, evaluation and application of a wide-coverage multilingual semantic network

Roberto Navigli, Simone Paolo Ponzetto
2012 Artificial Intelligence  
Key to our approach is the integration of lexicographic and encyclopedic knowledge from WordNet and Wikipedia.  ...  We present an automatic approach to the construction of BabelNet, a very large, wide-coverage multilingual semantic network.  ...  The same authors later present in [72] a methodology for building MENTA, a multilingual taxonomy containing 5.4 million entities, which is also built from WordNet and Wikipedia.  ... 
doi:10.1016/j.artint.2012.07.001

A Combined Approach for Eliciting Relationships for Educational Ontologies Using General-Purpose Knowledge Bases

Angel Conde, Mikel Larranaga, Ana Arruarte, Jon A. Elorriaga
2019 IEEE Access  
LiReWi combines grammar-based, co-occurrence-based and taxonomy-based methods together with several knowledge bases, such as Wikipedia, WordNet, WikiTaxonomy, WibiTaxonomy, and WikiRelations to elicit  ...  This paper presents LiReWi, a system for the elicitation of relationships for educational ontologies from electronic textbooks.  ...  MENTA [43] is a multilingual taxonomy derived from Wikipedia. Unlike previous approaches, it was also built by analyzing Wikipedia for languages other than just English.  ... 
doi:10.1109/access.2019.2910079

Rapid Induction of Multiple Taxonomies for Enhanced Faceted Text Browsing

Lawrence Muchemi, Gregory Grefenstette
2016 International Journal of Artificial Intelligence & Applications  
In this paper we present and compare two methodologies for rapidly inducing multiple subject-specific taxonomies from crawled data.  ...  We also perform a comprehensive corpus-based evaluation of the algorithms based on many datasets drawn from the fields of medicine (diseases) and leisure (hobbies) and show that the induced taxonomies  ...  as in MENTA [10].  ... 
doi:10.5121/ijaia.2016.7401

Taxonomy Induction from Chinese Encyclopedias by Combinatorial Optimization [chapter]

Weiming Lu, Renjie Lou, Hao Dai, Zhenyu Zhang, Shansong Yang, Baogang Wei
2015 Lecture Notes in Computer Science  
The experimental results show that our approach can construct a practicable taxonomy from Chinese encyclopedias.  ...  In this paper, we propose a taxonomy induction approach from a Chinese encyclopedia by using combinatorial optimizations.  ...  MENTA [30] induced multilingual taxonomies from all editions of Wikipedia and WordNet. The closely-related previous works are WiBi [31] and [16].  ... 
doi:10.1007/978-3-319-25207-0_25

Transforming Wikipedia into a large scale multilingual concept network

Vivi Nastase, Michael Strube
2013 Artificial Intelligence  
This paper describes an approach to deriving such a large scale and multilingual resource by exploiting several facets of the on-line encyclopedia Wikipedia.  ...  A knowledge base for real-world language processing applications should consist of a large base of facts and reasoning mechanisms that combine them to induce novel and more complex information.  ...  MENTA [6] is a multilingual extension of the core YAGO knowledge base.  ... 
doi:10.1016/j.artint.2012.06.008

TiFi: Taxonomy Induction for Fictional Domains [Extended version] [article]

Cuong Xuan Chu, Simon Razniewski, Gerhard Weikum
2019 arXiv   pre-print
Taxonomies are important building blocks of structured knowledge bases, and their construction from text sources and Wikipedia has received much attention.  ...  In this paper we focus on the construction of taxonomies for fictional domains, using noisy category systems from fan wikis or text extraction as input.  ...  MENTA [8] learns a model to map Wikipedia categories to WordNet, with the goal of constructing a multilingual taxonomy over both.  ... 
arXiv:1901.10263v1

Graph-based methods for large-scale multilingual knowledge integration [article]

Gerard de Melo, Universität des Saarlandes
2011
Together, these methods can be used to produce a large-scale multilingual knowledge base semantically describing over 5 million entities and over 16 million natural language words and names in more than  ...  Information taken from other sources or indirectly adopted data and concepts are explicitly acknowledged with references to the respective sources.  ...  In this chapter, we aggregate from multiple editions of Wikipedia as well as WordNet to construct MENTA - Multilingual Entity Taxonomy - a large-scale taxonomic knowledge base that covers a significantly  ... 
doi:10.22028/d291-26102

Knowledge extraction from fictional texts [article]

Cuong Xuan Chu, Universität Des Saarlandes
2022
Knowledge extraction from text is a key task in natural language processing, which involves many sub-tasks, such as taxonomy induction, named entity recognition and typing, relation extraction, knowledge  ...  However, current knowledge extraction methods mostly focus on prominent real world entities with Wikipedia and mainstream news articles as sources.  ...  MENTA [de Melo and Weikum, 2010 ] learns a model to map Wikipedia categories to WordNet, with the goal of constructing a multilingual taxonomy over both.  ... 
doi:10.22028/d291-36107

Semantic Annotation and Search: Bridging the Gap between Text, Knowledge and Language

Lei Zhang
2018
Nowadays, more and more people from different countries are connecting to the Internet, in particular the Web, and many users can understand more than one language.  ...  Within the context of globalization, multilingual and cross-lingual access to information has emerged as an issue of major interest.  ...  Its extension MENTA (de Melo & Weikum, 2010) adds a large scale hierarchical taxonomy containing 5.4 million named entities and their classes, which is also built from WordNet and Wikipedia.  ... 
doi:10.5445/ir/1000081159