Conference v2.0: An Uncertain Version of the OAEI Conference Benchmark [chapter]

Michelle Cheatham, Pascal Hitzler
2014 Lecture Notes in Computer Science  
The Ontology Alignment Evaluation Initiative (OAEI) is a set of benchmarks for evaluating the performance of ontology alignment systems. In this paper we re-examine the Conference track of the OAEI, focusing on the degree of agreement between the track's reference alignments and the opinions of experts. We propose a new version of this benchmark that more closely corresponds to expert opinion and confidence in the matches. The performance of top alignment systems is compared on both versions of the benchmark. Additionally, a general method for crowdsourcing the development of further benchmarks of this type using Amazon's Mechanical Turk is introduced and shown to be scalable, cost-effective, and in good agreement with expert opinion.
doi:10.1007/978-3-319-11915-1_3