Report on INEX 2009

T. Beckers, S. Geva, W.-C. Huang, T. Iofciu, J. Kamps, G. Kazai, M. Koolen, S. Kutty, M. Landoni, M. Lehtonen, V. Moriceau, P. Bellot (+17 others)
2010 SIGIR Forum  
INEX investigates focused retrieval from structured documents by providing large test collections of structured documents, uniform evaluation measures, and a forum for organizations to compare their results. This paper reports on the INEX 2009 evaluation campaign, which consisted of a wide range of tracks: Ad Hoc, Book, Efficiency, Entity Ranking, Interactive, QA, Link the Wiki, and XML Mining. INEX is run entirely on volunteer effort by the IR research community: anyone with an idea and some time to spend can have a major impact!

Ad Hoc Track: Investigating the effectiveness of XML-IR and passage retrieval for four ad hoc retrieval tasks (Thorough, Focused, Relevant in Context, Best in Context).
Book Track: Investigating techniques to support users in reading, searching, and navigating the full texts of digitized books.
Efficiency Track: Investigating the trade-off between effectiveness and efficiency of ranked XML retrieval approaches on real data and real queries.
Entity Ranking Track: Investigating entity retrieval rather than text retrieval, with two tasks: 1) Entity Ranking and 2) Entity List Completion.
Interactive Track (iTrack): Investigating the behavior of users when interacting with XML documents, and retrieval approaches that are effective in user-based environments.
Question Answering Track: Investigating how technology for accessing semi-structured data can be used to address interrogative information needs.
Link-the-Wiki Track: Investigating link discovery between Wikipedia documents, both at the file level and at the element level.
XML Mining Track: Investigating structured document mining, especially the classification and clustering of semi-structured documents.

In the rest of this paper, we discuss the aims and results of the INEX 2009 tracks in relatively self-contained sections: the Ad Hoc track (Section 2), the Book track (Section 3), the Efficiency track (Section 4), the Entity Ranking track (Section 5), the Interactive track (Section 6), the QA track (Section 7), the Link the Wiki track (Section 8), and the XML Mining track (Section 9).

Ad Hoc Track

In this section, we briefly discuss the aims of the Ad Hoc track, its tasks and setup, the measures used and the results, and try to formulate clear findings. Further details are in [3].
doi:10.1145/1842890.1842897