ERCIM News No.44 - January 2001
by Carol Peters
More than twenty research groups participated actively in the first campaign of the Cross-Language Evaluation Forum (CLEF 2000). The results were presented at a Workshop in Lisbon, 21-22 September, immediately following the fourth European Conference on Digital Libraries - ECDL 2000.
The Cross-Language Evaluation Forum (CLEF) is an activity of the DELOS Network of Excellence for Digital Libraries. The goal of CLEF is to provide an evaluation infrastructure and benchmarking facilities for the testing and tuning of monolingual and cross-language information retrieval systems operating on European languages.
The first evaluation campaign offered four evaluation tasks designed to test multilingual, bilingual, domain-specific, and monolingual (non-English) IR systems. The main task consisted of querying a multilingual corpus of newspaper documents in four languages (English, French, German and Italian) and submitting the results in a ranked, merged list. Any one of eight European languages could be used to query this collection.
Eight North American and twelve European research groups, mainly from academia with a few from industry, managed to submit results by the 1 July deadline. A total of 90 runs were received, covering all tasks and all topic languages. This was a very good result for the first year of activity, especially considering the severe time constraints on participation.
The results of the activity were presented during a two-day workshop on Cross-Language Information Retrieval and Evaluation, 21-22 September, Lisbon, Portugal. The first day, attended by nearly sixty participants, was open to all those interested in the area of Cross-Language Information Retrieval (CLIR). The morning was dedicated to the presentation of invited papers and discussion sessions on CLIR research issues. The afternoon addressed the topic of CLIR system evaluation: the current situation and future developments. The objective was to promote a discussion on what is needed to improve CLIR system performance and how evaluation campaigns can assist in this. The goal was to identify the actual contribution of evaluation to system development and to determine what could or should be done in the future to stimulate progress.
The second day was restricted to participants in the CLEF 2000 evaluation campaign. The results of the evaluation activity were presented and discussed in detail. Both traditional and innovative strategies had been adopted to address the CLIR task, and various approaches to query expansion and results merging had been tested. Preliminary papers describing the approaches adopted by the different groups in their experiments were published in the Working Notes printed by ERCIM as part of the DELOS Workshop series and distributed at the Workshop.
Details of the two-day Workshop, including slides of the presentations and copies of the Working Notes, can be found on the CLEF Web site. The Proceedings will be published by Springer in their Lecture Notes in Computer Science series. The volume will include a record of the talks given on Day 1 and of the experiments and results of the CLEF evaluation campaign presented on Day 2. All papers will be revised and extended with respect to the preliminary draft versions.
The programme and schedule for CLEF 2001 can be found on the CLEF Web site. The agenda will be similar to that for 2000, but with the addition of more languages to the multilingual text collection (Spanish, Dutch and perhaps Greek).
Links:
http://www.clef-campaign.org/
Please contact:
Carol Peters - IEI-CNR (CLEF Coordinator)
Tel: +39 050 315 2897
E-mail: carol@iei.pi.cnr.it