CAIR

Causality-driven Adhoc Information Retrieval
Forum for Information Retrieval Evaluation (FIRE) - 2020
December 16th-20th (Online Event)

Call for Participation

Why should you participate?

In contrast to a traditional search system, a Causal Search System seeks to retrieve documents that provide information on the likely causes leading to a query event. In this extended search system, in addition to the topically relevant ranked list of documents, the user is also presented with a list of causally relevant documents. Given a query pertaining to an event (e.g. ‘drop of pound’ or ‘housing crisis’), the system retrieves the information required for further analysis in support of automated (or semi-automated, with humans in the loop) decision and policy making. Moreover, information extracted from causally related documents could also serve as the explanations needed to support an automatically generated decision prescribing ways to eradicate a likely cause.


What to do?

Participants will be given a static test collection of documents and a list of queries related to events that are likely to be caused by a number of other past events. The participants are then required to develop ranking models that can effectively retrieve documents containing information on such past events that are likely candidates to lead to the query event. The officially submitted ranked lists of different participating systems will then be evaluated by comparing them against a set of manually judged relevant documents.

Repositories

Corpus

We provide a static test collection of news articles constituting the official English ad hoc IR collection of FIRE. Access to the data will be provided on receipt of the access form described below.

To gain access to the corpus, please email the completed and signed organisational-access form to both fire@isical.ac.in and cair.miners@gmail.com. Organisations may use the individual-access form to manage access rights internally; these individual-access forms need not be sent to us.

Click here to download corpus

Train Topics

We will release a training set comprising 5 topics (with relevance assessments), followed by 20 test topics. Each topic follows the standard TREC format, i.e., it comprises a 'title' (usually a small number of keywords) and a 'narrative' (a paragraph describing the relevance criteria in detail).
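To illustrate, a TREC-format topic can be read with a few lines of Python. The topic text below is a hypothetical example for illustration only, not a topic from the actual training set:

```python
import re

# Illustrative TREC-style topic (content is hypothetical, not from the
# actual CAIR training set).
topic_text = """
<top>
<num>1</num>
<title>drop of pound</title>
<narrative>Documents are relevant if they describe past events that are
likely causes of the fall in the value of the pound.</narrative>
</top>
"""

def parse_topic(text):
    """Extract the 'title' and 'narrative' fields from a TREC-format topic."""
    title = re.search(r"<title>(.*?)</title>", text, re.S).group(1).strip()
    narrative = re.search(r"<narrative>(.*?)</narrative>", text, re.S).group(1).strip()
    return {"title": title, "narrative": narrative}

topic = parse_topic(topic_text)
print(topic["title"])  # drop of pound
```

The 'title' typically serves as the query string, while the 'narrative' guides relevance judgement (and, for this task, what counts as causally relevant).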

Download training topics here

Relevance Judgements

For the 5 training topics, we provide binary relevance judgements in the standard TREC qrels format. These will help you analyze causal relevance (as opposed to topical relevance), which is what this task addresses. They will also enable you to tune prototype systems and explore a number of early approaches, using evaluation against the provided manual assessments to see what works and what does not.
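A TREC qrels line has four whitespace-separated fields: query ID, an iteration field (usually 0), document ID, and a relevance label. A minimal sketch of loading binary judgements follows; the document IDs here are hypothetical placeholders:

```python
from collections import defaultdict

# Illustrative TREC-style qrels lines (document IDs are hypothetical).
qrels_lines = [
    "1 0 en.doc.2010.a.101 1",
    "1 0 en.doc.2010.a.102 0",
    "2 0 en.doc.2011.b.007 1",
]

def load_qrels(lines):
    """Map each query ID to its set of (binary) relevant document IDs."""
    relevant = defaultdict(set)
    for line in lines:
        qid, _iteration, docid, rel = line.split()
        if int(rel) > 0:
            relevant[qid].add(docid)
    return relevant

rels = load_qrels(qrels_lines)
print(sorted(rels["1"]))  # ['en.doc.2010.a.101']
```

Such a mapping is all that is needed to compute precision- and recall-oriented metrics over your prototype rankings.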

Download judgements here

Related Research

If you are looking for suitable reading resources, here is the preprint of our upcoming SIGIR '20 paper (to appear). We believe it may be helpful.

Bulletins

Important Dates

Training Data Release: 19th June, 2020
Test Data Release: 30th July, 2020
Run Submission Deadline: 15th August, 2020
Results Declaration: 15th September, 2020
Working Note Submission: 5th October, 2020
Review Notifications: 20th October, 2020
Final Version of Working Note: 5th November, 2020

Latest Updates

Training data released... find here

Guidelines

What Do You Submit?

Your proposed system must generate a six-column .tsv file following the standard TREC run format. To encourage the investigation of different kinds of features, three runs per participating group are allowed.
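The six columns of a standard TREC run line are: query ID, the literal string Q0, document ID, rank, score, and run tag. A minimal sketch of writing such a file (the document IDs, scores, and run tag below are hypothetical):

```python
def format_run(ranked, run_tag):
    """Render per-query ranked results as 6-column, tab-separated TREC
    run lines:  qid  Q0  docid  rank  score  run_tag."""
    lines = []
    for qid, docs in ranked.items():
        for rank, (docid, score) in enumerate(docs, start=1):
            lines.append("\t".join(
                [qid, "Q0", docid, str(rank), f"{score:.4f}", run_tag]))
    return lines

# Hypothetical retrieval output: per-query (docid, score) pairs, best first.
ranked = {"1": [("en.doc.2010.a.101", 12.37), ("en.doc.2010.a.102", 11.02)]}
for line in format_run(ranked, "myGroup-run1"):
    print(line)
```

Scores should be non-increasing within each query, and the run tag should identify your group and run so that the three allowed runs can be told apart.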

How Do You Get Evaluated?

We will employ standard evaluation metrics, such as nDCG and MAP, which take into account both precision and recall (in the graded and binary cases, respectively) of the submitted runs. We will also rank systems on early precision alone, using nDCG and P@5.
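For reference, P@k and nDCG@k can be sketched in a few lines; this is a generic textbook formulation (with log2 discounting for nDCG), not the official evaluation script, and the ranking and judgements below are hypothetical:

```python
import math

def precision_at_k(ranked_docs, relevant, k=5):
    """Fraction of the top-k retrieved documents that are relevant."""
    return sum(1 for d in ranked_docs[:k] if d in relevant) / k

def ndcg_at_k(ranked_docs, gains, k=5):
    """nDCG@k with log2 discounting; `gains` maps docid -> graded gain."""
    dcg = sum(gains.get(d, 0) / math.log2(i + 2)
              for i, d in enumerate(ranked_docs[:k]))
    ideal = sorted(gains.values(), reverse=True)[:k]
    idcg = sum(g / math.log2(i + 2) for i, g in enumerate(ideal))
    return dcg / idcg if idcg > 0 else 0.0

# Hypothetical ranking and binary judgements.
ranked = ["d1", "d2", "d3", "d4", "d5"]
print(precision_at_k(ranked, {"d1", "d3"}, k=5))  # 0.4
```

In practice you would compute these per topic against the released qrels and average over topics, as trec_eval does.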

Know Where You Are?

To be published... Good luck!

Correspondence

May We Help You?

Debasis Ganguly, IBM Research Lab, Dublin
Charles Jochim, IBM Research Lab, Dublin
Francesca Bonin, IBM Research Lab, Dublin
Suchana Datta, University College Dublin
Dwaipayan Roy, GESIS, Cologne
Derek Greene, University College Dublin

Reach Out Here

Please reach out to the address below with any queries related to the task.

cair.miners@gmail.com

Subscribe below for the latest updates.

cairminers@googlegroups.com