Evaluating diversified search results using per-intent graded relevance

Tetsuya Sakai*, Ruihua Song

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

89 Citations (Scopus)

Abstract

Search queries are often ambiguous and/or underspecified. To accommodate different user needs, search result diversification has received attention in the past few years. Accordingly, several new metrics for evaluating diversification have been proposed, but their properties are little understood. We compare the properties of existing metrics given the premises that (1) queries may have multiple intents; (2) the likelihood of each intent given a query is available; and (3) graded relevance assessments are available for each intent. We compare a wide range of traditional and diversified IR metrics after adding graded relevance assessments to the TREC 2009 Web track diversity task test collection, which originally had binary relevance assessments. Our primary criterion is discriminative power, which represents the reliability of a metric in an experiment. Our results show that diversified IR experiments with a given number of topics can be as reliable as traditional IR experiments with the same number of topics, provided that the right metrics are used. Moreover, we compare the intuitiveness of diversified IR metrics by closely examining the actual ranked lists from TREC. We show that a family of metrics called D#-measures have several advantages over other metrics such as α-nDCG and Intent-Aware metrics.
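The per-intent graded-relevance setup described above can be sketched in code. The following is a minimal illustration of a D#-nDCG-style score (intent recall blended with a global-gain nDCG), not the authors' official implementation: the dictionary-based relevance format, the γ=0.5 default, and the function name are assumptions made here for illustration.

```python
from math import log2

def d_sharp_ndcg(ranked, intent_probs, gains, cutoff=10, gamma=0.5):
    """Sketch of a D#-nDCG-style metric.

    ranked       : list of doc IDs in system ranking order.
    intent_probs : {intent: P(intent|query)}.
    gains        : {doc: {intent: graded gain}} per-intent graded relevance.
    """
    intents = list(intent_probs)

    # Global gain of a document: intent-probability-weighted sum of
    # its per-intent graded gains.
    def gg(doc):
        return sum(intent_probs[i] * gains.get(doc, {}).get(i, 0)
                   for i in intents)

    top = ranked[:cutoff]

    # D-nDCG: discounted cumulative global gain, normalised by the
    # ideal list (all judged docs sorted by global gain).
    dcgg = sum(gg(d) / log2(r + 2) for r, d in enumerate(top))
    ideal = sorted(gains, key=gg, reverse=True)[:cutoff]
    idcgg = sum(gg(d) / log2(r + 2) for r, d in enumerate(ideal))
    d_ndcg = dcgg / idcgg if idcgg > 0 else 0.0

    # I-rec: fraction of intents covered by at least one relevant
    # document in the top l.
    covered = {i for d in top for i in intents
               if gains.get(d, {}).get(i, 0) > 0}
    i_rec = len(covered) / len(intents)

    # D#-nDCG: linear blend of diversity (I-rec) and relevance (D-nDCG).
    return gamma * i_rec + (1 - gamma) * d_ndcg
```

For example, a ranking that covers both intents of a two-intent query with highly relevant documents scores close to 1, while a one-document ranking covering a single intent scores substantially lower.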

Original language: English
Title of host publication: SIGIR'11 - Proceedings of the 34th International ACM SIGIR Conference on Research and Development in Information Retrieval
Publisher: Association for Computing Machinery
Pages: 1043-1052
Number of pages: 10
ISBN (Print): 9781450309349
Publication status: Published - 2011
Externally published: Yes
Event: 34th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2011 - Beijing, China
Duration: 2011 Jul 24 - 2011 Jul 28

Publication series

Name: SIGIR'11 - Proceedings of the 34th International ACM SIGIR Conference on Research and Development in Information Retrieval

Conference

Conference: 34th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2011
Country/Territory: China
City: Beijing
Period: 11/7/24 - 11/7/28

Keywords

  • Ambiguity
  • Diversity
  • Evaluation
  • Graded relevance
  • Test collection

ASJC Scopus subject areas

  • Information Systems
