Alternatives to Bpref

Research output: Conference contribution

92 Citations (Scopus)

Abstract

Recently, a number of TREC tracks have adopted a retrieval effectiveness metric called bpref, which has been designed for evaluation environments with incomplete relevance data. A graded-relevance version of this metric called rpref has also been proposed. However, we show that the application of Q-measure, normalised Discounted Cumulative Gain (nDCG) or Average Precision (AveP) to condensed lists, obtained by filtering out all unjudged documents from the original ranked lists, is actually a better solution to the incompleteness problem than bpref. Furthermore, we show that the use of graded relevance boosts the robustness of IR evaluation to incompleteness and therefore that Q-measure and nDCG based on condensed lists are the best choices. To this end, we use four graded-relevance test collections from NTCIR to compare ten different IR metrics in terms of system ranking stability and pairwise discriminative power.
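
The condensed-list approach described in the abstract is straightforward to implement. The following Python sketch (not taken from the paper; the document IDs, relevance judgements, ranking and helper functions are hypothetical) illustrates the idea: unjudged documents are removed from the ranked list before a standard metric such as AveP is computed.

    # Minimal sketch of the "condensed list" idea: drop all unjudged documents
    # from a ranked list, then apply a standard metric (here, Average Precision).
    # The qrels, the ranking and the function names below are illustrative only.

    def condensed_list(ranking, judged):
        """Keep only documents that have a relevance judgement (relevant or not)."""
        return [doc for doc in ranking if doc in judged]

    def average_precision(ranking, relevant):
        """Standard AveP over a ranked list, given the set of relevant documents."""
        hits, precision_sum = 0, 0.0
        for rank, doc in enumerate(ranking, start=1):
            if doc in relevant:
                hits += 1
                precision_sum += hits / rank
        return precision_sum / len(relevant) if relevant else 0.0

    # Hypothetical judgements: 1 = relevant, 0 = judged non-relevant.
    qrels = {"d1": 1, "d2": 0, "d4": 1, "d7": 0}
    relevant = {doc for doc, grade in qrels.items() if grade > 0}

    ranking = ["d3", "d1", "d5", "d2", "d4"]    # d3 and d5 are unjudged
    condensed = condensed_list(ranking, qrels)  # ["d1", "d2", "d4"]

    print(average_precision(ranking, relevant))    # AveP on the original list
    print(average_precision(condensed, relevant))  # AveP' on the condensed list

The same filtering step can precede Q-measure or nDCG; the paper's comparison of these condensed-list metrics against bpref is what the abstract summarises.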

Original language: English
Title of host publication: Proceedings of the 30th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR'07
Pages: 71-78
Number of pages: 8
DOI
Publication status: Published - 2007 Nov 30
Externally published: Yes
Event: 30th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR'07 - Amsterdam, Netherlands
Duration: 2007 Jul 23 - 2007 Jul 27

Publication series

Name: Proceedings of the 30th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR'07

Conference

Conference: 30th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR'07
Country: Netherlands
City: Amsterdam
Period: 07/7/23 - 07/7/27

ASJC Scopus subject areas

  • Information Systems
  • Software
  • Applied Mathematics
