TY - GEN
T1 - Alternatives to Bpref
AU - Sakai, Tetsuya
PY - 2007
Y1 - 2007
AB - Recently, a number of TREC tracks have adopted a retrieval effectiveness metric called bpref which has been designed for evaluation environments with incomplete relevance data. A graded-relevance version of this metric called rpref has also been proposed. However, we show that the application of Q-measure, normalised Discounted Cumulative Gain (nDCG) or Average Precision (AveP) to condensed lists, obtained by filtering out all unjudged documents from the original ranked lists, is actually a better solution to the incompleteness problem than bpref. Furthermore, we show that the use of graded relevance boosts the robustness of IR evaluation to incompleteness and therefore that Q-measure and nDCG based on condensed lists are the best choices. To this end, we use four graded-relevance test collections from NTCIR to compare ten different IR metrics in terms of system ranking stability and pairwise discriminative power.
KW - Evaluation metrics
KW - Graded relevance
KW - Test collection
UR - http://www.scopus.com/inward/record.url?scp=36448993626&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=36448993626&partnerID=8YFLogxK
U2 - 10.1145/1277741.1277756
DO - 10.1145/1277741.1277756
M3 - Conference contribution
AN - SCOPUS:36448993626
SN - 1595935975
SN - 9781595935977
T3 - Proceedings of the 30th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR'07
SP - 71
EP - 78
BT - Proceedings of the 30th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR'07
T2 - 30th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR'07
Y2 - 23 July 2007 through 27 July 2007
ER -