Crowd vs. Expert: What can relevance judgment rationales teach us about assessor disagreement?
(ACM, 2018, Conference Paper)
© 2018 ACM. While crowdsourcing offers a low-cost, scalable way to collect relevance judgments, lack of transparency with remote crowd work has limited understanding about the quality of collected judgments. In prior work, ...
Your Behavior Signals Your Reliability: Modeling Crowd Behavioral Traces to Ensure Quality Relevance Annotations
(AAAI Press, 2018, Conference Paper)
While peer-agreement and gold checks are well-established methods for ensuring quality in crowdsourced data collection, we explore a relatively new direction for quality control: estimating work quality directly from ...
Mix and match: Collaborative expert-crowd judging for building test collections accurately and affordably
(CEUR-WS, 2018, Conference Paper)
Crowdsourcing offers an affordable and scalable means to collect relevance judgments for information retrieval test collections. However, crowd assessors may show higher variance in judgment quality than trusted assessors. ...