Assessing Quality of Information without Ground Truth
- Prof. Yiling Chen, Harvard University
- Time: 2019-08-05 10:30
- Host: Prof. Xiaotie Deng
- Venue: Room 102, Courtyard No.5, Jingyuan
Abstract
Strictly proper scoring rules (SPSR) are well-studied tools for eliciting private information when the ground truth is available. SPSR have two nice properties: (1) the score of a report quantifies the quality of the reported information, and (2) they incentivize truthful information revelation, in that an agent uniquely maximizes his expected score by reporting truthfully. In this work, we design scoring mechanisms that achieve these two properties in settings where the ground truth is not accessible. We consider two settings. In the first, the principal has access to a random variable that is a noisy or proxy version of the ground truth, with known biases. The second is the standard peer prediction setting, where agents' reports are the only source of information available to the principal. We introduce surrogate scoring rules (SSR) for the first setting and develop a multi-task scoring mechanism, the uniform dominant truth serum (DTS), for the second. DTS is one of the first mechanisms to achieve truthful elicitation in a notion of dominant strategy for the peer prediction setting. We then demonstrate the performance of DTS on 14 human-generated datasets and show that a DTS-based aggregation method performs robustly across the datasets.
This talk is based on joint work with Yang Liu and Juntao Wang.
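For readers unfamiliar with strict properness, here is a minimal Python sketch (not from the talk) of one classical SPSR, the quadratic (Brier) score for a binary event, illustrating the truthfulness property stated in the abstract: the expected score is uniquely maximized when the report equals the agent's true belief. The function names and the example belief of 0.7 are purely illustrative.

```python
import numpy as np

def brier_score(report, outcome):
    """Quadratic (Brier) scoring rule for a binary event.

    report: reported probability that the event occurs, in [0, 1]
    outcome: 1 if the event occurred, 0 otherwise
    Higher is better under this sign convention.
    """
    return 1.0 - (outcome - report) ** 2

def expected_score(report, belief):
    """Expected Brier score when the agent's true belief is `belief`."""
    return belief * brier_score(report, 1) + (1 - belief) * brier_score(report, 0)

# Strict properness: the expected score is uniquely maximized at report == belief.
belief = 0.7
reports = np.linspace(0.0, 1.0, 101)
best_report = reports[np.argmax([expected_score(r, belief) for r in reports])]
print(best_report)  # ~0.70, i.e. the truthful report
```

A short calculation confirms this: with belief 0.7, the expected score is 0.3 + 1.4r - r^2, whose derivative 1.4 - 2r vanishes only at r = 0.7.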
Biography
Yiling Chen is a Gordon McKay Professor of Computer Science at Harvard University. She holds a doctoral degree in Information Sciences and Technology from the Pennsylvania State University, a Master's degree from Tsinghua University, and a Bachelor's degree from Renmin University of China. Prior to working at Harvard, she spent two years at Yahoo! Research in New York City. Her current research focuses on topics at the intersection of computer science and economics. She is a recipient of an NSF CAREER Award and the Penn State Alumni Association Early Career Award, and was selected by IEEE Intelligent Systems as one of "AI's 10 to Watch" in 2011. She has served, or currently serves, as an associate editor for the Journal of Artificial Intelligence Research, ACM Transactions on Economics and Computation, and ACM Transactions on Social Computing. She co-chaired the 2013 Conference on Web and Internet Economics (WINE'13), the 2016 ACM Conference on Economics and Computation (EC'16), and the 2018 AAAI Conference on Human Computation and Crowdsourcing (HCOMP'18).