Yuqing Kong is currently an assistant professor at the Center on Frontiers of Computing Studies (CFCS), Peking University. She obtained her Ph.D. degree from the Computer Science and Engineering Department at the University of Michigan in 2018 and her bachelor's degree in mathematics from the University of Science and Technology of China in 2013.
Her research interests lie at the intersection of theoretical computer science and economics: information elicitation, prediction markets, mechanism design, and the future applications of these areas to crowdsourcing and machine learning. Her papers have been published in several conferences, including WINE, ITCS, EC, SODA, AAAI, NeurIPS, and ICLR.
Intersection of Theoretical Computer Science and Economics
1. Y. Kong, "Dominantly Truthful Multi-task Peer Prediction with a Constant Number of Tasks," in Proceedings of the Thirty-First Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), 2398-2411, Salt Lake City, USA, January 5-8, 2020.
2. Y. Kong, G. Schoenebeck, "Water from Two Rocks: Maximizing the Mutual Information," in Proceedings of the 19th ACM Conference on Economics and Computation (EC), 177-194, Ithaca, NY, USA, June 18-22, 2018.
3. Y. Kong, G. Schoenebeck, "An Information Theoretic Framework For Designing Information Elicitation Mechanisms That Reward Truth-telling," ACM Transactions on Economics and Computation (TEAC), Volume 7, Issue 1, 2:1-2:33, February 2019.
This series of works employs an information-theoretic approach in the design of peer prediction mechanisms and machine learning algorithms. The approach reduces incentive design problems to the design of proper information measures. [3] initially proposed this information-theoretic framework, and [1] and [2] apply the framework to a variety of settings. In particular, [1] proposes a new information measure, Determinant Mutual Information, and employs this new measure within the framework so that the sample complexity of the peer prediction mechanism is reduced from infinite to a small constant.
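As a rough illustration (not taken from the papers' text), Determinant Mutual Information of two signals with the same finite support is the absolute value of the determinant of their joint distribution matrix; it vanishes exactly when the signals are independent. A minimal Python sketch, with the example matrices chosen purely for illustration:

```python
import numpy as np

def dmi(joint):
    """Determinant Mutual Information: DMI(X; Y) = |det(U)|,
    where U[i, j] = Pr[X = i, Y = j] is the square joint
    distribution matrix of the two signals."""
    joint = np.asarray(joint, dtype=float)
    assert joint.shape[0] == joint.shape[1], "DMI needs a square joint matrix"
    return abs(np.linalg.det(joint))

# Perfectly correlated binary signals: diagonal joint matrix.
perfect = np.array([[0.5, 0.0],
                    [0.0, 0.5]])

# Independent signals: the joint matrix is an outer product of the
# marginals, which is rank one, so its determinant is zero.
indep = np.outer([0.5, 0.5], [0.5, 0.5])

print(dmi(perfect))  # 0.25
print(dmi(indep))    # 0.0
```

The determinant form is what makes the constant sample complexity in [1] possible: a product of determinants of empirical joint-count matrices built from disjoint batches of tasks gives an unbiased estimator, so only a small constant number of tasks per agent is needed.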