Events
CFCS Youth Talks

Better Algorithms and Generalization for Large-Scale Data

  • Hongyang Zhang, Stanford University
  • Time: 2019-04-04 11:15
  • Host: Prof. Xiaotie Deng
  • Venue: Room 204, Courtyard No.5, Jingyuan

Abstract

Over the past decade, machine learning models such as deep neural networks have had a substantial impact on a wide range of tasks involving large-scale data. On the other hand, our understanding of when and why such models work is still very limited. Answering these questions often requires a better understanding of the latent structures of the data, as well as of the optimization paradigms used in practice. My research aims to provide theoretical foundations and better algorithms for this emerging domain. This talk will present a few results. First, we study non-convex methods and their generalization performance (or sample efficiency) for common ML tasks. We consider over-parameterized models such as matrix and tensor factorizations. Our results highlight the role of the algorithm in explaining generalization for over-parameterized models, and the benefit of over-parameterization for optimization. Next, we consider the problem of predicting the missing entries of tensor data. We show that understanding generalization can inform the choice of the right model. Lastly, we revisit the shortest-path querying problem on large graphs. We provide new algorithms for this classic problem by formalizing the structures of social network data.
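To give a flavor of the over-parameterized setting mentioned above, the following is a minimal, illustrative sketch (not the speaker's actual construction or results): gradient descent from a small random initialization on an over-parameterized factorization U Vᵀ for matrix completion, where the algorithm itself tends to favor low-rank solutions. The matrix sizes, rank, observation rate, and step size are all hypothetical choices for illustration.

```python
# Illustrative sketch only: gradient descent on an over-parameterized
# factorization U @ V.T for matrix completion. Sizes, rank, observation
# rate, and learning rate are assumptions, not values from the talk.
import numpy as np

rng = np.random.default_rng(0)
n, true_rank, over_rank = 50, 2, 20              # over_rank >> true_rank
M = rng.standard_normal((n, true_rank)) @ rng.standard_normal((true_rank, n))
mask = rng.random((n, n)) < 0.3                  # observe ~30% of entries

# Small random init: with plain gradient descent, the iterates tend to stay
# near low-rank solutions even though the model could fit rank-20 matrices.
U = 0.01 * rng.standard_normal((n, over_rank))
V = 0.01 * rng.standard_normal((n, over_rank))

lr = 0.01
for step in range(2000):
    R = mask * (U @ V.T - M)                     # residual on observed entries
    U, V = U - lr * (R @ V), V - lr * (R.T @ U)  # gradients of 0.5*||R||_F^2

err = np.linalg.norm((~mask) * (U @ V.T - M)) / np.linalg.norm((~mask) * M)
print(f"relative error on unobserved entries: {err:.3f}")
```

In this toy setup, generalization to the unobserved entries comes from the optimization algorithm rather than from any explicit rank constraint, which is the kind of phenomenon the abstract alludes to.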

Biography

Hongyang Zhang is a Ph.D. candidate in Computer Science at Stanford University, co-advised by Ashish Goel and Greg Valiant. His research interests lie in machine learning and algorithms, spanning topics such as neural networks, non-convex optimization, social network analysis, and game theory. He received the Best Paper Award at COLT 2018.