Events
CFCS Youth Forum

Dynamic Neural Networks for Efficient Learning and Inference

  • Xin Wang, UC Berkeley
  • Time: 2019-04-04 10:45
  • Host: Prof. Xiaotie Deng
  • Venue: Room 204, Courtyard No.5, Jingyuan

Abstract

Over the past years, researchers have developed a variety of neural network architectures that have pushed the boundaries of applications ranging from fundamental classification, detection, and segmentation tasks to more challenging ones such as autonomous driving and robotic manipulation. However, this success often relies on a huge amount of computation, and model generalization may suffer when only a limited amount of data is available for training. The main goal of my research is to build efficient learning systems that can learn from a limited amount of data and render predictions at low computational cost. My research has focused on making neural networks dynamic with respect to the changing environment and the visual inputs, which I argue is closer to how human brains work.

In this talk, I will present two of our recent works on dynamic neural network design. The first is SkipNet, a modified residual network that uses a gating network to selectively skip convolutional blocks based on the activations of the previous layer. We evaluate SkipNet on various benchmark datasets and show that it can substantially reduce runtime computational cost without decreasing prediction accuracy. The second is our work on task-aware feature embedding networks (TAFE-Net), which learn how to adapt the image representation to a new task in a meta-learning fashion. We find that TAFE-Net outperforms previous approaches on several zero-shot learning benchmarks and on the more challenging attribute-object composition task.
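To make the gating idea concrete, below is a minimal PyTorch sketch of a residual block whose execution is decided per example by a small gate that reads the incoming activations. It is not the authors' SkipNet implementation; the block structure, the gate design, the 0.5 threshold, and the name GatedResidualBlock are assumptions made purely for illustration.

    # Minimal sketch of a gated residual block (illustrative, not the SkipNet code).
    import torch
    import torch.nn as nn

    class GatedResidualBlock(nn.Module):
        def __init__(self, channels):
            super().__init__()
            # Residual branch: two 3x3 convolutions (simplified, no batch norm).
            self.body = nn.Sequential(
                nn.Conv2d(channels, channels, 3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, 3, padding=1),
            )
            # Gate: global-pool the incoming activations and predict an execute probability.
            self.gate = nn.Sequential(
                nn.AdaptiveAvgPool2d(1),
                nn.Flatten(),
                nn.Linear(channels, 1),
                nn.Sigmoid(),
            )

        def forward(self, x):
            g = self.gate(x)                                # one value in [0, 1] per example
            execute = (g > 0.5).float().view(-1, 1, 1, 1)   # hard skip/execute decision
            # Skipped examples pass through unchanged; executed ones add the residual.
            # Note: this dense sketch still computes self.body(x) for the whole batch;
            # the real savings come from not running the convolutions for skipped inputs.
            return x + execute * self.body(x)

    x = torch.randn(4, 64, 32, 32)
    y = GatedResidualBlock(64)(x)

The hard threshold above is not differentiable, so training such gates requires a relaxation or a reinforcement-learning-style estimator; the sketch omits the training procedure entirely.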

Biography

Xin Wang is a Ph.D. candidate in the Computer Science Department at UC Berkeley, advised by Professors Trevor Darrell and Joseph E. Gonzalez. She is part of the Berkeley AI Research (BAIR) Lab, the Berkeley DeepDrive (BDD) Lab, and the RISE Lab. Her research interests lie at the intersection of computer vision and learning systems, with an emphasis on dynamic neural network designs for efficient learning and inference. She is also interested in interactive data analysis systems and low-latency model serving systems. She is a key contributor to Scalabel, a web-based annotation platform developed at BDD, and a former member of the Clipper project, a low-latency model serving system now maintained by the RISE Lab. Prior to Berkeley, she obtained her bachelor's degree from Shanghai Jiao Tong University under the supervision of Prof. Xiaotie Deng and Prof. Bo Yuan.