15.06.2016: Emti Khan

TITLE: Approximate Bayesian Inference: Bringing Statistics, Optimization, Machine Learning, and AI Together

ABSTRACT:

Machine learning relies heavily on data to design computers that can learn autonomously, but dealing with noisy, unreliable, heterogeneous, high-dimensional, and missing data is a big challenge in itself. Surprisingly, living beings - even young ones - are very good at dealing with such data. This raises the question: how do they do it, and how can we design computers that learn like them?

Bayesian methods are promising for answering such questions, but they are computationally challenging, especially when data are large and models are complex. In this talk, I will start by showing a few example applications where this is the case. I will then discuss my work, which addresses many of the computational challenges associated with Bayesian methods by converting the "Bayesian integration" problem into an optimization problem. I will outline some of my future plans to design linear-time algorithms for Bayesian inference. Overall, I will argue that, by combining ideas from statistics, optimization, machine learning, and artificial intelligence, we might be able to design computers that can learn autonomously, just like us.
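To give a flavor of what "converting integration into optimization" can mean, here is a minimal, hypothetical sketch in the spirit of variational inference (not the speaker's specific method). We fit a Gaussian q(z) = N(mu, s^2) to a toy 1-D posterior, here assumed to be N(2, 0.5^2), by gradient ascent on the evidence lower bound (ELBO); the target distribution, variable names, and step size are all illustrative assumptions.

```python
import numpy as np

# Illustrative toy posterior (an assumption for this sketch): p(z) = N(2, 0.5^2).
target_mean, target_var = 2.0, 0.25

def elbo_grad(mu, s):
    # For a Gaussian target, the ELBO has a closed form:
    #   ELBO(mu, s) = -((mu - m)^2 + s^2) / (2 v) + 0.5 * log(2 * pi * e * s^2)
    # so its gradients are available analytically.
    d_mu = -(mu - target_mean) / target_var
    d_s = -s / target_var + 1.0 / s
    return d_mu, d_s

# Plain gradient ascent: the intractable "integration" step is replaced
# by an optimization over the variational parameters (mu, s).
mu, s, lr = 0.0, 1.0, 0.05
for _ in range(2000):
    g_mu, g_s = elbo_grad(mu, s)
    mu += lr * g_mu
    s += lr * g_s

# The optimum recovers the true posterior: mu -> 2.0, s -> 0.5.
```

In realistic models the ELBO gradient has no closed form and is estimated by Monte Carlo, but the overall recipe - pick a tractable family, maximize a bound - is the same.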