Over the last decade, probabilistic graphical models have become one of the cornerstones of modern machine learning. As science and engineering increasingly turn to them to solve difficult learning and data-analysis problems, it becomes ever more important to provide tools that make advanced statistical inference accessible to a broad community.
In this talk I will discuss my work on probabilistic programming, a recent generalization of graphical models: rather than marrying statistics with graph theory, probabilistic programming marries Bayesian probability with computer science. It lets modelers specify complex generative processes using syntax that resembles a modern programming language, defining distributions naturally through language features such as data structures, recursion, and native libraries.
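To make this concrete, here is a minimal sketch in Python (the talk does not fix a particular language, and the model is purely illustrative): a generative process whose distribution is defined by ordinary recursion and data structures rather than by a graph.

```python
import random

def geometric(p):
    """Sample a geometric random variable by recursion:
    flip a coin; on heads stop, on tails recurse."""
    if random.random() < p:
        return 1
    return 1 + geometric(p)

def model():
    """A generative process built from language features:
    a latent rate drawn uniformly, then a list of counts."""
    p = random.uniform(0.1, 0.9)               # latent parameter
    counts = [geometric(p) for _ in range(5)]  # a data structure of samples
    return p, counts

print(model())
```

Running the program forward draws a sample from the joint distribution it defines; the modeling language and the programming language are one and the same.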
Scalable inference is the key challenge. I will discuss how we are leveraging concepts from programming language theory (such as monads, anonymous functions, and memoization), as well as compiler design (program analysis, source code transformations, nonstandard interpretations, and code factorization) to both define and implement universal inference algorithms for probabilistic programming languages. I will illustrate the results on a variety of tasks, with emphasis on inversion problems from geophysics.
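As an illustration of two of these ideas, the sketch below (again in Python, with hypothetical names; it shows one simple universal inference strategy, rejection sampling, not the specific algorithms of the talk) uses memoization to turn a random function into a persistent "random world," and conditions the whole program by rerunning it until an observation holds.

```python
import random
from functools import lru_cache

@lru_cache(maxsize=None)
def eye_color(person):
    """Memoized random function: each person receives one
    random draw that is reused consistently across calls."""
    return random.choice(["blue", "brown", "green"])

def rejection_sample(model, condition, max_tries=100_000):
    """Universal inference by rejection: rerun the entire
    program from scratch until the condition is satisfied."""
    for _ in range(max_tries):
        eye_color.cache_clear()   # fresh random world each run
        trace = model()
        if condition(trace):
            return trace
    raise RuntimeError("no accepted sample")

def model():
    return {p: eye_color(p) for p in ["alice", "bob"]}

# Condition on an observation and inspect one posterior world.
print(rejection_sample(model, lambda t: t["bob"] == "brown"))
```

Rejection sampling works for any program but scales poorly; the appeal of program analysis and source-code transformations is precisely that they enable more efficient inference without restricting the modeling language.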
Speaker Biography
David Wingate is a research scientist at MIT with a joint appointment in the Laboratory for Information and Decision Systems and the Computational Cognitive Science group. He obtained a B.S. and M.S. in Computer Science from Brigham Young University, and a Ph.D. in Computer Science from the University of Michigan.
His research focuses on the intersection of probabilistic modeling (with an emphasis on Bayesian nonparametrics), machine learning, dynamical systems modeling, reinforcement learning, and probabilistic programming.