Fast Probabilistic Programming for Point Processes using Edward and TensorFlow
In this project, we explore efficient and flexible Bayesian inference for point process models using probabilistic programming. Our original interest was in modeling a publicly available dataset from the Federal Election Commission, which contains information on donations from individuals to candidates for federal office. We wanted to model the spatiotemporal dynamics of donations to answer questions such as: given the past few months of donation data, where should a candidate focus fundraising efforts next?
We're interested in using GP priors to learn distributions over the intensity functions of point process models. The basic model (of which there are a few variations) is as follows: we observe points x_1, ..., x_N drawn from a Poisson process with intensity function λ(x), and we place a GP prior on a latent function f, with λ(x) obtained from f(x) through a positive link such as an exponential or sigmoid transformation.
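As a concrete illustration, here is a minimal NumPy sketch of the discretized version of this model on a 1-D grid. The exponential link and squared-exponential kernel are illustrative choices, not fixed by the project: a GP draw is exponentiated to give the intensity, and bin counts are conditionally independent Poisson draws.

```python
# Minimal sketch of a discretized GP-modulated Poisson process (illustrative
# choices: exponential link, RBF kernel, 1-D grid).
import numpy as np

def rbf_kernel(x, lengthscale=1.0, variance=1.0, jitter=1e-6):
    """Squared-exponential kernel matrix on a 1-D grid of locations x."""
    d = x[:, None] - x[None, :]
    K = variance * np.exp(-0.5 * (d / lengthscale) ** 2)
    return K + jitter * np.eye(len(x))

rng = np.random.default_rng(0)

# Discretize the domain [0, T] into bins of width dx.
T, n_bins = 10.0, 100
x = np.linspace(0.0, T, n_bins)
dx = T / n_bins

# GP prior on the latent function f; intensity lambda(x) = exp(f(x)).
f = rng.multivariate_normal(np.zeros(n_bins), rbf_kernel(x, lengthscale=1.5))
lam = np.exp(f)

# Bin counts are conditionally independent Poisson draws with mean lambda * dx.
counts = rng.poisson(lam * dx)

# Log-likelihood of the binned data given f (ignoring the constant log(n!) terms).
loglik = np.sum(counts * np.log(lam * dx) - lam * dx)
print(loglik)
```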
We're also interested in continuous time/space point processes. Inference in this setting is more complicated, since it requires inferring the intensity at locations where no events were observed. We follow Adams et al. (2009) for our inference procedure on these processes.
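For intuition, the sketch below samples from an inhomogeneous Poisson process by thinning, which is the generative view that the Adams et al. (2009) scheme inverts. The latent function g here is a fixed toy function standing in for a GP draw, and the logistic link and rate bound are illustrative assumptions, not the project's actual implementation.

```python
# Sampling from an inhomogeneous Poisson process on [0, T] by thinning.
# g is a toy stand-in for a GP sample path; sigma is the logistic link.
import numpy as np

def sigma(z):
    return 1.0 / (1.0 + np.exp(-z))

def sample_by_thinning(g, lam_star, T, rng):
    """Draw events on [0, T] with intensity lam_star * sigma(g(s))."""
    # Step 1: homogeneous Poisson process with the upper-bound rate lam_star.
    n_candidates = rng.poisson(lam_star * T)
    candidates = rng.uniform(0.0, T, size=n_candidates)
    # Step 2: keep each candidate with probability sigma(g(s)); the rejected
    # ("thinned") points are the latent events that inference must reason
    # about when the intensity is unobserved.
    keep = rng.uniform(size=n_candidates) < sigma(g(candidates))
    return np.sort(candidates[keep]), np.sort(candidates[~keep])

rng = np.random.default_rng(1)
g = lambda s: 2.0 * np.sin(s)          # stand-in for a GP sample path
events, thinned = sample_by_thinning(g, lam_star=5.0, T=10.0, rng=rng)
print(len(events), "accepted events,", len(thinned), "thinned events")
```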
Since black-box inference for Gaussian processes requires computing and inverting an N x N covariance matrix, which costs O(N^3) time and O(N^2) memory, we exploit Kronecker structure in the kernel matrix (available when the inputs lie on a multidimensional grid) to make inference scale to larger datasets.
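The NumPy sketch below illustrates the core trick: a matrix-vector product with K1 ⊗ K2 can be computed from the small per-dimension factors alone, without ever forming the full N x N matrix. The kernels and grid sizes are illustrative.

```python
# Kronecker matrix-vector product: (K1 ⊗ K2) v computed from the small
# factors, turning an O(N^2) multiply (N = n1 * n2) into O(N (n1 + n2)) work.
import numpy as np

rng = np.random.default_rng(2)

def rbf(x, ls=1.0):
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / ls) ** 2) + 1e-6 * np.eye(len(x))

# Per-dimension grids and kernel matrices (a 2-D grid with n1 * n2 points).
n1, n2 = 30, 40
K1 = rbf(np.linspace(0, 1, n1))
K2 = rbf(np.linspace(0, 1, n2), ls=0.3)

v = rng.standard_normal(n1 * n2)

# Fast path: (K1 ⊗ K2) v = vec(K1 V K2^T) with V the n1 x n2 reshape of v
# (row-major vec convention, matching NumPy's default reshape).
fast = (K1 @ v.reshape(n1, n2) @ K2.T).reshape(-1)

# Naive path for comparison: build the full N x N matrix explicitly.
naive = np.kron(K1, K2) @ v

print(np.allclose(fast, naive))  # True
```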
We also implement efficient modifications of the algorithm from Adams et al. (2009). The main files in the repository are:
- kronecker.py: primary implementation of the Kronecker methods
- data_utils.py, grid_utils.py, likelihoods.py: helpers for the Kronecker methods
- thinnedEvents_eager.py: Poisson process inference using thinned events
- Final Presentation: our final presentation
We plan to integrate the Kronecker methods into inference of continuous time/space point process intensities. We are also working on kernel learning via marginal likelihood optimization over the kernel hyperparameters, as well as on inducing point methods (see the KISS-GP paper by Wilson and Nickisch (2015) for reference).
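As a sketch of the hyperparameter-learning step, the snippet below maximizes the exact GP marginal likelihood under the simplifying assumption of Gaussian observations; in the Poisson-intensity setting the same idea would be applied to an approximate marginal likelihood. The data, kernel, and optimizer settings are illustrative.

```python
# Kernel hyperparameter learning by minimizing the negative log marginal
# likelihood of a GP with Gaussian observations (illustrative toy data).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
x = np.linspace(0, 5, 60)
y = np.sin(2 * x) + 0.2 * rng.standard_normal(len(x))  # toy observations

def neg_log_marginal_likelihood(log_params):
    ls, sig_f, sig_n = np.exp(log_params)          # positivity via log transform
    d = x[:, None] - x[None, :]
    K = sig_f**2 * np.exp(-0.5 * (d / ls)**2) + sig_n**2 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))   # K^{-1} y
    # -log p(y) = 0.5 y^T K^{-1} y + 0.5 log|K| + const
    return 0.5 * y @ alpha + np.sum(np.log(np.diag(L)))

result = minimize(neg_log_marginal_likelihood, x0=np.log([1.0, 1.0, 0.5]),
                  method="L-BFGS-B")
print("learned (lengthscale, signal sd, noise sd):", np.exp(result.x))
```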