I'm working with a large dataset with a relatively large number of parameters (last-layer approximation for a neural network). Out-of-the-box VI is simply a non-starter here.
To perform parameter updates in mini-batches, is scaling the contribution of the minibatch to the log-likelihood the primary change?
Thanks for your work!
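For reference, scaling the minibatch log-likelihood by N/B does give an unbiased estimate of the full-data log-likelihood, which is the standard ingredient for subsampled ELBO estimates. A minimal NumPy sketch on a toy Gaussian model (everything here is illustrative, not AdvancedVI API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: N observations from N(2, 1); the model is x_i ~ N(theta, 1).
N = 10_000
data = rng.normal(loc=2.0, scale=1.0, size=N)

def log_lik(theta, xs):
    """Gaussian log-likelihood of the observations xs under N(theta, 1)."""
    return -0.5 * np.sum((xs - theta) ** 2) - 0.5 * len(xs) * np.log(2 * np.pi)

def scaled_batch_log_lik(theta, batch, N):
    """Unbiased minibatch estimate of the full-data log-likelihood:
    scale the batch contribution by N / |batch|."""
    return (N / len(batch)) * log_lik(theta, batch)

theta = 2.0
full = log_lik(theta, data)

# Averaging many scaled minibatch estimates recovers the full-data value.
B = 100
estimates = [
    scaled_batch_log_lik(theta, rng.choice(data, size=B, replace=False), N)
    for _ in range(2000)
]
print(np.mean(estimates), full)  # the two values should be close
```

Plugging this scaled estimate into the ELBO in place of the full-data log-likelihood is what makes the ELBO gradient stochastic in the data, on top of any Monte Carlo sampling from the variational distribution.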
Hi, this is a little bit late for feedback, but the current codebase lacks the proper knobs to do doubly stochastic VI. You'll pretty much have to implement things from scratch. I have a personal codebase that does it, so please let me know if you need any assistance.
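For anyone landing here later: "doubly stochastic" refers to two noise sources in the ELBO gradient, subsampling the data and sampling from q via the reparameterization trick. A from-scratch sketch on a conjugate toy model, with hand-derived gradients in plain NumPy (this is an illustration of the idea, not AdvancedVI or Pyro code):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: x_i ~ N(theta, 1) with prior theta ~ N(0, 1).
N = 50_000
data = rng.normal(loc=1.5, scale=1.0, size=N)

# Mean-field Gaussian q(theta) = N(mu, sigma^2), with sigma = exp(rho).
mu, rho = 0.0, 0.0
lr, B = 1e-5, 256

for _ in range(3000):
    batch = data[rng.integers(0, N, size=B)]  # noise source 1: minibatch
    eps = rng.normal()                        # noise source 2: sample from q
    sigma = np.exp(rho)
    theta = mu + sigma * eps                  # reparameterization trick

    # d(ELBO)/d(theta) for the stochastic estimate:
    #   (N/B) * sum(batch - theta)  from the rescaled minibatch log-likelihood,
    #   - theta                     from the N(0, 1) log-prior.
    dtheta = (N / B) * np.sum(batch - theta) - theta

    # Chain rule through theta = mu + exp(rho) * eps; the Gaussian entropy
    # term of the ELBO contributes +1 to the rho gradient.
    grad_mu = dtheta
    grad_rho = dtheta * sigma * eps + 1.0

    mu += lr * grad_mu      # stochastic gradient ascent on the ELBO
    rho += lr * grad_rho

print(mu, np.exp(rho))  # mu should approach the posterior mean (about 1.5)
```

Each iteration touches only B of the N points, which is what makes the approach viable at the dataset sizes described above; in practice one would use autodiff and an adaptive optimizer rather than hand gradients and a fixed step size.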
Yes, I started looking into the codebase, and when trying to make adjustments I realized that the way things were implemented made a from-scratch implementation necessary. I decided to use Pyro since it supports this. I'm not sure how much active development this library still gets, but this would be a tremendously useful feature for practical applications.
@spragud2 Hi, we're currently working on rewriting AdvancedVI entirely. Minibatching support is expected to come very soon after that, hopefully by the end of this year or early next year.