Hybrid Elimination Improvements #1575
Conversation
I changed the way we prune the discrete probabilities by pruning the joint distribution rather than the conditionals. This gives a 3x speedup. Maybe we should be pruning before discrete-only elimination.

Before: Number of timesteps: K = 16
After: Number of timesteps: K = 16
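To illustrate the idea of pruning the joint rather than each conditional, here is a minimal sketch (not GTSAM's actual API; the function name `pruneJoint` and the flattened-vector representation are hypothetical): keep only the N largest probabilities of the flattened joint distribution and zero out the rest, so the zeros then propagate to every conditional derived from it.

```cpp
#include <algorithm>
#include <cassert>
#include <functional>
#include <vector>

// Hypothetical sketch (not GTSAM's actual API): prune a flattened joint
// discrete distribution by keeping only the maxLeaves largest
// probabilities and zeroing the rest. Pruning the joint once avoids
// pruning each conditional separately.
std::vector<double> pruneJoint(std::vector<double> probs, size_t maxLeaves) {
  if (probs.size() <= maxLeaves) return probs;
  // Find the maxLeaves-th largest probability as the keep threshold.
  std::vector<double> sorted = probs;
  std::nth_element(sorted.begin(), sorted.begin() + maxLeaves - 1,
                   sorted.end(), std::greater<double>());
  const double threshold = sorted[maxLeaves - 1];
  size_t kept = 0;
  for (double& p : probs) {
    // Zero leaves below threshold, and break ties once the budget is used.
    if (p < threshold || kept >= maxLeaves)
      p = 0.0;
    else
      ++kept;
  }
  return probs;
}
```

Under this sketch, pruning `{0.4, 0.3, 0.2, 0.1}` with a budget of 2 leaves yields `{0.4, 0.3, 0.0, 0.0}`; the surviving leaves would then be renormalized downstream.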
Does this PR make both these changes? I'd prefer to review a PR that just does the tablefactor and shows the speedup...

This PR only updates the discrete elimination to prune the joint distribution. Updated the PR description to reflect the changes happening here.

I'd still like to request a split into TableFactor-related changes and other, and a PR straight to develop. Otherwise the base branch will become an un-reviewable kitchen-sink PR. PS: CI seems to fail, so splitting might help there as well.

In that case I'm going to have to do some cherry-picking and force-pushing.
Force-pushed from 7a3aa89 to 2b72f75.
Force-pushed from f3c14a0 to e47531e.
It won't be a kitchen-sink PR if we merge in the parent PRs first.
Force-pushed from e47531e to 66f060c.
Force-pushed from 0fc5ce9 to 2f3fcff.
Some comments. This PR changed too many things at once to be sure of anything, though.
@@ -299,6 +299,42 @@ namespace gtsam {
  /// Return the number of leaves in the tree.
  size_t nrLeaves() const;

  /**
   * @brief This is a convenience function which returns the total number of
Spelling. And why are we adding it? And why is the implementation recursive?
I would just as well delete it unless it has a purpose.
I'm not sure which word is misspelled. The purpose is to help with testing: it's a convenience method for verifying correctness.
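For context on the kind of recursive leaf count being discussed, here is a minimal sketch over a simplified binary node (this is not GTSAM's actual `DecisionTree`; the `Node` struct here is hypothetical): a leaf counts as 1, and an internal choice node sums the counts of its children.

```cpp
#include <cassert>
#include <cstddef>
#include <memory>

// Hypothetical minimal binary decision-tree node (not GTSAM's actual
// DecisionTree), used only to illustrate a recursive leaf count.
struct Node {
  std::unique_ptr<Node> left, right;  // both null for a leaf
  bool isLeaf() const { return !left && !right; }
};

// Recursively count leaves: a leaf contributes 1, an internal node
// contributes the sum over its children.
size_t nrLeaves(const Node& node) {
  if (node.isLeaf()) return 1;
  size_t count = 0;
  if (node.left) count += nrLeaves(*node.left);
  if (node.right) count += nrLeaves(*node.right);
  return count;
}
```

A recursive formulation is the natural fit for a tree, though an iterative traversal with an explicit stack would avoid deep call chains on unbalanced trees.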
@@ -231,7 +231,7 @@ TEST(HybridBayesNet, Pruning) {
  auto prunedTree = prunedBayesNet.evaluate(delta.continuous());

  // Regression test on pruned logProbability tree
- std::vector<double> pruned_leaves = {0.0, 20.346113, 0.0, 19.738098};
+ std::vector<double> pruned_leaves = {0.0, 32.713418, 0.0, 31.735823};
What's happening here? Why are the regression values changing?
The decision tree is normalizing the values based on the introduced zeros from pruning. Since I changed the way we're pruning (on the joint rather than the conditionals), the normalizing factor has changed.
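The renormalization effect described above can be sketched as follows (a minimal illustration, not GTSAM's actual code; the `renormalize` helper is hypothetical): once pruning zeroes out some leaves, the surviving probabilities are rescaled to sum to one, so the normalization constant, and therefore every stored value, depends on which leaves were pruned.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Hypothetical sketch: rescale the surviving (non-zero) leaf
// probabilities so they sum to one. Pruning a different set of leaves
// changes the sum of survivors, and hence the normalization constant.
std::vector<double> renormalize(std::vector<double> probs) {
  double total = 0.0;
  for (double p : probs) total += p;
  if (total > 0.0) {
    for (double& p : probs) p /= total;
  }
  return probs;
}
```

For example, after pruning leaves 2 and 4 of `{0.4, 0.1, 0.4, 0.1}` to zero, the survivors `{0.4, 0.0, 0.4, 0.0}` renormalize to `{0.5, 0.0, 0.5, 0.0}`, a different constant than if other leaves had been pruned.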
double density = exp(logProbability);
EXPECT_DOUBLES_EQUAL(density, actualTree(discrete_values), 1e-9);
EXPECT_DOUBLES_EQUAL(density,
This was not a regression before, but suddenly there is an arbitrary multiplicative factor here?
That is the multiplicative factor for the different normalization constant.
@@ -63,8 +63,8 @@ TEST(MixtureFactor, Printing) {
  R"(Hybrid [x1 x2; 1]
  MixtureFactor
  Choice(1)
- 0 Leaf Nonlinear factor on 2 keys
- 1 Leaf Nonlinear factor on 2 keys
+ 0 Leaf [1]Nonlinear factor on 2 keys
The spacing is weird in this case.
Done
OK, merge this at will :-)
Apply withUnaryAssignment for DecisionTreeFactor.
DiscreteFactor where applicable.
Switching.h to remove duplication.