
Fix issue #249 #250

Merged
merged 9 commits into master from dev-issue-249
Jan 27, 2023

Conversation

bvdmitri
Member

This PR fixes #249. The underlying issue is related to ForwardDiff.jl; see JuliaDiff/ForwardDiff.jl#622 and JuliaDiff/ForwardDiff.jl#481.
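[Editor's note, not part of the PR: the linked ForwardDiff problem shows up when the 3-argument dot is differentiated twice. A minimal sketch, with hypothetical example values: for the quadratic form f(x) = xᵀAx the Hessian is A + Aᵀ, which an explicit matrix-vector product recovers correctly, while on affected ForwardDiff versions dot(x, A, x) could fail to match it.]

```julia
using LinearAlgebra, ForwardDiff

# Sketch of the symptom behind #249 (values are hypothetical, not
# taken from the issue). For f(x) = xᵀ A x the Hessian is A + Aᵀ.
A = [2.0 1.0; 0.0 3.0]

f_manual(x) = x' * (A * x)   # explicit product, avoids the 3-argument dot
f_dot(x)    = dot(x, A, x)   # LinearAlgebra's 3-argument dot

H_manual = ForwardDiff.hessian(f_manual, [1.0, 2.0])
# H_manual ≈ A + A'. On affected ForwardDiff versions,
# ForwardDiff.hessian(f_dot, [1.0, 2.0]) could disagree with this;
# see JuliaDiff/ForwardDiff.jl#481.
```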

@bartvanerp
Member

I think the PR looks good and gets rid of the ForwardDiff bug (at least for the 3-argument dot product). However, I am a bit hesitant about renaming all dot operations to xT_A_y for the sake of readability/simplicity. Would it be possible to instead use Julia's multiple dispatch so that this new dot implementation is only used when we are processing dual numbers, e.g.:

# Extends the 3-argument dot only for dual-number inputs; assumes
# `import LinearAlgebra: dot` and `using ForwardDiff` are in scope.
function dot(x::AbstractVector, A::AbstractMatrix, y::AbstractVector{<:ForwardDiff.Dual})
    (axes(x)..., axes(y)...) == axes(A) || throw(DimensionMismatch())
    T = typeof(dot(first(x), first(A), first(y)))
    s = zero(T)
    i₁ = first(eachindex(x))
    x₁ = first(x)
    @inbounds for j in eachindex(y)
        yj = y[j]
        # Accumulate the j-th entry of Aᵀx, then fold it into the sum.
        temp = zero(adjoint(A[i₁, j]) * x₁)
        @simd for i in eachindex(x)
            temp += adjoint(A[i, j]) * x[i]
        end
        s += dot(temp, yj)
    end
    return s
end
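[Editor's note, not part of the PR: as a hedged illustration of the dispatch idea, with hypothetical helper names not taken from the PR, Julia always selects the most specific applicable method, so a method whose element type is constrained to ForwardDiff.Dual intercepts dual-number inputs while every other input keeps taking the generic path.]

```julia
using ForwardDiff

# Hypothetical helpers for illustration only; the PR constrains the
# real `dot` method the same way. The Dual-constrained signature is
# more specific, so Julia picks it whenever the eltype is a Dual.
which_path(y::AbstractVector) = :generic
which_path(y::AbstractVector{<:ForwardDiff.Dual}) = :dual_safe

d = ForwardDiff.Dual(1.0, 1.0)
which_path([1.0, 2.0])   # :generic
which_path([d, d])       # :dual_safe
```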

@bvdmitri
Member Author

@bartvanerp Agreed, I've changed the PR to use multiple dispatch instead. Let's see if all tests are passing on CI.

@codecov-commenter

codecov-commenter commented Jan 27, 2023

Codecov Report

Base: 61.61% // Head: 61.69% // Increases project coverage by +0.07% 🎉

Coverage data is based on head (f5f42b3) compared to base (f9d83ac).
Patch coverage: 94.73% of modified lines in pull request are covered.


Additional details and impacted files
@@            Coverage Diff             @@
##           master     #250      +/-   ##
==========================================
+ Coverage   61.61%   61.69%   +0.07%     
==========================================
  Files         194      195       +1     
  Lines        7183     7200      +17     
==========================================
+ Hits         4426     4442      +16     
- Misses       2757     2758       +1     
Impacted Files Coverage Δ
src/ReactiveMP.jl 75.86% <ø> (ø)
src/nodes/autoregressive.jl 20.00% <ø> (ø)
src/fixes.jl 93.33% <93.33%> (ø)
src/distributions/mv_normal_mean_precision.jl 91.93% <100.00%> (ø)
...distributions/mv_normal_weighted_mean_precision.jl 98.00% <100.00%> (ø)
src/distributions/normal.jl 90.00% <100.00%> (+0.06%) ⬆️



@Nimrais
Member

Thanks @bvdmitri! Well Done!

@bvdmitri bvdmitri merged commit 351dea1 into master Jan 27, 2023
@bvdmitri bvdmitri deleted the dev-issue-249 branch January 27, 2023 15:44
Development

Successfully merging this pull request may close these issues.

The MvNormalMeanCovariance logpdf hessian computed with ForwardDiff.jl is incorrect
4 participants