Matrix-valued interpolations #958

Open · wants to merge 7 commits into master
Conversation


@termi-official termi-official commented May 28, 2024

Foundation for #398 . Also, we basically emulate this in FerriteViz.jl when visualizing stress and in the AMR error estimation when computing the stress error. However, I would like to keep this API internal for now.

TODO

  • Tests

@termi-official termi-official marked this pull request as ready for review May 28, 2024 19:52

codecov bot commented May 28, 2024

Codecov Report

Attention: Patch coverage is 42.85714%, with 36 lines of your changes missing coverage. Please review.

Project coverage is 93.17%. Comparing base (5e894e4) to head (6312f56).
Report is 2 commits behind head on master.

Files                             Patch %    Lines
src/interpolations.jl             42.85%     32 Missing ⚠️
src/FEValues/FunctionValues.jl    42.85%     4 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #958      +/-   ##
==========================================
- Coverage   93.71%   93.17%   -0.54%     
==========================================
  Files          36       36              
  Lines        5313     5366      +53     
==========================================
+ Hits         4979     5000      +21     
- Misses        334      366      +32     


@KnutAM
Member

KnutAM commented May 29, 2024

Very cool to be able to work directly with tensorial fields!
However, would it make sense to talk about tensors rather than just matrices? We could do e.g.

abstract type TensorInterpolation{TB, refshape, iporder} <: Interpolation{refshape, iporder, Nothing} end

struct TensorizedInterpolation{TB, refshape, iporder, SI <: ScalarInterpolation{refshape, iporder}} <: TensorInterpolation{TB, refshape, iporder}
    ip::SI
    function TensorizedInterpolation{TB}(ip::ScalarInterpolation{refshape, iporder}) where {TB<:AbstractTensor{<:Any, <:Any, <:Any}, refshape, iporder}
        return new{TB, refshape, iporder, typeof(ip)}(ip)    
    end
end

where TB is always Tensors.get_base(TT::Type{<:AbstractTensor}).

This would allow

julia> ip = Lagrange{RefTriangle, 1}();

julia> TensorizedInterpolation{Tensor{2,3}}(ip)
TensorizedInterpolation{Tensor{2, 3}, RefTriangle, 1, Lagrange{RefTriangle, 1, Nothing}}(Lagrange{RefTriangle, 1}())

julia> TensorizedInterpolation{SymmetricTensor{2,2}}(ip)
TensorizedInterpolation{SymmetricTensor{2, 2}, RefTriangle, 1, Lagrange{RefTriangle, 1, Nothing}}(Lagrange{RefTriangle, 1}())

This allows re-using e.g. Tensors.n_components, and even lets us initialize the correct shape-value type via e.g. TB{Float64}.

This will also play nicely with mixed tensors, i.e.

julia> struct MixedTensor{order, dims, T, M} <: AbstractTensor{order, dims, T}
           data::NTuple{M, T}
       end

julia> TensorizedInterpolation{MixedTensor{2,(2,3)}}(ip)
TensorizedInterpolation{MixedTensor{2, (2, 3)}, RefTriangle, 1, Lagrange{RefTriangle, 1, Nothing}}(Lagrange{RefTriangle, 1}())
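
To make concrete what reusing the component count buys, here is a tiny self-contained sketch (all names hypothetical, not the Tensors.jl/Ferrite.jl API): the number of basis functions of a tensorized interpolation would simply be the component count of TB times the number of scalar base functions.

```julia
# Hypothetical stand-ins, not the Tensors.jl/Ferrite.jl API: component counts
# for a full and a symmetric second-order tensor, and the resulting number of
# tensorized basis functions.
n_components_full(order, dim) = dim^order        # e.g. a full Tensor{2, 3} has 3^2 = 9 components
n_components_sym2(dim) = dim * (dim + 1) ÷ 2     # e.g. a SymmetricTensor{2, 2} has 3 components
n_tensorized_basefuns(ncomp, nscalar) = ncomp * nscalar

n_tensorized_basefuns(n_components_full(2, 3), 3)  # 9 components × 3 scalar shape functions = 27
```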

@termi-official
Member Author

Very cool to be able to work directly with tensorial fields! However, would it make sense to rather talk about tensors instead of just a matrix […]
Indeed. This is one of the reasons why I would like to keep the API internal for now. I first need to wrap my head around a generic indexing logic, and I need to think about how to properly generalize operations like curl and divergence so they are useful by default and customizable if the user works with a slightly different definition.
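
To make the convention ambiguity concrete: for a matrix-valued field A(x) there are (at least) two common divergence definitions, contracting the gradient over rows or over columns. A toy sketch with plain arrays (the layout gradA[i, j, k] standing for ∂A[i, j]/∂x[k] is an assumption for illustration only):

```julia
# Two common divergence conventions for a matrix field A(x), given its gradient.
# Layout assumption (illustrative only): gradA[i, j, k] = ∂A[i, j]/∂x[k].
div_rows(gradA)    = [sum(gradA[i, j, j] for j in axes(gradA, 2)) for i in axes(gradA, 1)]  # (div A)_i = ∂A_ij/∂x_j
div_columns(gradA) = [sum(gradA[i, j, i] for i in axes(gradA, 1)) for j in axes(gradA, 2)]  # (div A)_j = ∂A_ij/∂x_i
```

Depending on which convention a user's formulation assumes, the two can differ, which is why a customization point seems needed.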

Member

@KnutAM KnutAM left a comment


Hm, ok. I'm not sure I fully understand the purpose. If this is only internal because we expect it to change, then it seems like it could be tested outside as well? I don't see any code that isn't just overloading with new types.

Or is the purpose to move functionality from FerriteViz to Ferrite?

tosvec(v::Vec) = SVector((v...,))
tovec(sv::SVector) = Vec((sv...,))
val = shape_value(ipm, ξ, I)
grad = ForwardDiff.jacobian(sv -> tosmat(shape_value(ipm, tovec(sv), I)), tosvec(ξ))
Member


I don't think this will do the right thing unfortunately.
But for these matrix interpolations, it probably is better to calculate the gradients wrt. the scalar base-interpolation, and then just insert the values in the right places (and zeros elsewhere).
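
A minimal self-contained sketch of that idea (plain arrays instead of Tensors.jl types; the row-major component ordering is an assumption, not necessarily what the PR uses): the gradient of the k-th matrix basis function generated by one scalar shape function is just the scalar gradient placed at component (i, j), with zeros everywhere else.

```julia
# Sketch (illustrative, not the PR's code): embed the gradient of a scalar shape
# function into the gradient of the matrix-valued shape function it generates.
# `k` indexes the vdim1*vdim2 matrix basis functions per scalar shape function,
# assuming row-major component ordering.
function matrix_shape_gradient(scalar_grad::NTuple{dim, Float64}, k::Int, vdim1::Int, vdim2::Int) where {dim}
    i, j = divrem(k - 1, vdim2) .+ 1   # component (i, j) addressed by local index k
    G = zeros(vdim1, vdim2, dim)       # gradient of the matrix shape function, zero everywhere ...
    G[i, j, :] .= scalar_grad          # ... except at entry (i, j), which holds the scalar gradient
    return G
end
```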

Member Author


Do you have any suggestion how we can cover this case in testing (in addition to the vectorial case)?

Member


I think @lijas' solution here was quite nice: just having the option of calling that function explicitly, as set up here.

Maybe combined with calling something like

for Is in 1:getnbasefunctions(ipm.ip)
    g, v = shape_gradient_and_value(ipm.ip, ξ, Is)
    Im = (Is - 1) * vdim1 * vdim2  # each scalar shape function expands to vdim1*vdim2 matrix shape functions
    for i in 1:vdim1, j in 1:vdim2
        Im += 1
        G, V = shape_gradient_and_value(ipm, ξ, Im)
        @test G[i, j, :] ≈ g
        @test V[i, j] ≈ v
    end
end

@termi-official
Member Author

If this is only internal because we expect it to change, then it seems like it could be tested outside as well?

What do you mean by outside? If you are proposing to solve a full matrix-valued problem in the tests, then I currently do not have the time to implement that.

I don't see any code that isn't just overloading with new types?

Or is the purpose to move functionality from FerriteViz to Ferrite?

We use something very similar to this PR in FerriteViz, and it is beneficial for simplifying ZZ error estimators. In the future I also want to highlight how we can solve matrix-valued problems (e.g. DG for elasticity) with the API. However, Fredrik has repeatedly told me to make smaller PRs so they are more reviewable, which I fully understand. So here it is. Does this explain it?

@KnutAM
Member

KnutAM commented May 29, 2024

What do you mean by outside?

I meant that one can use this even without merging it into Ferrite (as opposed to, e.g., changes to the fields of some structs). So while trying this out (since, as you say, the implementation approach is not settled), we don't need to merge it as an experimental feature. But having the code in this PR as a basis is great for figuring out what the best approach is!
And based on this PR, we can then try wrapping our heads around the challenges you mentioned above.

@koehlerson koehlerson mentioned this pull request Jun 11, 2024