
Added threading to ABM algorithms using @.. #2556

Draft
mmesiti wants to merge 3 commits into master from mmesiti:multithreaded-ABM

Conversation

@mmesiti commented Dec 13, 2024

  1. In *_perform_step.jl, in the functions that use @.., add a "thread" variable to the unpacking of the "cache" object so it can be forwarded to @.. via its thread option.
  2. In *_caches.jl, for each cache type affected (determined in 1.):
    • add a parametric type Thread
    • add a field thread::Thread
    • in the alg_cache function, pass alg.thread as an argument to the cache type constructor
  3. In *_algorithms.jl, for each algorithm type affected (determined in 2.):
    • add a parametric type Thread
    • add a field thread::Thread
    • add an outer constructor that takes a thread argument with default value False() and builds the corresponding algorithm object, forwarding the thread parameter to it (see the sketch below).
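
For concreteness, here is a minimal sketch of the resulting pattern for a single hypothetical method. `ABMExample`, its cache fields, and the two-step update formula are illustrative only; the `thread` plumbing follows the `Static.False()`/`Static.True()` convention already used by other FastBroadcast-threaded methods in OrdinaryDiffEq.

```julia
# Sketch only: ABMExample and ABMExampleCache are hypothetical stand-ins
# for the affected Adams-Bashforth-Moulton types.
using Static: False
using FastBroadcast: @..
using OrdinaryDiffEqCore: OrdinaryDiffEqAlgorithm, OrdinaryDiffEqMutableCache

# 3. *_algorithms.jl: the algorithm carries a parametric `thread` field,
#    defaulting to False() (serial broadcasts).
struct ABMExample{Thread} <: OrdinaryDiffEqAlgorithm
    thread::Thread
end
ABMExample(; thread = False()) = ABMExample(thread)

# 2. *_caches.jl: the cache gets the same parametric `thread` field;
#    the matching alg_cache method passes alg.thread into the constructor,
#    e.g. ABMExampleCache(u, k1, k2, alg.thread).
mutable struct ABMExampleCache{uType, rateType, Thread} <: OrdinaryDiffEqMutableCache
    u::uType
    k1::rateType
    k2::rateType
    thread::Thread
end

# 1. *_perform_step.jl: unpack `thread` from the cache and forward it to @..
function perform_step!(integrator, cache::ABMExampleCache, repeat_step = false)
    (; u, k1, k2, thread) = cache
    (; uprev, dt) = integrator
    # two-step Adams-Bashforth update as a fused, optionally threaded broadcast
    @.. thread=thread u = uprev + dt * (3 // 2 * k1 - 1 // 2 * k2)
    return nothing
end
```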

Checklist

  • Appropriate tests were added
  • Any code changes were done in a way that does not break public API
  • All documentation related to code changes was updated
  • The new code follows the contributor guidelines, in particular the SciML Style Guide and COLPRAC.
  • Any new documentation only uses public API

Additional context


Referring to this discussion: https://discourse.julialang.org/t/differentialequations-jl-mpi-jl-pencilarrays-jl-lack-of-scaling-observed/122998/18?u=mmesiti

@ChrisRackauckas (Member)

This is set as a draft, but it seems like it's correct? What's the incomplete part here?

@mmesiti (Author) commented Dec 16, 2024

> This is set as a draft, but it seems like it's correct? What's the incomplete part here?

Still need to make sure all the boxes can be ticked!

@ChrisRackauckas (Member)

If you can add some regression tests for the threading choices then I think that's sufficient.

Michele Mesiti added 2 commits December 18, 2024 11:55
Run the reformatting command suggested in CONTRIBUTING.md, but keep only the changes in the ABM directory (which my current PR affects).
@mmesiti (Author) commented Dec 18, 2024

> If you can add some regression tests for the threading choices then I think that's sufficient.

Where should those tests go? Should they be included in the runtests.jl of the ABM subpackage (as I have done now) or should they go somewhere else, where they are guaranteed to run with multiple threads?

@ChrisRackauckas (Member)

Extend the convergence tests in https://github.com/mmesiti/OrdinaryDiffEq.jl/blob/multithreaded-ABM/lib/OrdinaryDiffEqAdamsBashforthMoulton/test/abm_convergence_tests.jl to include a few of the multithreaded choices.
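
For reference, a minimal threaded-vs-serial consistency check could look roughly like the sketch below. This is not the test_convergence-based form used in abm_convergence_tests.jl, and it assumes the `thread` keyword introduced in this PR (e.g. `ABM43(thread = True())`), which is not part of the released API.

```julia
# Sketch of a regression check: the threaded and serial variants of an
# ABM method should produce the same trajectory on an in-place problem.
using OrdinaryDiffEqAdamsBashforthMoulton
using SciMLBase: ODEProblem, solve
using Static: False, True
using Test

f!(du, u, p, t) = (du .= 1.01 .* u)          # simple in-place linear ODE
prob = ODEProblem(f!, rand(128), (0.0, 1.0))

sol_serial = solve(prob, ABM43(thread = False()); dt = 1 / 2^6)
sol_threaded = solve(prob, ABM43(thread = True()); dt = 1 / 2^6)

# Threading only changes how the fused broadcasts are scheduled,
# so the final states should agree.
@test sol_serial.u[end] ≈ sol_threaded.u[end]
```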
