Feature/ms wells - part 2: Solving, straightforward option #5746
base: master
Conversation
jenkins build this failure_report please
force-pushed from e19bdf6 to 8757d8c
jenkins build this failure_report please
force-pushed from 8757d8c to 9c055ce
jenkins build this failure_report please
force-pushed from 10e3d24 to 8b9be1d
jenkins build this failure_report please
force-pushed from 6a8e50a to 89530ad
jenkins build this opm-tests=1250 failure_report please (1 similar comment)
jenkins build this opm-tests=1250 please
force-pushed from 89530ad to dcef937
jenkins build this opm-tests=1250 please
force-pushed from dcef937 to b3130be
jenkins build this opm-tests=1250 please
force-pushed from b3130be to 9e36faa
jenkins build this opm-tests=1250 please
…Test.sh via the flag -s
force-pushed from 9e36faa to 8e01771
jenkins build this opm-tests=1250 please
jenkins build this failure_report opm-tests=1250 please
```cpp
    return mswellhelpers::applyUMFPack(*duneDSolver_, rhs);
}

template<class Scalar, int numWellEq, int numEq>
void MultisegmentWellEquations<Scalar,numWellEq,numEq>::
recoverSolutionWell(const BVector& x, BVectorWell& xw) const
{
    BVectorWell resWell = resWell_;
    BVectorWell Bx(duneB_.N());
```
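The excerpt breaks off right after `Bx` is allocated. A minimal sketch of how such a recovery step typically continues, assuming the usual block structure of the coupled reservoir/well system and the Dune matrix/vector operations visible in the excerpt; this is not the PR's verbatim code:

```cpp
// Sketch only: given the coupled system
//   [ J  C ] [ x  ]   [ res     ]
//   [ B  D ] [ xw ] = [ resWell ]
// the well solution is recovered as xw = D^-1 (resWell - B x).
duneB_.mv(x, Bx);      // Bx = B * x, mapping cell values to segment values
resWell -= Bx;         // resWell = resWell_ - B * x
xw = mswellhelpers::applyUMFPack(*duneDSolver_, resWell);  // apply D^-1
```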
I would recommend using the old way here when the well is on a single process. Maybe we can also factor this out, similar to what was done for standard wells, or refactor that code for reuse here. I am a bit late, sorry.
Thanks! I'll try that!
force-pushed from 8e01771 to 8645660
jenkins build this failure_report opm-tests=1250 please
force-pushed from 8645660 to 056e2fe
jenkins build this failure_report opm-tests=1250 please
force-pushed from 496fcca to c9f26cd
jenkins build this failure_report opm-tests=1250 please
force-pushed from c9f26cd to f694f9d
jenkins build this failure_report opm-tests=1250 please
jenkins build this opm-tests=1250 please
… true, do not throw when initializing distributed multi-segment wells in WellState.cpp
Here we go from cells to segments, and everything concerning segments is stored globally.
…olutionWell Here we go from cells to segments, and everything concerning segments is stored globally.
… can do this on all processes
force-pushed from f694f9d to 4801a17
jenkins build this opm-tests=1250 please
jenkins build this opm-tests=1250 serial please
This PR is based on #5680.
This PR enables simulations with MSWells distributed across several processes. PR #5680 covered the assembly of the system; this PR covers the straightforward option for solving it.
This PR adds communication when multiplying x_well with the matrix B of the Schur complement, as explained on slide 13 of 24-10-21-MSWells.pdf.
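For readers without the slides, a sketch of the structure this refers to, in notation assumed here rather than taken from the PR: with reservoir unknowns $x$ and well unknowns $x_w$, the coupled system can be written as

$$
\begin{pmatrix} J & C \\ B & D \end{pmatrix}
\begin{pmatrix} x \\ x_w \end{pmatrix}
=
\begin{pmatrix} r \\ r_w \end{pmatrix},
$$

so eliminating the well unknowns yields the Schur complement $J - C D^{-1} B$ on the reservoir side, and the well solution is recovered as $x_w = D^{-1}(r_w - B x)$. When a well is distributed, the product $B x$ needs contributions from every process owning perforated cells, which is where the added communication comes in.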
This approach is the easiest way to solve the system; using preconditioners for the well system is left for future work.
I've added two tests that compare the SMRY files of a parallel and a sequential simulation with distributed MSWells. These tests fail in PR #5747 because they do not run through; for this PR they pass.
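A minimal sketch of the kind of communication described above, assuming raw MPI; the function name and the flat per-segment vector are illustrative, not the PR's actual code. Each process computes its local contribution to B·x from the cells it owns, and a global sum then assembles the full per-segment result, matching the "stored globally" scheme mentioned in the commit messages:

```cpp
#include <mpi.h>
#include <vector>

// Sketch only: sum per-segment contributions of B*x across all ranks.
// 'localBx' holds this rank's partial product (zero for segments whose
// perforated cells live on other ranks); after the reduction, every
// rank holds the complete per-segment result.
std::vector<double> sumSegmentContributions(const std::vector<double>& localBx,
                                            MPI_Comm comm)
{
    std::vector<double> globalBx(localBx.size(), 0.0);
    MPI_Allreduce(localBx.data(), globalBx.data(),
                  static_cast<int>(localBx.size()),
                  MPI_DOUBLE, MPI_SUM, comm);
    return globalBx;
}
```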