Commit

Keep BmiJulia and heat-images/julia-c, remove other mentions of Julia

sverhoeven committed Oct 25, 2023
1 parent 9f413af commit 125560a
Showing 9 changed files with 19 additions and 96 deletions.
20 changes: 15 additions & 5 deletions README.md
@@ -100,21 +100,31 @@ run-bmi-server --lang R --path ~/git/eWaterCycle/grpc4bmi-examples/walrus/walrus

The grpc4bmi Python package can also run BMI models written in Julia if the model has an implementation of the [BasicModelInterface.jl](https://github.com/Deltares/BasicModelInterface.jl).

-Run the Julia model as a server with
+Run the Julia model in Python with

```bash
-run-bmi-server --lang julia --name <MODEL-NAME> --port <PORT>
+from grpc4bmi.bmi_julia_model import BmiJulia
+
+mymodel = BmiJulia.from_name('<package>.<model>', 'BasicModelInterface')
```

For example with [Wflow.jl](https://github.com/Deltares/Wflow.jl/) use

```bash
# Install Wflow.jl package in the Julia environment managed by the juliacall Python package.
-python3 -c 'from grpc4bmi.bmi_julia_model import install;install("Wflow")'
-# Run the server
-run-bmi-server --lang julia --name Wflow.Model --port 55555
+from juliacall import Main as jl
+jl.Pkg.add("Wflow")
+# Create the model
+from grpc4bmi.bmi_julia_model import BmiJulia
+mymodel = BmiJulia.from_name('Wflow.Model', 'Wflow.bmi.BMI')
```

+A Julia model has to be run locally. It cannot be run in the default gRPC client/server Docker container mode because:
+
+1. Julia has no gRPC server implementation
+2. Calling Julia methods from a Python gRPC server causes 100% CPU usage and no progress
+3. Calling Julia methods from a C++ gRPC server causes segmentation faults
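The local-run construction added above boils down to resolving a model from a dotted name string. As a rough pure-Python analogy (illustrative only: the real `BmiJulia.from_name` resolves both the model and interface names inside Julia via juliacall), a dotted-name factory looks like:

```python
from importlib import import_module


def from_dotted_name(dotted: str):
    """Resolve 'package.Attribute' to an object.

    Pure-Python analogy for BmiJulia.from_name('Wflow.Model', 'Wflow.bmi.BMI');
    the real method performs the lookup on the Julia side via juliacall.
    """
    module_name, _, attr = dotted.rpartition(".")
    return getattr(import_module(module_name), attr)


# Resolve a stdlib class by its dotted name
OrderedDict = from_dotted_name("collections.OrderedDict")
```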

### The client side

The client side has only a Python implementation. The default BMI client assumes a running server process on a given port.
1 change: 0 additions & 1 deletion cpp/bmi_grpc_server.cc
@@ -824,7 +824,6 @@ void run_bmi_server(BmiClass *model, int argc, char *argv[])
grpc::EnableDefaultHealthCheckService(true);
grpc::reflection::InitProtoReflectionServerBuilderPlugin();
grpc::ServerBuilder builder;
-// builder.SetResourceQuota(grpc::ResourceQuota().SetMaxThreads(2));
builder.AddListeningPort(server_address, grpc::InsecureServerCredentials());
builder.RegisterService(&service);
std::unique_ptr<grpc::Server> server(builder.BuildAndStart());
21 changes: 1 addition & 20 deletions docs/container/building.rst
@@ -71,26 +71,7 @@ The WALRUS model has a `Dockerfile`_ file which can be used as an example.
Julia
-----

-The docker file for the model container simply contains the installation instructions of grpc4bmi and the BMI-enabled model itself, and as entrypoint the ``run-bmi-server`` command. For the :ref:`python example <python-example>` the Docker file will read
-
-.. code-block:: Dockerfile
-
-    FROM ubuntu:jammy
-    MAINTAINER your name <your email address>
-    # Install grpc4bmi
-    RUN pip install grpc4bmi
-    # Install your BMI model:
-    python3 -c 'from grpc4bmi.bmi_julia_model import install;install("<JULIA-PACKAGE-NAME>")'
-    # Run bmi server
-    ENTRYPOINT ["run-bmi-server", "--lang", "julia", "--name", "<MODEL-NAME>"]
-    # Expose the magic grpc4bmi port
-    EXPOSE 55555
-
-The port 55555 is the internal port in the Docker container that the model communicates over. It is the default port for ``run_bmi_server`` and also the default port that all clients listen to.
+A Julia model can not be run as a server, see https://github.com/eWaterCycle/grpc4bmi/blob/main/README.md#model-written-in-julia .

C/C++/Fortran
-------------
47 changes: 0 additions & 47 deletions docs/server/Julia.rst

This file was deleted.

1 change: 0 additions & 1 deletion docs/server/index.rst
@@ -7,5 +7,4 @@ Creating a BMI server

python
R
-Julia
Cpp
4 changes: 2 additions & 2 deletions grpc4bmi/bmi_julia_model.py
@@ -354,7 +354,7 @@ def get_value_at_indices(self, name: str, dest: np.ndarray, inds: np.ndarray) ->
self.state,
name,
jl.convert(jl.Vector, dest),
-jl.convert(jl.Vector, inds) + 1
+jl.convert(jl.Vector, inds + 1)
)
return dest
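The hunk above moves the `+ 1` inside the conversion: Julia arrays are 1-based, so the 0-based NumPy indices are shifted *before* `jl.convert` turns them into a Julia `Vector`. Adding 1 after conversion appears to be what triggered the "no method matching" error noted in the TODO removed from test/test_julia.py in this same commit. A stdlib-only sketch of the shift, with a hypothetical helper name:

```python
def to_one_based(inds):
    """Shift 0-based indices to Julia's 1-based convention.

    Hypothetical helper: the real code inlines this as
    jl.convert(jl.Vector, inds + 1), doing the arithmetic on the
    NumPy array before handing it to Julia.
    """
    return [int(i) + 1 for i in inds]


# Indices 5, 6, 7 in Python address elements 6, 7, 8 in Julia
one_based = to_one_based([5, 6, 7])
```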

@@ -391,7 +391,7 @@ def set_value_at_indices(
self.implementation.set_value_at_indices(
self.state,
name,
-jl.convert(jl.Vector, inds) + 1,
+jl.convert(jl.Vector, inds + 1),
jl.convert(jl.Vector, src),
)

18 changes: 0 additions & 18 deletions grpc4bmi/run_server.py
@@ -24,11 +24,6 @@
except ImportError:
BmiR = None

-try:
-    from .bmi_julia_model import BmiJulia
-except ImportError:
-    BmiJulia = None

"""
Run server script, turning a BMI implementation into an executable by looping indefinitely, until interrupt signals are
handled. The command line tool needs at least a module and class name to instantiate the BMI wrapper class that exposes
@@ -78,11 +73,6 @@ def build_r(class_name, source_fn)
raise ValueError('Missing R dependencies, install with `pip install grpc4bmi[R]')
return BmiR(class_name, source_fn)

-def build_julia(name: str, implementation_name: str = 'BasicModelInterface'):
-    if not BmiJulia:
-        raise ValueError('Missing Julia dependencies, install with `pip install grpc4bmi[julia]')
-    return BmiJulia.from_name(name, implementation_name)
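The removed `build_julia` followed the same optional-dependency guard as the surviving `build_r`: import lazily at module load, record `None` on failure, and only raise when that backend is actually requested. A stdlib sketch of the pattern (`json` stands in for an optional extra; the module and extra names are placeholders):

```python
# Optional-dependency guard, as used in run_server.py for BmiR.
try:
    import json as optional_backend  # placeholder for an optional extra
except ImportError:
    optional_backend = None


def build_backend():
    """Raise a helpful install hint only when the backend is requested."""
    if optional_backend is None:
        raise ValueError(
            "Missing dependencies, install with `pip install grpc4bmi[extra]`"
        )
    return optional_backend
```

This keeps the base install lean: users who never pass `--lang R` (or, formerly, `--lang julia`) never pay for the extra imports.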

def serve(model, port):
server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
bmi_pb2_grpc.add_BmiServiceServicer_to_server(model, server)
@@ -119,12 +109,6 @@ def main(argv=sys.argv[1:]):

if args.language == "R":
model = build_r(args.name, path)
-elif args.language == "julia":
-    names = args.name.split(',')
-    if len(names) == 2:
-        model = build_julia(names[0], names[1])
-    else:
-        model = build_julia(names[0])
else:
model = build(args.name, path)

@@ -157,8 +141,6 @@ def build_parser():
lang_choices = ['python']
if BmiR:
lang_choices.append('R')
-if BmiJulia:
-    lang_choices.append('julia')
parser.add_argument("--language", default="python", choices=lang_choices,
help="Language in which BMI implementation class is written")
parser.add_argument("--bmi-version", default="2.0.0", choices=["2.0.0", "0.2"],
2 changes: 1 addition & 1 deletion test/fake.jl
@@ -9,7 +9,7 @@ BMI.initialize(::Type{Model}, config_file) = Model()

BMI.get_component_name(m::Model) = "The 2D Heat Equation"

-function BMI.get_grid_x(m::Model, grid, x)
+function BMI.get_grid_x(m::Model, grid, x::Vector{T}) where {T<:AbstractFloat}
copyto!(x, [1.0, 2.0])
end
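`get_grid_x` in the fake model fills the caller-provided buffer in place with `copyto!`, the usual BMI getter contract. A plain-Python sketch of the same contract (names illustrative, not part of the repo):

```python
def get_grid_x(dest):
    """BMI-style getter: write into the caller's buffer and return it,
    mirroring copyto!(x, [1.0, 2.0]) in test/fake.jl."""
    values = [1.0, 2.0]
    dest[: len(values)] = values  # mutate the caller's buffer in place
    return dest


buf = [0.0, 0.0]
get_grid_x(buf)  # buf now holds the grid coordinates
```

Letting the caller own the buffer is what makes the typed Julia signature matter: juliacall must hand over a real `Vector{Float64}`, not an unconverted NumPy proxy.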

1 change: 0 additions & 1 deletion test/test_julia.py
@@ -101,7 +101,6 @@ def test_get_value_ptr(self, model: BmiJulia):
with pytest.raises(NotImplementedError):
model.get_value_ptr("plate_surface__temperature")

-# TODO fix gives no method matching error
def test_get_value_at_indices(self, model: BmiJulia):
result = model.get_value_at_indices(
"plate_surface__temperature", np.zeros((3,)), np.array([5, 6, 7])