Merge pull request #144 from eWaterCycle/julia
Add Julia support to grpc4bmi
sverhoeven authored Oct 30, 2023
2 parents 271ff01 + 2fb9ffb commit 37040d6
Showing 16 changed files with 1,656 additions and 12 deletions.
9 changes: 7 additions & 2 deletions .github/workflows/ci.yml
@@ -22,7 +22,7 @@ jobs:
run: |
python -m pip install --upgrade pip wheel
pip install -r dev-requirements.txt
pip install -e .[R]
pip install -e .[R,julia]
- name: Setup Apptainer
uses: eWaterCycle/setup-apptainer@v2
with:
@@ -45,13 +45,18 @@ jobs:
run: |
Rscript -e "install.packages('remotes')"
Rscript -e "install.packages('R6')"
- name: Install Julia
uses: julia-actions/setup-julia@v1
with:
version: '^1.9'
- name: Test with pytest
run: |
pytest -vv --cov=grpc4bmi --cov-report xml
timeout-minutes: 20
- name: Correct coverage paths
run: sed -i "s+$PWD/++g" coverage.xml
- name: SonarCloud analysis
uses: sonarsource/sonarcloud-github-action@v1.3
uses: sonarsource/sonarcloud-github-action@v2.0.2
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
39 changes: 35 additions & 4 deletions README.md
@@ -24,6 +24,12 @@ on the client (Python) side. If your server model is implemented in Python, do t
pip install grpc4bmi[R]
```

If the model is implemented in Julia, run instead

```bash
pip install grpc4bmi[julia]
```

in the server environment. For the bleeding-edge version from GitHub, use

```bash
@@ -90,6 +96,35 @@ For example with [WALRUS](https://github.com/eWaterCycle/grpc4bmi-examples/tree/
run-bmi-server --lang R --path ~/git/eWaterCycle/grpc4bmi-examples/walrus/walrus-bmi.r --name WalrusBmi --port 55555
```

### Models written in Julia

The grpc4bmi Python package can also run BMI models written in Julia, provided the model implements the [BasicModelInterface.jl](https://github.com/Deltares/BasicModelInterface.jl) interface.

Run the Julia model in Python with

```python
from grpc4bmi.bmi_julia_model import BmiJulia

mymodel = BmiJulia.from_name('<package>.<model>', 'BasicModelInterface')
```

For example with [Wflow.jl](https://github.com/Deltares/Wflow.jl/) use

```python
# Install Wflow.jl package in the Julia environment managed by the juliacall Python package.
from juliacall import Main as jl
jl.Pkg.add("Wflow")
# Create the model
from grpc4bmi.bmi_julia_model import BmiJulia
mymodel = BmiJulia.from_name('Wflow.Model', 'Wflow.bmi.BMI')
```
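
Once constructed, the object can be driven like any BMI model with the standard initialize/update/finalize loop. A minimal sketch with a stand-in class (`ToyBmi` and the config path are hypothetical; driving the real `BmiJulia` object needs Julia and Wflow installed):

```python
class ToyBmi:
    """Stand-in for BmiJulia: exposes the same BMI calls, counts time steps."""

    def __init__(self):
        self.t = 0.0

    def initialize(self, config_file):
        # A real model would read the config; the toy just resets the clock.
        self.t = 0.0

    def update(self):
        self.t += 1.0  # advance one time step

    def get_current_time(self):
        return self.t

    def get_end_time(self):
        return 3.0

    def finalize(self):
        pass


model = ToyBmi()
model.initialize("wflow_sbm.toml")  # hypothetical config path
while model.get_current_time() < model.get_end_time():
    model.update()
model.finalize()
print(model.get_current_time())  # → 3.0
```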

A Julia model has to be run locally. It cannot be run in the default gRPC client/server Docker container mode because:

1. Julia has no gRPC server implementation.
2. Calling Julia methods from a Python gRPC server causes 100% CPU usage without making progress.
3. Calling Julia methods from a C++ gRPC server causes segmentation faults.

### The client side

The client side has only a Python implementation. The default BMI client assumes a running server process on a given port.
@@ -154,7 +189,3 @@ pip install -e .[docs]

and install the C++ runtime and `protoc` command as described in <https://github.com/google/protobuf/blob/master/src/README.md>.
After this, simply executing the `proto_gen.sh` script should do the job.

## Future work

More language bindings are underway.
5 changes: 5 additions & 0 deletions docs/container/building.rst
@@ -68,6 +68,11 @@ The WALRUS model has a `Dockerfile`_ file which can be used as an example.

.. _Dockerfile: https://github.com/eWaterCycle/grpc4bmi-examples/blob/master/walrus/Dockerfile

Julia
-----

A Julia model cannot be run as a server, see https://github.com/eWaterCycle/grpc4bmi/blob/main/README.md#models-written-in-julia.

C/C++/Fortran
-------------

3 changes: 2 additions & 1 deletion grpc4bmi/bmi_client_docker.py
@@ -102,7 +102,8 @@ def __init__(self, image: str, work_dir: str, image_port=50051, host=None,
super(BmiClientDocker, self).__init__(BmiClient.create_grpc_channel(port=port, host=host), timeout=timeout)

def __del__(self):
self.container.stop()
if hasattr(self, 'container'):
self.container.stop()

def logs(self) -> str:
"""Returns complete combined stdout and stderr written by the Docker container.
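The `hasattr` guard added above matters because Python still runs `__del__` on a half-built instance when `__init__` raises before `self.container` is assigned (for example, when starting the Docker container fails). A minimal sketch of the pattern, with a toy class standing in for `BmiClientDocker`:

```python
stopped = []


class FragileClient:
    """Toy stand-in for BmiClientDocker."""

    def __init__(self, fail=False):
        if fail:
            # Simulates the container failing to start,
            # so self.container is never assigned.
            raise RuntimeError("container failed to start")
        self.container = "running"

    def __del__(self):
        # Without the guard, __del__ on a half-built instance raises
        # AttributeError because self.container was never set.
        if hasattr(self, "container"):
            stopped.append(self.container)


try:
    FragileClient(fail=True)  # __del__ still runs on the failed instance
except RuntimeError:
    pass

ok = FragileClient()
del ok  # guard passes, the "container" is stopped
print(stopped)  # → ['running']
```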
8 changes: 4 additions & 4 deletions grpc4bmi/bmi_client_subproc.py
@@ -15,19 +15,19 @@ class BmiClientSubProcess(BmiClient):
>>> mymodel = BmiClientSubProcess(<PACKAGE>.<MODULE>.<CLASS>)
"""

def __init__(self, module_name, path=None, timeout=None):
def __init__(self, module_name, path=None, timeout=None, delay=1):
host = "localhost"
port = BmiClient.get_unique_port(host)
name_options = ["--name", module_name]
port_options = ["--port", str(port)]
path_options = ["--path", path] if path else []
self.pipe = subprocess.Popen(["run-bmi-server"] + name_options + port_options + path_options, env=dict(os.environ))
time.sleep(1)
time.sleep(delay)
super(BmiClientSubProcess, self).__init__(BmiClient.create_grpc_channel(port=port, host=host), timeout=timeout)

def __del__(self):
self.pipe.terminate()
self.pipe.wait()
self.pipe.kill()
self.pipe.wait(timeout=0.1)

def get_value_ref(self, var_name):
raise NotImplementedError("Cannot exchange memory references across process boundary")
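The switch from `terminate()` to `kill()` above makes shutdown unconditional: SIGTERM can be caught or ignored by the server process, while SIGKILL cannot, and the short `wait()` reaps the child so it does not linger as a zombie. A self-contained sketch, with a sleeping child standing in for `run-bmi-server`:

```python
import subprocess
import sys

# Child that would run for a minute if left alone (stand-in for run-bmi-server).
pipe = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(60)"])

pipe.kill()             # SIGKILL: cannot be trapped or ignored by the child
pipe.wait(timeout=0.1)  # reap the child; kill takes effect almost immediately

print(pipe.returncode is not None)  # → True (negative signal number on POSIX)
```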