docs: renew cloud computing documentation (#536)
* docs: Added subcategories for cloud running

* docs: Corrected `deltares_harbor` page

* docs: Added subsection examples guidelines

* chore: renamed page title

* docs: Renamed subdirectory

* docs: Normalized document hyperlinks and inline code snippets

* docs: moved argo and kubernetes documentation into subdirectory

* docs: Moved docker related documents into different directories

* docs: Removed redundant subdirectory

* docs: Renamed subdirectory for a more relatable name

* chore: Normalized files

* chore: Corrected dockerfile and related documentation

* docs: Added section for hackathon use case

* docs: Small documentation correction

* docs: Added draft for subsection

* docs: Fix formatting for `hachathon_user_guide`

* docs: Updated formatting of documentation and fixed some links

* docs: Added reference to installation of `aws`

* docs: Fixed formatting of headers in `docker_user_guide.rst`
Carsopre authored Jul 24, 2024
1 parent e62d493 commit 53db3a2
Showing 14 changed files with 444 additions and 322 deletions.
48 changes: 15 additions & 33 deletions Dockerfile
@@ -1,38 +1,20 @@
-# Run with `docker build -t ra2ce .`
-FROM mambaorg/micromamba:1.4-alpine AS full
+# To build this docker run:
+# `docker build -t ra2ce`
 
-# ENV_NAME is starting a bash inm this environment
+FROM python:3.10
 
-ENV HOME=/home/mambauser
-ENV ENV_NAME=ra2ce_env
-ENV PYTHONPATH="/home/mambauser:$PYTHONPATH"
+RUN apt-get update && apt-get install -y libgdal-dev
 
-# Setting workspace vbriables
+# Copy the directories with the local ra2ce.
+WORKDIR /ra2ce_src
+COPY README.md LICENSE pyproject.toml poetry.lock /ra2ce_src/
+COPY ra2ce /ra2ce_src/ra2ce
 
-WORKDIR ${HOME}
-USER mambauser
-# RUN apt-get -qq update && apt-get install --yes --no-install-recommends libgdal-dev libgeos-dev libproj-dev && apt-get -qq purge && apt-get -qq clean && rm -rf /var/lib/apt/lists/*
-COPY .config/docker_environment.yml pyproject.toml README.md ${HOME}/
-RUN mkdir -p ${HOME}/.jupyter
-COPY .config/jupyter/* ${HOME}/.jupyter
+# Install the required packages
+RUN pip install poetry
+RUN poetry config virtualenvs.create false
+RUN poetry install --without dev,docs,jupyter
+RUN apt-get clean autoclean
 
-# Creating ra2ce2_env
-
-RUN micromamba create -f docker_environment.yml -y --no-pyc \
-    && micromamba clean -ayf \
-    && rm -rf ${HOME}/.cache \
-    && find /opt/conda/ -follow -type f -name '*.a' -delete \
-    && find /opt/conda/ -follow -type f -name '*.pyc' -delete \
-    && find /opt/conda/ -follow -type f -name '*.js.map' -delete \
-    && rm docker_environment.yml
-COPY examples/ ${HOME}/examples
-COPY ra2ce/ ${HOME}/ra2ce
-
-# Installing notabook and Jupyter lab
-# this is now in the docker_environment.yml
-
-# Expose the Jupyter port
-EXPOSE 8080
-
-# Start Jupyter Notebook
-CMD ["jupyter", "notebook", "--ip=0.0.0.0", "--port=8080", "--allow-root"]
+# Define the endpoint
+CMD ["python3"]
2 changes: 1 addition & 1 deletion docs/Makefile
@@ -1,5 +1,5 @@
 # Minimal makefile for Sphinx documentation
-#
+# Build from ra2ce root directory with `poetry run docs\make html`
 
 # You can set these variables from the command line.
 SPHINXOPTS =
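
For reference, a sketch of the documented build command alongside a direct ``sphinx-build`` equivalent (the output directory is an assumption, not taken from the Makefile):

.. code-block:: bash

   # From the ra2ce root directory, as the Makefile comment states (Windows shell).
   poetry run docs\make html

   # Assumed direct equivalent; requires the docs dependency group to be installed.
   poetry run sphinx-build -b html docs docs/_build/html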
2 changes: 1 addition & 1 deletion docs/conf.py
@@ -116,7 +116,7 @@ def remove_extra_files_from_dir(dir_path: Path):

 # General information about the project.
 project = "Risk Assessment and Adaptation for Critical infrastructurE"
-copyright = "2020, Deltares"
+copyright = "2024, Deltares"
 author = "Margreet van Marle\\Frederique de Groen\\Lieke Meijer\\Sahand Asgarpour\\Carles Soriano Perez"
 
 # The version info for the project you're documenting, acts as replacement
195 changes: 0 additions & 195 deletions docs/docker_and_cloud/docker_user_guide.rst

This file was deleted.

23 changes: 14 additions & 9 deletions docs/docker_and_cloud/index.rst
@@ -3,19 +3,24 @@
 Docker and Cloud guide
 =========================
 
-In this section we gather all information related to the "cloud", including:
+In this section we explore the multiple possibilities regarding 'cloud running'. For this purpose we have documentation covering two different concepts:
 
-- building of a docker container,
-- installation and deployment of cloud services,
-- running ``ra2ce`` on different cloud services.
+- Setting up infrastructure:
+
+  - Building a docker container.
+  - Installation and deployment of cloud services.
+
+- Using "cloud services" for specific purposes:
+
+  - Using existing docker images (locally / cloud).
+  - Running ``ra2ce`` on different cloud services.
+  - Guidelines to set up and run a cloud service from scratch.
+  - Known 'use cases', such as setting up and using cloud services for a hackathon.
 
 
 .. toctree::
    :caption: Table of Contents
    :maxdepth: 1
 
-   docker_user_guide
-   cloud_user_guide
-   slurm_user_guide
-   argo_deployment
-   kubernetes_deployment
+   setting_up_infrastructure/index
+   using_cloud_services/index
docs/docker_and_cloud/setting_up_infrastructure/argo_deployment.rst
@@ -11,7 +11,16 @@
 Before deploying Argo Workflows, ensure you have the following prerequisites:
 
 - An Amazon EKS cluster. Refer to the kubernetes_deployment.rst in the project directory for instructions on deploying an EKS cluster with Terraform.
-- `kubectl` configured to interact with the deployed EKS cluster.
+- ``kubectl`` configured to interact with the deployed EKS cluster.
+
+.. _argo_local_installation:
+
+Local installation
+------------------
+
+1. Download the Argo CLI from the official website: `<https://argo-workflows.readthedocs.io/en/latest/>`_
+2. Move ``argo.exe`` to a directory of your preference, for example ``C:\\cloud\\argo``.
+3. Add that location to your ``PATH`` environment variable.
 Deployment Steps
 ----------------
@@ -22,27 +31,27 @@ Follow these steps to deploy Argo Workflows on the Amazon EKS cluster:
 
    Create a namespace for Argo to run in:
 
-   ::
+   .. code-block:: bash
 
      kubectl create namespace argo
 
 2. **Install Argo Workflows:**
 
    Apply the Argo Workflows installation manifest:
 
-   ::
+   .. code-block:: bash
 
     kubectl apply -n argo -f https://github.com/argoproj/argo-workflows/releases/download/v3.5.5/install.yaml
 
 3. **Access Argo UI:**
 
    Once the installation is complete, you can access the Argo UI by port-forwarding to the Argo server service:
 
-   ::
+   .. code-block:: bash
 
     kubectl -n argo port-forward service/argo-server 2746:2746
 
-Open your web browser and navigate to `http://localhost:2746` to access the Argo UI.
+Open your web browser and navigate to `<http://localhost:2746>`_ to access the Argo UI.
 
 Clean Up
 --------
@@ -51,7 +60,7 @@
 To uninstall Argo Workflows from the EKS cluster:
 
 1. **Uninstall Argo Workflows:**
 
-   ::
+   .. code-block:: bash
 
     kubectl delete deployment argo -n argo
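
A short sketch for verifying the deployment described above, before any clean up; the region and cluster name are placeholders, not values from this repository:

.. code-block:: bash

   # Point kubectl at the EKS cluster (placeholder region and cluster name).
   aws eks update-kubeconfig --region eu-west-1 --name my-eks-cluster

   # Confirm kubectl is talking to the intended cluster.
   kubectl config current-context

   # Confirm the Argo CLI from the local installation steps is on PATH.
   argo version

   # Check that the Argo Workflows pods came up in the argo namespace.
   kubectl get pods -n argo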
24 changes: 24 additions & 0 deletions docs/docker_and_cloud/setting_up_infrastructure/index.rst
@@ -0,0 +1,24 @@
+.. _setting_up_infrastructure:
+
+Setting up infrastructure
+=========================
+
+At the moment, the ``ra2ce`` "cloud" infrastructure consists of three main components:
+
+- Amazon Web Services `S3 <https://deltares.awsapps.com/>`_:
+  - Stores data.
+  - Runs docker components through Kubernetes.
+- Kubernetes:
+  - Creates and runs the ``ra2ce`` docker images in containers.
+  - Runs custom scripts in the related containers.
+- Argo:
+  - "Orchestrates" how a workflow will be run in the S3 using Kubernetes.
+  - Workflows are ``*.yml`` files describing the node types and resources to use at each step of a cloud run (see the sketch below).
+
+
+.. toctree::
+   :caption: Table of Contents
+   :maxdepth: 1
+
+   kubernetes_deployment
+   argo_deployment
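
To illustrate the last point, a minimal sketch of such a workflow file; the image tag, command, and resource figures are illustrative assumptions, not the project's actual workflow:

.. code-block:: yaml

   apiVersion: argoproj.io/v1alpha1
   kind: Workflow
   metadata:
     generateName: ra2ce-run-   # Argo appends a random suffix to the name.
   spec:
     entrypoint: run-ra2ce
     templates:
       - name: run-ra2ce
         container:
           image: ra2ce:latest  # Hypothetical image tag.
           command: ["python3", "-c", "import ra2ce"]
           resources:
             requests:
               cpu: "1"         # Resources requested for this step's node.
               memory: 2Gi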