# scigateway-auth

Authentication API for the SciGateway web application

## Contents

- Creating Dev Environment and API Setup
- Running the API
- Project structure
- Running Tests
- Viewing Swagger Documentation

## Creating Dev Environment and API Setup
The recommended development environment for this API takes a lot of inspiration from the Hypermodern Python guide found online. It is assumed the commands shown in this part of the README are executed in the root directory of this repo once it has been cloned to your local machine.
To start, install pyenv. There is a Windows version of this tool (pyenv-win), however this is currently untested on this repo. pyenv is used to manage the various versions of Python that will be used to test/lint Python during development. Install it by executing the following:

```bash
curl https://pyenv.run | bash
```
The following lines need to be added to `~/.bashrc`. Either open a new terminal or execute `source ~/.bashrc` to make these changes apply:

```bash
export PATH="~/.pyenv/bin:$PATH"
eval "$(pyenv init -)"
eval "$(pyenv virtualenv-init -)"
```
Various Python build dependencies need to be installed next. These will vary depending on the platform of your system (see the common pyenv build problems for the relevant command for your OS), but the following shows the bash command to install the requirements for a CentOS/RHEL machine:

```bash
sudo yum install @development zlib-devel bzip2 bzip2-devel readline-devel sqlite \
sqlite-devel openssl-devel xz xz-devel libffi-devel findutils
```
To make use of pyenv, let's install different versions of Python onto the system. In production, SciGateway Auth uses Python 3.6, so this should definitely be part of a development environment for this repo. This stage might take some time as each Python version needs to be downloaded and built individually:

```bash
pyenv install 3.6.8
pyenv install 3.7.7
pyenv install 3.8.2
pyenv install 3.9.0
```
To verify the installation commands worked:

```bash
python3.6 --version
python3.7 --version
python3.8 --version
python3.9 --version
```
These Python versions need to be made available to the local version of the repository. They will be used during the Nox sessions, explained further down this file. Executing the following command will create a `.python-version` file inside the repo (this file is currently listed in `.gitignore`):

```bash
pyenv local 3.6.8 3.7.7 3.8.2 3.9.0
```
To maintain records of the API's dependencies, Poetry is used. To install, use the following command:

```bash
curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python -
```
The installation requires `~/.poetry/env` to be refreshed for the changes to be applied. Open a new terminal or execute the following command to ensure the installation completes smoothly:

```bash
source ~/.poetry/env
```
The dependencies for this repo are stored in `pyproject.toml`, with a more detailed version of this data in `poetry.lock`. The lock file is used to maintain the exact versions of dependencies from system to system. To install the dependencies, execute the following command (add `--no-dev` if you don't want the dev dependencies):

```bash
poetry install
```
To add a dependency to Poetry, run the following command (add `--dev` if it's a development related dependency). The official docs give good detail regarding the intricacies of this command:

```bash
poetry add [PACKAGE-NAME]
```
When developing new features for the API, there are a number of Nox sessions that can be used to lint/format/test the code in the included `noxfile.py`. To install Nox, use Pip as shown below. Nox is not listed as a Poetry dependency because this has the potential to cause issues if Nox was executed inside Poetry (see here for more detailed reasoning). When using the `--user` option, ensure your user's Python installation is added to the system `PATH` variable, remembering to reboot your system if you need to change the `PATH`. If you do choose to install these packages within a virtual environment, you do not need the `--user` option:

```bash
pip install --user --upgrade nox
```
To run the sessions defined in `nox.options.sessions` (see `noxfile.py`), simply run:

```bash
nox
```

To execute a specific Nox session, run:

```bash
nox -s [SESSION/FUNCTION NAME]
```
Currently, the following Nox sessions have been created:

- `black` - this uses Black to format Python code to a pre-defined style.
- `lint` - this uses flake8 with a number of additional plugins (see the included `noxfile.py` to see which plugins are used) to lint the code to keep it Pythonic. `.flake8` configures `flake8` and the plugins.
- `safety` - this uses safety to check the dependencies (pulled directly from Poetry) for any known vulnerabilities. This session gives the output in a full ASCII style report.
- `tests` - this uses pytest to execute the automated tests in `test/`, tests for the database and ICAT backends, and non-backend specific tests. More details about the tests themselves here.
Each Nox session builds an environment using the repo's dependencies (defined using Poetry) using `install_with_constraints()`. This stores the dependencies in a `requirements.txt`-like format temporarily during this process, using the OS's default temporary location. These files are manually deleted in `noxfile.py` (as opposed to being automatically removed by Python) to minimise any potential permission-related issues as documented here.
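To illustrate the pattern (this is a simplified sketch, not the exact code in `noxfile.py`; `session` stands in for the Nox session object passed into each session function):

```python
import os
import tempfile


def install_with_constraints(session, *args, **kwargs):
    """Install packages pinned to the versions in Poetry's lock file.

    Sketch of the pattern only; the real implementation lives in noxfile.py.
    """
    # Export the locked dependencies to a temporary requirements.txt-style
    # file in the OS's default temporary location
    with tempfile.NamedTemporaryFile(delete=False) as requirements:
        session.run(
            "poetry",
            "export",
            "--dev",
            "--format=requirements.txt",
            f"--output={requirements.name}",
            external=True,
        )
        session.install(f"--constraint={requirements.name}", *args, **kwargs)
    # Delete the file manually rather than relying on delete=True, to
    # minimise permission-related issues on some platforms
    try:
        os.remove(requirements.name)
    except PermissionError:
        session.warn(f"Couldn't delete temporary file: {requirements.name}")
```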
To make use of Git's ability to run custom hooks, pre-commit is used. Like Nox, Pip is used to install this tool:

```bash
pip install --user --upgrade pre-commit
```
This repo contains an existing config file for pre-commit (`.pre-commit-config.yaml`) which needs to be installed using:

```bash
pre-commit install
```
When you commit work on this repo, the configured commit hooks will be executed, but only on the changed files. This is good because it keeps the process of committing simple, but to run the hooks on all the files locally, execute the following command:

```bash
pre-commit run --all-files
```
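For reference, a `.pre-commit-config.yaml` generally looks like the following. This is a generic illustration of the file format only (the hooks shown are assumptions, not necessarily what this repo uses); the `.pre-commit-config.yaml` shipped in this repo is the authority:

```yaml
# Generic illustration; the repo's own .pre-commit-config.yaml
# defines the actual hooks that run on commit.
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v3.4.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
```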
As a summary, these are the steps needed to create a dev environment for this repo, compressed into a single code block:

```bash
# Install pyenv
curl https://pyenv.run | bash

# Paste into ~/.bashrc
export PATH="~/.pyenv/bin:$PATH"
eval "$(pyenv init -)"
eval "$(pyenv virtualenv-init -)"

# Apply changes made in ~/.bashrc
source ~/.bashrc

# Install Python build tools
sudo yum install @development zlib-devel bzip2 bzip2-devel readline-devel sqlite \
sqlite-devel openssl-devel xz xz-devel libffi-devel findutils

# Install different versions of Python and verify they work
pyenv install 3.6.8
python3.6 --version
pyenv install 3.7.7
python3.7 --version
pyenv install 3.8.2
python3.8 --version
pyenv install 3.9.0
python3.9 --version

# Make installed Python versions available to repo
pyenv local 3.6.8 3.7.7 3.8.2 3.9.0

# Install Poetry
curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python -

# Apply changes made to file when installing Poetry
source ~/.poetry/env

# Install API's dependencies
poetry install

# Install Nox
pip install --user --upgrade nox

# Install Pre Commit
pip install --user --upgrade pre-commit

# Install commit hooks
pre-commit install
```
## Running the API

To run the application, you must first create a `config.json` at the same level as `config.json.example`. You then need to generate a public/private key pair for the application to use to sign its JWTs. Running `ssh-keygen -t rsa -m 'PEM'` and creating passwordless keys should work. By default, the keys are expected to be in `keys/` with the names `jwt-key` and `jwt-key.pub`; however, the paths to the private and public keys can be configured in `config.json`. There are example keys used for tests in `test/keys/`.

Then the API may be started using:

```bash
python3 -m scigateway_auth.app
```
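As a rough illustration only, a `config.json` could look something like the fragment below. The key names here are assumptions inferred from the Docker environment variables described later in this README; `config.json.example` in the repo is the authoritative template:

```json
{
  "icat_url": "https://localhost:8181",
  "log_location": "/dev/stdout",
  "private_key_path": "keys/jwt-key",
  "public_key_path": "keys/jwt-key.pub",
  "maintenance_config_path": "maintenance/maintenance.json",
  "scheduled_maintenance_config_path": "maintenance/scheduled_maintenance.json",
  "verify": true
}
```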
The `verify` option in `config.json` corresponds to what is supplied to the `requests` calls to the ICAT server. This can be set to multiple different values:

- `true`: This sets `verify=True` and means that `requests` will verify certificates using its internal trust store (note: this is not the same as the system trust store). In practice this means only "real" signed certificates will be verified. This is useful for production.
- `false`: This sets `verify=False` and thus disables certificate verification. This is useful for dev but should not be used in production.
- `"/path/to/CA_BUNDLE"`: This sets `verify="/path/to/CA_BUNDLE"` and will allow you to explicitly trust only the specified self-signed certificate. This is useful for preprod or production.
It is also possible to run the API inside Docker. The `Dockerfile` can be used to build a Docker image which in turn can be used to create a container. The `Dockerfile` is configured to create a production image and runs a Gunicorn server on port `8000` when a container is started. Environment variables have also been defined in the `Dockerfile` to allow for values to be passed at runtime to future running containers. These values are used by the `docker/docker-entrypoint.sh` script to update the config values in the `config.json` file. The environment variables are:

- `ICAT_URL` (default value: `http://localhost`)
- `LOG_LOCATION` (default value: `/dev/stdout`)
- `PRIVATE_KEY_PATH` (default value: `keys/jwt-key`)
- `PUBLIC_KEY_PATH` (default value: `keys/jwt-key.pub`)
- `MAINTENANCE_CONFIG_PATH` (default value: `maintenance/maintenance.json`)
- `SCHEDULED_MAINTENANCE_CONFIG_PATH` (default value: `maintenance/scheduled_maintenance.json`)
- `VERIFY` (default value: `true`)
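The entrypoint script's behaviour can be sketched in Python roughly as follows. This is an illustration, not the actual script: the config key names are assumptions (derived by lower-casing the environment variable names), and `docker/docker-entrypoint.sh` in the repo is the authority:

```python
import json
import os

# Default values matching those defined in the Dockerfile
DEFAULTS = {
    "ICAT_URL": "http://localhost",
    "LOG_LOCATION": "/dev/stdout",
    "PRIVATE_KEY_PATH": "keys/jwt-key",
    "PUBLIC_KEY_PATH": "keys/jwt-key.pub",
    "MAINTENANCE_CONFIG_PATH": "maintenance/maintenance.json",
    "SCHEDULED_MAINTENANCE_CONFIG_PATH": "maintenance/scheduled_maintenance.json",
    "VERIFY": "true",
}


def apply_env_overrides(config_path):
    """Overwrite config.json values with those from the environment (sketch)."""
    with open(config_path) as f:
        config = json.load(f)
    for env_var, default in DEFAULTS.items():
        value = os.environ.get(env_var, default)
        # "verify" may be a boolean or a CA bundle path (see above)
        if env_var == "VERIFY" and value.lower() in ("true", "false"):
            value = value.lower() == "true"
        # Config key names are assumed to be the lower-cased variable names
        config[env_var.lower()] = value
    with open(config_path, "w") as f:
        json.dump(config, f, indent=2)
```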
To build an image, run:

```bash
docker build -t scigateway_auth_image .
```

To start a container on port `8000` from the image that you just built, run:

```bash
docker run -p 8000:8000 --name scigateway_auth_container scigateway_auth_image
```

If you want to pass values for the environment variables then instead run:

```bash
docker run -p 8000:8000 --name scigateway_auth_container --env ICAT_URL=https://127.0.0.1:8181 --env LOG_LOCATION=/datagateway-api-run/logs.log --env VERIFY=false scigateway_auth_image
```
## Project structure

The project consists of 3 main packages, and `app.py`. The config, constants and exceptions are in the `common` package, and the endpoints and authentication logic are in `src`. The API is set up in `app.py`. A directory tree is shown below:

```
─── scigateway-auth
    ├── scigateway_auth
    │   ├── app.py
    │   ├── wsgi.py
    │   ├── common
    │   │   ├── config.py
    │   │   ├── constants.py
    │   │   ├── exceptions.py
    │   │   └── logger_setup.py
    │   ├── src
    │   │   ├── auth.py
    │   │   └── endpoints.py
    │   └── config.json
    ├── test
    │   ├── test_authenticationHandler.py
    │   ├── test_ICATAuthenticator.py
    │   └── test_requires_mnemonic.py
    ├── logs.log
    ├── noxfile.py
    ├── openapi.yaml
    ├── poetry.lock
    ├── pyproject.toml
    └── README.md
```
## Running Tests

When in the base directory of this repo, use `nox -s tests` to run the unit tests located in `test/`.
## Viewing Swagger Documentation

In the base directory of this repository, there's a file called `openapi.yaml`. This follows the OpenAPI specification to display how this API is implemented, using a technology called Swagger. Go to https://petstore.swagger.io/ and, using the text field at the top of the page, paste the link to the raw YAML file inside this repo. Click the explore button to see example snippets of how to use the API. This can be useful to see the valid syntax of the request bodies of the POST requests.