Data Information System (DAISY) is a data bookkeeping application designed to help biomedical research institutions with their GDPR compliance.
For more information, please refer to the official DAISY documentation.
DAISY was published in the GigaScience journal as the article *DAISY: A Data Information System for accountability under the General Data Protection Regulation*.
You are encouraged to try DAISY for yourself using our DEMO deployment.
- docker: https://docs.docker.com/install/
- Get the source code:

  ```shell
  git clone [email protected]:elixir-luxembourg/daisy.git
  cd daisy
  ```
- Create your settings file:

  ```shell
  cp elixir_daisy/settings_local.template.py elixir_daisy/settings_local.py
  ```

  Optional: edit the file `elixir_daisy/settings_local.py` to adapt it to your environment.
- Build the daisy docker image:

  ```shell
  docker-compose up --build
  ```

  Wait for the build to finish and keep the process running.
- Open a new shell and go to the daisy folder.
- Build the database:

  ```shell
  docker-compose exec web python manage.py migrate
  ```
- Build the solr schema:

  ```shell
  docker-compose exec web python manage.py build_solr_schema -c /solr/daisy/conf -r daisy -u default
  ```
- Compile and deploy static files:

  ```shell
  cd web/static/vendor
  npm run build
  cd ../../../
  docker-compose exec web python manage.py collectstatic
  ```
- Create initial data in the database:

  ```shell
  docker-compose exec web bash -c "cd core/fixtures/ && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/edda.json && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hpo.json && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hdo.json && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hgnc.json"
  docker-compose exec web python manage.py load_initial_data
  ```

  Initial data includes, for instance, controlled vocabulary terms and an initial list of institutions and cohorts. This step can take several minutes to complete.

- Load demo data:

  ```shell
  docker-compose exec web python manage.py load_demo_data
  ```

  This will create mock datasets and projects, and create a demo admin account.
- Optional: import users from an active directory instance:

  ```shell
  docker-compose exec web python manage.py import_users
  ```
- Build the search index:

  ```shell
  docker-compose exec web python manage.py rebuild_index -u default
  ```
- Browse to https://localhost. A demo admin account is available: username `admin`, password `demo`.
Code formatting is handled with black. Install it and the pre-commit hook, then check the formatting with `black --check .` or reformat the code with `black .`:

```shell
pip install black==23.7.0
pre-commit install
black --check .
black .
```
In addition to loading initial data, the DAISY database can be populated by importing Project, Dataset and Partner records from JSON files using the commands `import_projects`, `import_datasets` and `import_partners` respectively.
The import commands accept a single JSON file (flag `-f`):

```shell
docker-compose exec web python manage.py <COMMAND> -f ${PATH_TO_JSON_FILE}
```

where ${PATH_TO_JSON_FILE} is the path to a JSON file containing the record definitions. See the file daisy/data/demo/projects.json as an example.
Alternatively, you can specify a directory containing multiple JSON files to be imported with the `-d` flag:

```shell
docker-compose exec web python manage.py <COMMAND> -d ${PATH_TO_DIR}
```
Information in the DAISY database can be exported to JSON files. The command for exporting partners is given below:

```shell
docker-compose exec web python manage.py export_partners -f ${JSON_FILE}
```

where ${JSON_FILE} is the path to the JSON file that will be produced. In addition to `export_partners`, you can run `export_projects` and `export_datasets` in the same way.
- Create a database backup:

  ```shell
  docker-compose exec db pg_dump daisy --port=5432 --username=daisy --no-password --clean > backup_`date +%y-%m-%d`.sql
  ```
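The backticks in the command above expand to today's date, producing a date-stamped filename. A small sketch of the same naming logic using the modern `$(...)` substitution syntax (the `pg_dump` invocation itself is unchanged):

```shell
# Compute a date-stamped backup filename, e.g. backup_25-06-30.sql
# (%y-%m-%d gives two-digit year, month and day, as in the command above)
BACKUP_FILE="backup_$(date +%y-%m-%d).sql"
echo "Backup will be written to: $BACKUP_FILE"
```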
- Make sure the docker containers are stopped:

  ```shell
  docker-compose stop
  ```
- Get the latest DAISY release:

  ```shell
  git checkout master
  git pull
  ```
- Rebuild and start the docker containers:

  ```shell
  docker-compose up --build
  ```

  Open a new terminal window to execute the following commands.
- Update the database schema:

  ```shell
  docker-compose exec web python manage.py migrate
  ```
- Update the solr schema:

  ```shell
  docker-compose exec web python manage.py build_solr_schema -c /solr/daisy/conf -r daisy -u default
  ```
- Collect static files:

  ```shell
  docker-compose exec web python manage.py collectstatic
  ```
- Rebuild the search index:

  ```shell
  docker-compose exec web python manage.py rebuild_index -u default
  ```
- Reimport the users (optional). If LDAP was used during the initial setup to import users, they have to be imported again:

  ```shell
  docker-compose exec web python manage.py import_users
  ```
See DEPLOYMENT.
To be completed.
```shell
./manage.py import_users
```
Single file mode:

```shell
./manage.py import_projects -f path/to/json_file.json
```

Batch mode:

```shell
./manage.py import_projects -d path/to/dir/with/json/files/
```
Available commands: `import_projects`, `import_datasets`, `import_partners`.

In case of problems, add the `--verbose` flag to the command and take a look inside `./log/daisy.log`.
Install the frontend dependencies:

```shell
cd web/static/vendor/
npm ci
```

Build the frontend assets:

```shell
cd web/static/vendor
npm run-script build
```

Run the development server:

```shell
./manage.py runserver
```
The following command will install the test dependencies and execute the tests:

```shell
python setup.py pytest
```

Run the tests for a specific file:

```shell
python setup.py pytest --addopts web/tests/test_dataset.py
```

If the test dependencies are already installed, you can also run the tests just by executing:

```shell
pytest
```
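If you are new to pytest, a test module is just a Python file of plain `assert`-based functions that pytest discovers by their `test_` prefix. A minimal, self-contained sketch (the helper below is hypothetical and not part of the DAISY code base):

```python
# test_example.py -- hypothetical standalone example, not from the DAISY test suite


def normalize_name(value: str) -> str:
    """Toy helper under test: trim surrounding whitespace and lower-case."""
    return value.strip().lower()


def test_normalize_name():
    # pytest collects functions prefixed with test_ and runs their asserts
    assert normalize_name("  Demo Admin ") == "demo admin"
```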
To get access to the admin page, you must log in with a superuser account.
In the **Users** section, you can give any user the `staff` status, after which they will be able to access any project or dataset.
Key | Description | Expected values | Example value |
---|---|---|---|
`COMPANY` | A name that is used to generate verbose names of some models | `str` | `'LCSB'` |
`DEMO_MODE` | A flag which makes a simple banner about demo mode appear on the About page | `bool` | `False` |
`INSTANCE_LABEL` | A name used in the navbar header to help differentiate deployments | `str` | `'Staging test VM'` |
`INSTANCE_PRIMARY_COLOR` | The color used as the navbar header's background | `str` (a color) | `'#076505'` |
`LOGIN_USERNAME_PLACEHOLDER` | A helpful placeholder in the login form for logins | `str` | `'@uni.lu'` |
`LOGIN_PASSWORD_PLACEHOLDER` | A helpful placeholder in the login form for passwords | `str` | `'Hint: use your AD password'` |
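These keys go into `elixir_daisy/settings_local.py`. A sketch of such a fragment, using only the illustrative example values from the table above:

```python
# elixir_daisy/settings_local.py fragment -- all values illustrative
COMPANY = 'LCSB'
DEMO_MODE = False
INSTANCE_LABEL = 'Staging test VM'
INSTANCE_PRIMARY_COLOR = '#076505'
LOGIN_USERNAME_PLACEHOLDER = '@uni.lu'
LOGIN_PASSWORD_PLACEHOLDER = 'Hint: use your AD password'
```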
Key | Description | Expected values | Example value |
---|---|---|---|
`IDSERVICE_FUNCTION` | Path to a function (`lambda: str`) that generates IDs for entities which are published | `str` | `'web.views.utils.generate_elu_accession'` |
`IDSERVICE_ENDPOINT` | In case LCSB's idservice function is being used, this setting contains the IDservice's URI | `str` | `'https://192.168.1.101/v1/api/'` |
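Expressed as a settings fragment (values are the illustrative examples from the table above):

```python
# ID-service settings fragment -- illustrative values only
IDSERVICE_FUNCTION = 'web.views.utils.generate_elu_accession'
IDSERVICE_ENDPOINT = 'https://192.168.1.101/v1/api/'
```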
Key | Description | Expected values | Example value |
---|---|---|---|
`REMS_INTEGRATION_ENABLED` | A feature flag for REMS integration. In practice, there is a dedicated endpoint which processes the information from REMS about dataset entitlements | `bool` | `True` |
`REMS_SKIP_IP_CHECK` | If set to `True`, there will be no check on whether the request comes from a trusted REMS instance's IP address | `bool` | `False` |
`REMS_ALLOWED_IP_ADDRESSES` | A list of IP addresses that should be considered trusted REMS instances. Beware of configuration difficulties when using reverse proxies. The check can be skipped with `REMS_SKIP_IP_CHECK` | `list[str]` | `['127.0.0.1', '192.168.1.101']` |
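Put together as a settings fragment (values taken from the table's examples, illustrative only):

```python
# REMS integration settings fragment -- illustrative values only
REMS_INTEGRATION_ENABLED = True
REMS_SKIP_IP_CHECK = False
# Requests from these addresses are treated as trusted REMS instances
REMS_ALLOWED_IP_ADDRESSES = ['127.0.0.1', '192.168.1.101']
```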
Key | Description | Expected values | Example value |
---|---|---|---|
`KEYCLOAK_INTEGRATION` | A feature flag for importing user information from Keycloak (OIDC IDs) | `bool` | `True` |
`KEYCLOAK_URL` | URL to the Keycloak instance | `str` | `'https://keycloak.lcsb.uni.lu/auth/'` |
`KEYCLOAK_REALM_LOGIN` | The realm's login name in your Keycloak instance | `str` | `'master'` |
`KEYCLOAK_REALM_ADMIN` | The realm's admin name in your Keycloak instance | `str` | `'master'` |
`KEYCLOAK_USER` | Username to access Keycloak | `str` | `'username'` |
`KEYCLOAK_PASS` | Password to access Keycloak | `str` | `'secure123'` |
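As a settings fragment (the example values from the table; use your own instance's URL, realm and credentials):

```python
# Keycloak settings fragment -- illustrative values only
KEYCLOAK_INTEGRATION = True
KEYCLOAK_URL = 'https://keycloak.lcsb.uni.lu/auth/'
KEYCLOAK_REALM_LOGIN = 'master'
KEYCLOAK_REALM_ADMIN = 'master'
KEYCLOAK_USER = 'username'
KEYCLOAK_PASS = 'secure123'
```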
Key | Description | Expected values | Example value |
---|---|---|---|
`SERVER_SCHEME` | The URL scheme used to access your DAISY instance (`http` or `https`) | `str` | `'https'` |
`SERVER_URL` | The URL to access your DAISY instance (without the scheme) | `str` | `'example.com'` |
`GLOBAL_API_KEY` | An API key that is not connected to any user. Disabled if set to `None` | `Optional[str]` | `'in-practice-you-dont-want-to-use-it-unless-debugging'` |
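As a settings fragment (illustrative values from the table; the global API key is shown disabled, which is the safe default for production):

```python
# Server settings fragment -- illustrative values only
SERVER_SCHEME = 'https'
SERVER_URL = 'example.com'
# Set to a string only when a user-independent API key is really needed
GLOBAL_API_KEY = None
```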