
Commit

Merge branch 'master' into feature/non-osm
kshitijrajsharma authored Oct 9, 2023
2 parents 6a8e83c + 9e44525 commit fb9545d
Showing 8 changed files with 82 additions and 15 deletions.
3 changes: 2 additions & 1 deletion .gitignore
@@ -48,4 +48,5 @@ backend/training/*
trainings/*
backend/.env
backend/config.txt
-backend/postgres-data
+backend/postgres-data

2 changes: 1 addition & 1 deletion backend/Dockerfile
@@ -35,4 +35,4 @@ RUN pip install /tmp/solaris --use-feature=in-tree-build && \

# Set working directory and copy the application code
WORKDIR /app
-COPY . /app
+COPY . /app
3 changes: 2 additions & 1 deletion backend/Dockerfile_CPU
@@ -36,4 +36,5 @@ RUN pip install /tmp/solaris --use-feature=in-tree-build && \

# Set working directory and copy the application code
WORKDIR /app
-COPY . /app
+COPY . /app

4 changes: 3 additions & 1 deletion backend/core/tasks.py
@@ -54,7 +54,9 @@ def train_model(
 try:
     ## -----------IMAGE DOWNLOADER---------
     os.makedirs(settings.LOG_PATH, exist_ok=True)

+    if training_instance.task_id is None or training_instance.task_id.strip() == '':
+        training_instance.task_id = train_model.request.id
+        training_instance.save()
     log_file = os.path.join(
         settings.LOG_PATH, f"run_{train_model.request.id}_log.txt"
     )
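The guard added above only records the Celery task id when none is stored yet, so a re-run does not clobber the original id. A minimal sketch of that pattern, using a hypothetical `TrainingStub` in place of the Django model (the names here are illustrative, not from the repo):

```python
class TrainingStub:
    """Hypothetical stand-in for the Training model; only the fields used here."""

    def __init__(self, task_id=None):
        self.task_id = task_id
        self.saved = False

    def save(self):
        self.saved = True


def ensure_task_id(training_instance, current_task_id):
    """Record the current task id only if none is set yet, mirroring the guard above."""
    if training_instance.task_id is None or training_instance.task_id.strip() == "":
        training_instance.task_id = current_task_id
        training_instance.save()
    return training_instance.task_id
```

An instance that already carries a task id keeps it and is not re-saved.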
4 changes: 1 addition & 3 deletions backend/core/utils.py
@@ -173,9 +173,7 @@ def download_imagery(start: list, end: list, zm_level, base_path, source="maxar"
         y_value = download_path[1]
         source_value = source
         download_url = source_value.format(
-            x=download_path[0], y=y_value, z=zm_level
-        )
-
+            x=download_path[0], y=y_value, z=zm_level)
         download_urls.append(download_url)

start_y = start_y - 1 # decrease the y
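The hunk above fills a source URL template with tile coordinates via `str.format`. A self-contained sketch of the same templating idea over a range of tiles, with a made-up tile server URL (real sources such as Maxar use their own schemes and credentials):

```python
def build_tile_urls(template, x_range, y_range, zoom):
    """Fill a TMS-style URL template ({x}/{y}/{z} placeholders) for each tile in a range."""
    urls = []
    for x in range(*x_range):
        for y in range(*y_range):
            urls.append(template.format(x=x, y=y, z=zoom))
    return urls


# Hypothetical endpoint, for illustration only.
sample = build_tile_urls("https://tiles.example.com/{z}/{x}/{y}.png", (10, 12), (5, 6), 18)
```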
2 changes: 1 addition & 1 deletion backend/requirements.txt
@@ -17,7 +17,7 @@ django_celery_results==2.4.0
flower==1.2.0
validators==0.20.0
gpxpy==1.5.0
-hot-fair-utilities
+hot-fair-utilities==1.2.2
geojson2osm==0.0.1
osmconflator
orthogonalizer
73 changes: 66 additions & 7 deletions docs/Docker-installation.md
@@ -17,11 +17,11 @@ Docker Compose is created with redis , worker , postgis database , api and fron
3. Check your Graphics

-fAIr works best with graphics card. It is highly recommended to use graphics card . It might not work with CPU only . Nvidia Graphics cards are tested
+fAIr works best with a graphics card and it is highly recommended to use one; it might not work with CPU only (you can set up and test the CPU version from the bottom of this document). Nvidia graphics cards are tested

You need to make sure you can see your graphics card details and that it can be accessed through Docker by installing the necessary drivers

-By following command you can see your graphics and graphics driver details
+With the following command you can see your graphics card and driver details and verify that the NVIDIA Container Toolkit is installed. More details [here](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html)
```
nvidia-smi
```
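If you want to script this pre-flight check, a rough sketch is below; note it only confirms the `nvidia-smi` CLI is on PATH, not that Docker containers can actually reach the GPU (the toolkit guide linked above covers that part):

```python
import shutil


def nvidia_driver_on_path():
    """Rough pre-check: is the nvidia-smi CLI visible on this machine?"""
    return shutil.which("nvidia-smi") is not None
```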
@@ -34,7 +34,7 @@ Docker Compose is created with redis , worker , postgis database , api and fron
mkdir ramp
```
- Download BaseModel Checkpoint from [here](https://drive.google.com/file/d/1wvJhkiOrSlHmmvJ0avkAdu9sslFf5_I0/view?usp=sharing)
OR you can use the base model from [Model Ramp Baseline](https://github.com/radiantearth/model_ramp_baseline/tree/main/data/input/checkpoint.tf)
```
pip install gdown
gdown --fuzzy https://drive.google.com/file/d/1wvJhkiOrSlHmmvJ0avkAdu9sslFf5_I0/view?usp=sharing
@@ -75,6 +75,7 @@ Docker Compose is created with redis , worker , postgis database , api and fron
6. Create Env variables

- Create a file ```.env``` in the backend directory with the content of [docker_sample_env](../backend/docker_sample_env)
```
cd backend
cp docker_sample_env .env
```
- Fill out the details of ```OSM_CLIENT_ID``` & ```OSM_CLIENT_SECRET``` in the .env file, generate a unique key and paste it to ```OSM_SECRET_KEY``` (it can be random for the dev setup)
@@ -83,6 +84,7 @@ Docker Compose is created with redis , worker , postgis database , api and fron
- Create ```.env``` in /frontend
```
cd frontend
cp .env_sample .env
```
You can leave it as it is for the dev setup
@@ -99,13 +101,17 @@ Docker Compose is created with redis , worker , postgis database , api and fron
8. Run Migrations

Check the running containers and launch Bash inside the API container to run the migrations (this is needed the first time, to set up the database)

Run the bash script directly:

```
bash run_migrations.sh
```

OR list the containers and open Bash in the API container yourself:

```
docker container ps
docker exec -it api bash
```

Once the Bash prompt appears, run the following commands
@@ -126,6 +132,59 @@ Docker Compose is created with redis , worker , postgis database , api and fron
Frontend will be available on port 5000, backend on port 8000, and Flower on port 5500
10. Want to run your local tiles?

You can use [titiler](https://github.com/developmentseed/titiler), [gdal2tiles](https://gdal.org/programs/gdal2tiles.html) or nginx to run your own TMS server, and add the following to the docker compose file so the containers can reach your localhost. Add it to both the API and worker services, and make sure you update the .env variables accordingly
```
network_mode: "host"
```
Example docker compose :
```
backend-api:
build:
context: ./backend
dockerfile: Dockerfile_CPU
container_name: api
command: python manage.py runserver 0.0.0.0:8000
ports:
- 8000:8000
volumes:
- ./backend:/app
- ${RAMP_HOME}:/RAMP_HOME
- ${TRAINING_WORKSPACE}:/TRAINING_WORKSPACE
depends_on:
- redis
- postgres
network_mode: "host"
backend-worker:
build:
context: ./backend
dockerfile: Dockerfile_CPU
container_name: worker
command: celery -A aiproject worker --loglevel=INFO --concurrency=1
volumes:
- ./backend:/app
- ${RAMP_HOME}:/RAMP_HOME
- ${TRAINING_WORKSPACE}:/TRAINING_WORKSPACE
depends_on:
- backend-api
- redis
- postgres
network_mode: "host"
```
Example .env after host change :
```
DATABASE_URL=postgis://postgres:admin@localhost:5434/ai
CELERY_BROKER_URL="redis://localhost:6379/0"
CELERY_RESULT_BACKEND="redis://localhost:6379/0"
```
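One quick way to sanity-check the host-networking change is to parse the `DATABASE_URL` from the example .env above and confirm it now points at localhost on the expected port; a small sketch using only the standard library:

```python
from urllib.parse import urlsplit


def db_endpoint(database_url):
    """Extract (host, port) from a postgis:// style URL for a quick sanity check."""
    parts = urlsplit(database_url)
    return parts.hostname, parts.port


host, port = db_endpoint("postgis://postgres:admin@localhost:5434/ai")
```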
### fAIr setup for CPU:

This is still in testing. Currently, the CPU version can be swapped in by
Expand Down
6 changes: 6 additions & 0 deletions run_migrations.sh
@@ -0,0 +1,6 @@
#!/bin/bash

docker exec -it api bash -c "python manage.py makemigrations"
docker exec -it api bash -c "python manage.py makemigrations login"
docker exec -it api bash -c "python manage.py makemigrations core"
docker exec -it api bash -c "python manage.py migrate"
