
Dataset/Add-Documentation (#92)
* reformat files

* test notebooks

* update hooks

* update update dataset notebook

* add nbval to test notebooks

* remove conda configs from pypi workflow

* test notebooks as part of conda workflow

* activate shell

* update notebooks

* update workflow

* add python to run the test

* activate environment

* remove the setup python to use only the conda env

* remove the separate setup python step and install the package itself

* ignore one of the cells

* ignore one of the cells

* add docstring to `set_attribute_table`, `get_attribute_table`, and `create` methods

* add example to add_band

* test add_band inplace while in read_only access

* correct indentation errors in dataset.rst file

* test trying to add id array as a band

* test trying to add an array with different dimensions as a new band

* fix error in correct_wrap_cutline_error with multi-band datasets

* remove the _crop_with_polygon_by_rasterizing method

* remove the type from the abstract_Dataset.create_from_array

* if the mask dataset has multiple bands, use the first band as a mask

* rename tests

* add documentations to the crop method

* add hint on the multi-band mask dataset

* remove the rows and columns if they are all full of no_data_value when cropping with a dataset

* test cropping dataset using a multi-band dataset

* add docstring to the stats method

* add docstring to `create_from_array` and add `cell_size` and `top_left_corner` parameters

* test the geo/top_left_corner cases of the create_from_array method

* update docstrings

* add examples to the `read_array` method

* rename `pivot_point` to `top_left_corner`

* clean un-needed commented code

* add examples to get_block_arrangement

* fix returned dtype from the dtypes

* remove the id as it differs in linux

* add docstring for `plot`

* add example to `dataset_like`

* the `dataset_like` method returns the `dataset` object even if the path is given

* test `dataset_like` for multi-band

* correct typos and docstring

* update the new name for the `pivot_point` to `top_left_corner`

* flush data to disk in the `_create_gtiff_from_array` if path is given

* rename the `pivot_point` parameter in the `write_array` method to `top_left_corner`

* use the coello geotiff file in set_crs test

* test the `epsg` property

* add docstring

* fix docstring for `write_array`

* add examples and modify docstring

* the `to_file` method initializes the dataset to refer to the saved dataset on disk

* use neighbor instead of neighbour

* use neighbor instead of neighbour

* add examples for `to_crs`

* add examples to footprint

* fix docstring for the `block_size`

* fix typo in comments

* fix docstring for the `apply`

* fix docstring for the `apply`

* add example for the `align` method

* flush data at the end of the `to_file` method

* the `_window` function iterates over rows instead of columns

* add examples for the `to_feature_collection` method

* reformat

* test `_nearest_neighbour`

* rerun notebook

* reformat

* add pictures for docs

* add pictures for docs

* add doctest result files to gitignore

* add examples for the `get_tiles` method

* remove `pyramids` from the docs/environment.yml to correctly install the package from the repo

* add geopandas to doc/environment.yml

* install pyramids in the .readthedocs.yml file

* add gdal to docs dependencies

* install numpy as part of the docs/environment.yml

* install the package itself using conda

* install the package itself using conda

* update dependencies

* update dependencies

* add test for `cluster` method

* add example for `cluster`

* add image to the `get_tile` example

* correct the example of the `footprint` method

* reformat

* add example to `cluster2`

* correct versions

* fix missing indentation issue in docstring

* add examples to all methods related to the overviews

* update notebooks

* add examples for the `get_histogram` method

* clean file

* fix error in the `color_table` property

* mark the `color_table` test as plot test

* add example to `color_table`

* add hook to check doctest

* add examples to the `band_color`, and `get_band_by_color`

* all the create methods assign the access property properly.

* add xml code block in all docstring

* add examples to `add_band` and `create` methods

* reformat

* change the path to the image in the docstring

* change the path to the image

* change all images paths

* fix errors in the docstring

* crop images and fix docstring

* fix docstring

* fix docstring

* fix docstring

* fix docstring

* add examples to `map_to_array_coordinates`

* convert the `array_to_map_coordinates` to normal method

* add examples to the `bbox` and the `bounds` properties

* rename the top_left_coords to top_left_corner

* add inplace parameter to the method `change_no_data_value`

* test inplace=False

* put a warning in the `change_no_data_value` method

* correct the warning statement

* re-indent the image in the `cluster` method

* add graphviz as a system package in the readthedocs file

* add graphviz as a system package in the readthedocs file

* update readthedoc config file

* correct the graphviz setting in the readthedoc file

* correct docstring issues

* remove the sphinxcontrib-graphviz from doc dependencies

* add graphviz as a normal package

* add type hints

* correct docstring issues

* fix docstring issues

* correct the warning directive

* change the warning clause.

* add examples for the `color_table`

* add `close` method and add examples to the `copy` dataset

* fix docstring issues

* correct docstring in `read_array`

* add `band` parameter to the `extract` method

* add `band` parameter to the `extract` method

* test `close` method

* fix docstring

* add examples for the `overlay` method

* remove `driver` from the fill method

* add images to the `resample` method

* fix docstring

* add inplace to the `fill` method

* add examples to `get_cell_points`, `get_cell_coords`, and `get_cell_polygons`

* add examples to `lat`, `lon`, `x`, and `y`

* add images to the docstring

* crop images

* add see also section in the docstring

* fix the xml code block

* fix indentation

* add south america notebook

* add missing south-america-mswep_1979010100.tif.aux.xml file

* update history file

* fix docstring

* fix docstring

* update history file

* update version to 0.7.0

* update history file
MAfarrag authored Jul 12, 2024
1 parent 5f9641a commit 1dd2e24
Showing 64 changed files with 9,600 additions and 2,760 deletions.
23 changes: 10 additions & 13 deletions .github/workflows/conda-deployment.yml
@@ -8,7 +8,7 @@ jobs:
strategy:
matrix:
os: [ubuntu-latest]
python-version: ["3.11"]
python-version: ["3.12"]
env:
OS: ${{ matrix.os }}

@@ -26,11 +26,6 @@ jobs:
channels: conda-forge,defaults
channel-priority: true
show-channel-urls: true
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
architecture: x64

- name: Install dev-dependencies
run: |
@@ -50,7 +45,7 @@ jobs:
strategy:
matrix:
os: [ubuntu-latest]
python-version: ["3.11"]
python-version: ["3.12"]
env:
OS: ${{ matrix.os }}

@@ -69,21 +64,23 @@ jobs:
channel-priority: true
show-channel-urls: true

- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
architecture: x64

- name: Install dev-dependencies
run: |
python -m pip install -r requirements-dev.txt
- name: Generate coverage report
shell: bash -el {0}
run: |
conda activate test
conda info
conda list
conda config --show-sources
conda config --show
pytest -sv
- name: test Jupyter notebook
shell: bash -el {0}
run: |
conda activate test
pip install -e .
pytest --nbval -k ".ipynb"
20 changes: 6 additions & 14 deletions .github/workflows/pypi-deployment.yml
@@ -3,12 +3,12 @@ name: pypi-deployment
on: [push]

jobs:
Main-Package:
main-package:
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [ubuntu-latest]
python-version: ["3.9", "3.10", "3.11"]
python-version: ["3.11", "3.12"]
env:
OS: ${{ matrix.os }}

@@ -24,7 +24,7 @@ jobs:
run: |
python -m pip install --upgrade pip
pip install --no-cache-dir Cython
pip install --find-links=https://girder.github.io/large_image_wheels --no-cache GDAL==3.8.4
pip install --find-links=https://girder.github.io/large_image_wheels --no-cache GDAL==3.9.0
- name: Test GDAL installation
run: |
@@ -38,21 +38,17 @@ jobs:
- name: Generate coverage report
run: |
conda info
conda list
conda config --show-sources
conda config --show
python -m pytest -sv -m "not plot"
- name: Upload coverage reports to Codecov with GitHub Action
uses: codecov/codecov-action@v3

Optional-packages:
optional-packages:
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [ ubuntu-latest]
python-version: ["3.9", "3.10", "3.11"]
python-version: ["3.11", "3.12"]
env:
OS: ${{ matrix.os }}

@@ -68,7 +64,7 @@ jobs:
run: |
python -m pip install --upgrade pip
pip install --no-cache-dir Cython
pip install --find-links=https://girder.github.io/large_image_wheels --no-cache GDAL==3.8.4
pip install --find-links=https://girder.github.io/large_image_wheels --no-cache GDAL==3.9.0
- name: Test GDAL installation
run: |
@@ -82,10 +78,6 @@ jobs:
- name: Generate coverage report
run: |
conda info
conda list
conda config --show-sources
conda config --show
python -m pytest -v --cov=pyramids --cov-report=xml
- name: Upload coverage reports to Codecov with GitHub Action
3 changes: 2 additions & 1 deletion .github/workflows/pypi-release.yml
@@ -4,11 +4,12 @@
name: pypi-release

on:
workflow_dispatch:
release:
types: [created]

jobs:
deploy:
publish:
runs-on: ubuntu-latest

steps:
2 changes: 2 additions & 0 deletions .gitignore
@@ -168,3 +168,5 @@ tests/data/not-used/*
/tests/data/delete/delta1.gpkg
/tests/data/delete/era5-mask.qmd
/tests/data/delete/mask.gpkg
/write_array.tif
/my-dataset.tif
18 changes: 17 additions & 1 deletion .pre-commit-config.yaml
@@ -58,7 +58,7 @@ repos:
files: ^pyramids/

- repo: https://github.com/pycqa/flake8
rev: 7.0.0
rev: 7.1.0
hooks:
- id: flake8
name: "[py - check] flake8"
@@ -112,3 +112,19 @@ repos:
language: system
pass_filenames: false
always_run: true

- repo: local
hooks:
- id: examples-notebook-check
name: nbval
entry: pytest --nbval
language: system
files: \.ipynb$

- repo: local
hooks:
- id: doctest
name: doctest
entry: pytest --doctest-modules #pyramids/dataset.py
language: system
files: \.py$
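
The two local hooks added above wire the notebook and doctest suites into pre-commit: `nbval` re-executes the example notebooks, and `doctest` runs `pytest --doctest-modules`, so any `Examples` section written in doctest format is executed on every commit. A minimal sketch of the docstring style this enforces (the function below is purely illustrative, not part of pyramids):

```python
def cell_area(cell_size: float) -> float:
    """Return the area of a square raster cell.

    Examples
    --------
    >>> cell_area(2.0)
    4.0
    """
    return cell_size * cell_size
```

Locally, the same checks can be run with `pre-commit run doctest --all-files` or `pre-commit run examples-notebook-check --all-files`.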
2 changes: 2 additions & 0 deletions .readthedocs.yml
@@ -12,6 +12,8 @@ build:
os: "ubuntu-22.04"
tools:
python: "mambaforge-22.9"
apt_packages:
- graphviz

conda:
environment: docs/environment.yml
15 changes: 15 additions & 0 deletions HISTORY.rst
@@ -206,6 +206,11 @@ dataset.
[xoff, yoff, x-window, y-window], the `window` can also be a geodataframe.
* add `get_block_arrangement` method divide the raster into tiles based on the block size.
* add tiff file writing options (compression/tile/tile_length)
* add `close` method to flush to disk and close a dataset.
* add `add_band` method to add an array as a band to an existing dataset.
* rename `pivot_point` to `top_left_corner` in the `create` method.
* the `to_file` method returns a `Dataset` object pointing to the saved dataset, so there is no need to re-read the
  saved dataset after you save it.

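A minimal sketch of how the `add_band`, `to_file`, and `close` entries above fit together; the import path, the `read_file` constructor, and the `rows`/`columns` properties are assumed here for illustration, and the exact signatures may differ:

```python
import numpy as np
from pyramids.dataset import Dataset  # import path assumed

# open an existing single-band raster (constructor name assumed, path illustrative)
dataset = Dataset.read_file("coello.tif")

# append an array with the same shape as the raster as a new band
extra_band = np.zeros((dataset.rows, dataset.columns))
dataset = dataset.add_band(extra_band)

# to_file now returns a Dataset pointing at the saved file,
# so it can be used directly without re-reading it from disk
saved = dataset.to_file("my-dataset.tif")

# flush to disk and release the underlying GDAL handle
saved.close()
```
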
Datacube
""""""""
@@ -214,3 +219,13 @@ Datacube
NetCDF
"""""""
* move all the netcdf related functions to a separate module `netcdf`.

FeatureCollection
"""""""""""""""""
* rename the `pivot_point` to `top_left_corner`

Deprecated
""""""""""
* Cropping a raster using a polygon is now done directly using `gdal.Warp`, and the `_crop_with_polygon_by_rasterizing`
  method is deprecated.
* rename the interpolation method `nearest neighbour` to `nearest neighbor`.
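
For context on the first deprecation above, cropping with a polygon can be done in a single `gdal.Warp` call with a cutline; the sketch below uses illustrative paths and options, not necessarily the exact arguments pyramids passes internally:

```python
from osgeo import gdal

# crop a raster to a polygon cutline in one call instead of
# rasterizing the polygon first (all paths are illustrative)
cropped = gdal.Warp(
    "cropped.tif",
    "input.tif",
    cutlineDSName="mask.gpkg",  # vector file containing the cutline polygon
    cropToCutline=True,         # shrink the output extent to the polygon
    dstNodata=-9999,
)
cropped.FlushCache()
cropped = None  # close the output dataset
```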
4 changes: 2 additions & 2 deletions README.md
@@ -46,7 +46,7 @@ Installing pyramids
Installing `pyramids` from the `conda-forge` channel can be achieved by:

```
conda install -c conda-forge pyramids=0.6.0
conda install -c conda-forge pyramids=0.7.0
```

It is possible to list all the versions of `pyramids` available on your platform with:
@@ -68,7 +68,7 @@ pip install git+https://github.com/Serapieum-of-alex/pyramids
to install the last release, you can easily use pip

```
pip install pyramids-gis==0.6.0
pip install pyramids-gis==0.7.0
```

Quick start
6 changes: 5 additions & 1 deletion docs/environment.yml
@@ -1,10 +1,14 @@
name: docs
channels:
- conda-forge
dependencies:
- pyramids
- numpy >=2.0.0
- numpydoc==1.1.0
- typing-extensions==3.10.*
- gdal==3.9.0
- pip:
- pydata_sphinx_theme>=0.15.2
- sphinxcontrib-napoleon
- nbsphinx
- graphviz
- ../.
Binary file added docs/source/_images/dataset/align-result.png
Binary file modified docs/source/_images/dataset/bounds.png
Binary file added docs/source/_images/dataset/cluster.png
Binary file added docs/source/_images/dataset/get_cell_points.png
Binary file added docs/source/_images/dataset/get_cell_polygons.png
Binary file added docs/source/_images/dataset/get_tile.png
Binary file added docs/source/_images/dataset/overviews-level-0.png
Binary file added docs/source/_images/dataset/overviews-level-1.png
Binary file added docs/source/_images/dataset/overviews-level-2.png
Binary file added docs/source/_images/dataset/resample-new.png
Binary file added docs/source/_images/dataset/resample-source.png
Binary file added docs/source/_images/dataset/rhine-classes.png
Binary file added docs/source/_images/dataset/rhine-rainfall.png
Binary file modified docs/source/_images/dataset/rhine_dem.png
28 changes: 18 additions & 10 deletions docs/source/conf.py
@@ -46,21 +46,26 @@
# Set the theme name
# Optionally, you can customize the theme's configuration
html_theme_options = {
"logo_link": "index",
"icon_links": [
{
"name": "GitHub",
"url": "https://github.com/Serapieum-of-alex/pyramids",
"icon": "fab fa-github-square",
},
],
"navbar_end": ["search-field.html", "navbar-icon-links"],
"search_bar_text": "Search this site...",
"navbar_align": "content",
"navigation_depth": 4,
"show_prev_next": False,
"show_toc_level": 2,
# Add any other theme options here
"canonical_url": "",
"logo_only": False,
"display_version": True,
"prev_next_buttons_location": "bottom",
"style_external_links": False,
"style_nav_header_background": "#2980B9",
# Toc options
"collapse_navigation": True,
"sticky_navigation": True,
"includehidden": True,
"titles_only": False,
# "external_links": [
# {"name": "External Link", "url": "https://example.com"},
# ],
"header_links_before_dropdown": 4,
}

html_static_path = ["_static"]
Expand All @@ -77,3 +82,6 @@
# -- Options for autodoc -----------------------------------------------------
napoleon_google_docstring = True
napoleon_numpy_docstring = True

# Ensure that the path to the Graphviz `dot` command is correct
graphviz_dot = "dot"
6 changes: 3 additions & 3 deletions docs/source/dataset.rst
@@ -540,9 +540,9 @@ Blocksize and ReadAsArray
When you know the block size, you can more effectively plan and execute data processing tasks:
- Data Reading/Writing: When reading or writing data, doing so in multiples of the block size can reduce the number of
disk accesses required, as each access operation will align with the blocks on disk.
disk accesses required, as each access operation will align with the blocks on disk.
- Optimizations: Some formats are optimized for specific block sizes, or for being accessed in certain ways. For
example, tiled TIFFs might perform better with square block sizes.
example, tiled TIFFs might perform better with square block sizes.
.. code:: py
Expand Down Expand Up @@ -889,7 +889,7 @@ Crop raster using array
dst_cropped = Raster.cropAlligned(dst, arr, mask_noval=nodataval)
Map.plot(dst_cropped, title="Cropped array", color_scale=1, ticks_spacing=0.01)
.. image:: /_images/crop_raster_using_array.png
.. image:: /_images/dataset/crop_raster_using_array.png
:width: 500pt
crop
16 changes: 8 additions & 8 deletions environment-optional-packages.yml
@@ -1,17 +1,17 @@
channels:
- conda-forge
dependencies:
- python >=3.11
- numpy >=1.25.2
- hpc >=0.1.3
- pip >=23.3.1
- gdal ==3.8.4
- numpy >=2.0.0
- hpc >=0.1.4
- pip >=24.0.0
- gdal ==3.9.0
- pandas >=2.1.0
- geopandas >=0.14.1
- Shapely >=2.0.2
- pyproj >=3.6.1
- cleopatra >=0.4.0
- cleopatra >=0.4.2
- PyYAML >=6.0.1
- loguru >=0.7.2
- pytest >=7.4.3
- pytest-cov >=4.1.0
- pytest >=8.2.2
- pytest-cov >=5.0.0
- nbval >=0.11.0
14 changes: 7 additions & 7 deletions environment.yml
@@ -1,16 +1,16 @@
channels:
- conda-forge
dependencies:
- python >=3.11
- numpy >=1.25.2
- hpc >=0.1.3
- pip >=23.3.1
- gdal ==3.8.4
- numpy >=2.0.0
- hpc >=0.1.4
- pip >=24.0.0
- gdal ==3.9.0
- pandas >=2.1.0
- geopandas >=0.14.1
- Shapely >=2.0.2
- pyproj >=3.6.1
- PyYAML >=6.0.1
- loguru >=0.7.2
- pytest >=7.4.3
- pytest-cov >=4.1.0
- pytest >=8.2.2
- pytest-cov >=5.0.0
- nbval >=0.11.0