
Commit

doc: correct mis-spellings
glitt13 committed Oct 24, 2024
1 parent 0fa4d55 commit 0be6ab2
Showing 3 changed files with 4 additions and 4 deletions.
README.md (4 changes: 2 additions & 2 deletions)
@@ -4,7 +4,7 @@
# Regionalization and Formulation Testing and Selection (RaFTS)

**Description**:
- The formulation-selector tool, aka Regionalation and Formulation Testing & Selection (RaFTS), is under development. For more information, see the [Wiki](https://github.com/NOAA-OWP/formulation-selector/wiki).
+ The formulation-selector tool, aka Regionalization and Formulation Testing & Selection (RaFTS), is under development. For more information, see the [Wiki](https://github.com/NOAA-OWP/formulation-selector/wiki).

As NOAA OWP builds the model-agnostic NextGen framework, the hydrologic modeling community will need to know how to optimally select model formulations and estimate parameter values across ungauged catchments. This problem becomes intractable when considering the unique combinations of current and future model formulations combined with the innumerable possible parameter combinations across the continent. To simplify the model selection problem, we apply an analytical tool that predicts hydrologic formulation performance (Bolotin et al., 2022, Liu et al., 2022) using community-generated data. The regionalization and formulation testing and selection (RaFTS) tool readily predicts how models might perform across catchments based on catchment attributes. This decision support tool is designed such that as the hydrologic modeling community generates more results, better decisions can be made on where formulations would be best suited.

@@ -143,7 +143,7 @@ Attributes from non-standardized datasets may need to be acquired for RaFTS mode
Run [`flow.install.proc.attr.hydfab.R`](https://github.com/NOAA-OWP/formulation-selector/blob/main/pkg/proc.attr.hydfab/flow/flow.install.proc.attr.hydfab.R) to install the package. Note that a user may need to modify the section that creates the `fs_dir` for their custom path to this repo's directory.

## Usage - `proc.attr.hydfab`
- The following is an example script that runs the attribute grabber: [`fs_attrs_grab`](https://github.com/NOAA-OWP/formulation-selector/blob/main/pkg/proc.attr.hydfab/flow/fsds_attrs_grab.R).
+ The following is an example script that runs the attribute grabber: [`fs_attrs_grab`](https://github.com/NOAA-OWP/formulation-selector/blob/main/pkg/proc.attr.hydfab/flow/fs_attrs_grab.R).

This script grabs attribute data corresponding to locations of interest, and saves those attribute data inside a directory as multiple parquet files. The `proc.attr.hydfab::retrieve_attr_exst()` function may then efficiently query and then retrieve desired data by variable name and comid from those parquet files.

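For the installation step noted in the README hunk above (`flow.install.proc.attr.hydfab.R`), the following is a minimal, hypothetical sketch of the kind of `fs_dir` adjustment that step describes, assuming a `devtools`-style install; the actual variable name and install call are defined in the flow script itself.

```r
# Hypothetical sketch only: the real install logic lives in
# flow.install.proc.attr.hydfab.R. Point fs_dir at your local clone of the repo.
fs_dir <- "~/git/formulation-selector"  # assumed custom path to this repository

# Install the R package from its location inside the repo (assumes devtools is available)
devtools::install(file.path(fs_dir, "pkg", "proc.attr.hydfab"))
```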
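Similarly, for the retrieval step described above, here is a minimal R usage sketch of `proc.attr.hydfab::retrieve_attr_exst()`; the argument names (`comids`, `vars`, `dir_db_attrs`) and the example values are illustrative assumptions, not the function's confirmed signature.

```r
# Hypothetical usage sketch: argument names and values below are illustrative
# assumptions rather than the confirmed signature of retrieve_attr_exst().
library(proc.attr.hydfab)

# Directory of attribute parquet files previously written by fs_attrs_grab.R
dir_db_attrs <- "~/noaa/regionalization/data/input/attributes"

comids <- c("1520007", "1623207")          # placeholder NHDPlus comids of interest
vars   <- c("TOT_TWI", "TOT_BASIN_AREA")   # placeholder attribute variable names

# Query the existing parquet store by comid and variable name
attr_df <- retrieve_attr_exst(comids = comids, vars = vars, dir_db_attrs = dir_db_attrs)
head(attr_df)
```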
@@ -16,7 +16,7 @@ file_io: # May define {home_dir} for python's '{home_dir}/string_path'.format(ho
- 'save_type': 'netcdf' # Required. Use 'csv' to create a directory structure & save multiple files. May also save as hierarchical files 'netcdf' or 'zarr', or if 'csv' chosen, a directory structure is created
- 'save_loc': 'local' # Required. Use 'local' for saving to a local path via dir_save. Future work will create an approach for 'aws' or other cloud saving methods
- 'dir_base' : '{home_dir}/noaa/regionalization/data/input' # Required. The save location of standardized output
- - 'dir_std_base' : '{dir_base}/user_data_std' # Required. The location of standardized data generated by proc_fs python package
+ - 'dir_std_base' : '{dir_base}/user_data_std' # Required. The location of standardized data generated by fs_proc python package
- 'dir_db_hydfab' : '{dir_base}/hydrofabric' # Required. The local dir where hydrofabric data are stored (limits the total s3 connections)
- 'dir_db_attrs' : '{dir_base}/attributes' # Required. The parent dir where each comid's attribute parquet file is stored in the subdirectory 'comid/', and each dataset's aggregated parquet attributes are stored in the subdirectory '/{dataset_name}
formulation_metadata:
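As an aside on the `file_io` block above: the `{home_dir}` and `{dir_base}` placeholders nest, with `{home_dir}` filled on the Python side via `str.format` (per the config comment). The short R sketch below only illustrates how those placeholders resolve into concrete paths; it is not the packages' actual parsing code.

```r
# Illustrative only: shows how the {home_dir} and {dir_base} placeholders in the
# file_io config resolve. The packages perform this substitution internally.
home_dir     <- path.expand("~")
dir_base     <- sub("{home_dir}", home_dir, "{home_dir}/noaa/regionalization/data/input", fixed = TRUE)
dir_std_base <- sub("{dir_base}", dir_base, "{dir_base}/user_data_std", fixed = TRUE)
dir_db_attrs <- sub("{dir_base}", dir_base, "{dir_base}/attributes", fixed = TRUE)

dir_std_base  # e.g. "/home/<user>/noaa/regionalization/data/input/user_data_std"
```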
@@ -16,7 +16,7 @@ file_io: # May define {home_dir} for python's '{home_dir}/string_path'.format(ho
- 'save_type': 'netcdf' # Required. Use 'csv' to create a directory structure & save multiple files. May also save as hierarchical files 'netcdf' or 'zarr', or if 'csv' chosen, a directory structure is created
- 'save_loc': 'local' # Required. Use 'local' for saving to a local path via dir_save. Future work will create an approach for 'aws' or other cloud saving methods
- 'dir_base' : '{home_dir}/noaa/regionalization/data/input' # Required. The save location of standardized output
- - 'dir_std_base' : '{dir_base}/user_data_std' # Required. The location of standardized data generated by proc_fs python package
+ - 'dir_std_base' : '{dir_base}/user_data_std' # Required. The location of standardized data generated by fs_proc python package
- 'dir_db_hydfab' : '{dir_base}/hydrofabric' # Required. The local dir where hydrofabric data are stored (limits the total s3 connections)
- 'dir_db_attrs' : '{dir_base}/attributes' # Required. The parent dir where each comid's attribute parquet file is stored in the subdirectory 'comid/', and each dataset's aggregated parquet attributes are stored in the subdirectory '/{dataset_name}
formulation_metadata:
