Merge pull request #15 from abhimhamane/106-prereview

updated contribution, load_data with new approach and changed the flo…

abhimhamane authored Apr 26, 2024
2 parents ace2c69 + 939ab48 commit bbb0423

Showing 5 changed files with 93 additions and 79 deletions.

128 changes: 61 additions & 67 deletions docs/contributing.md

@@ -1,108 +1,102 @@
# Contributing

Contributions are welcome, and they are greatly appreciated! Every
little bit helps, and credit will always be given.

You can contribute in many ways.

## Types of Contributions

### Report Bugs

Report bugs at <https://github.com/mn5hk/pyshbundle/issues>.

If you are reporting a bug, please include:

- Your operating system name and version.
- Any details about your local setup that might be helpful in troubleshooting.
- Detailed steps to reproduce the bug.

### Found Bugs?

Look through the GitHub issues for bugs. Anything tagged with `bug` and
`help wanted` is open to whoever wants to implement it.

### Want New Functionality?

Look through the GitHub issues for features. Anything tagged with
`enhancement` and `help wanted` is open to whoever wants to implement it.

### Let's Improve Documentation

pyshbundle could always use more documentation,
whether as part of the official pyshbundle docs,
in docstrings, or even on the web in blog posts, articles, and such.

Another preferred way to contribute is to create explanatory tutorials that
use the `PySHBundle` functions to explain `Spherical Harmonics` and
`GRACE Data Processing`.

### Feedback

The best way to send feedback is to file an issue at
<https://github.com/mn5hk/pyshbundle/issues>.

If you are proposing a feature:

- Explain in detail how it would work.
- Keep the scope as narrow as possible, to make it easier to implement.
- Remember that this is a volunteer-driven project, and that contributions are welcome :)

## Get Started!

Ready to contribute? Here's how to set up pyshbundle for local development.

1. Fork the pyshbundle repo on GitHub.

2. Set up a separate development environment.
```shell
# clone the repo and fetch the dev branch
$ git clone [email protected]:mn5hk/pyshbundle.git

# create a new virtual environment
$ python3 -m venv <name-env>

# install the dependencies from the requirements-dev file
$ pip install -r ../pyshbundle/requirements-dev.txt

# activate the virtual environment
$ source </location-of-virt-env/name-env/bin/activate>
```

3. Build the latest repo in the development virtual environment.
```shell
# install the built package into the virtual environment
$ pip install ../pyshbundle/dist/<required-version>.tar.gz

# you can also build the source distribution yourself with
$ python setup.py sdist
```

4. Create a branch for local development:

```shell
$ git checkout -b name-of-your-bugfix-or-feature
```

Now you can make your changes locally.

5. Commit your changes and push your branch to GitHub:

```shell
$ git add .
$ git commit -m "Your detailed description of your changes."
$ git push origin name-of-your-bugfix-or-feature
```

6. Submit a pull request through the GitHub website.

## Pull Request Guidelines

Before you submit a pull request, check that it meets these guidelines:

TBD...
Binary file added docs/img/01_flowchart_20240327.jpg
Binary file removed docs/img/01_flowchart_without_background.png
36 changes: 28 additions & 8 deletions docs/load_data.md
@@ -6,29 +6,49 @@ There are 3 major research centers which disseminate GRACE data. These are:

+ [University of Texas at Austin, Center for Space Research (CSR)](https://www.csr.utexas.edu/)
+ [Jet Propulsion Laboratory (JPL)](https://grace.jpl.nasa.gov/)
+ [GeoForschungsZentrum Potsdam (GFZ)](https://www.gfz-potsdam.de/en/)

## GRACE Data Levels

The `GRACE` data products are developed, processed, and archived in a shared Science Data System between the `Jet Propulsion Laboratory (JPL)`, the `University of Texas Center for Space Research (UT-CSR)`, and `GeoForschungsZentrum Potsdam (GFZ)`.

+ Level 0:<br>
The level-0 data are the result of data reception, collection, and decommutation by the Raw Data Center (RDC) of the Mission Operation System (MOS) located in Neustrelitz, Germany. Twice per day, the MOS receives the science-instrument and housekeeping data from each GRACE satellite via its Weilheim and Neustrelitz tracking antennae and stores them in two files in the level-0 rolling archive at DFD/Neustrelitz. The SDS retrieves these files, then extracts and reformats the corresponding instrument and ancillary housekeeping data, such as GPS navigation solutions, space-segment temperatures, or thruster firing events. Level-0 products are available 24 hours after data reception.

+ Level 1:<br>
The level-1 data are the preprocessed, time-tagged and normal-pointed instrument data. These are the K-band ranging, accelerometer, star camera, and GPS data of both satellites. Additionally, the preliminary orbits of both GRACE satellites are generated. The level-1 data processing software was developed by JPL with support from GFZ (e.g. accelerometer data preprocessing). Processing of level-1 products is done primarily at JPL. An identical processing system (hardware/software) is installed at GFZ to serve as a backup in case of hardware or network problems. This double implementation is necessary to guarantee the envisaged level-1 product delay of 5 days. All level-1 products are archived at JPL's Physical Oceanography Distributed Active Archive Center (PODAAC) and at GFZ's Integrated System Data Center (ISDC). Both archives are harmonized on a sub-daily timeframe.

+ Level 2: `Spherical Harmonic Coefficients` for the geopotential<br>
Level-2 data include the short-term (30 days) and mean gravity fields derived from calibrated and validated GRACE level-1 data products. This level also includes ancillary data sets (temperature and pressure fields, ocean bottom pressure, and hydrological data) which are necessary to eliminate time variabilities in gravity field solutions. Additionally, the precise orbits of both GRACE satellites are generated. All level-2 products are archived at JPL's PODAAC and at GFZ's ISDC and are available 60 days after data taking. The level-2 processing software was developed independently by all three processing centres using already existing but completely independent software packages which were upgraded for GRACE-specific tasks. Common data file interfaces guarantee a strong product validation. Routine processing is done at UT-CSR and GFZ, while JPL only generates level-2 products at times for verification purposes. The coefficients define the geopotential through the spherical-harmonic expansion shown after this list.

+ Level 3: `Mascons`<br>
Consists of mass anomalies and other standardized products such as monthly ocean/land water-equivalent thickness and surface-mass anomaly. Mass-concentration blocks, or `mascons`, are also available.

+ Level 4: `Time Series`<br>
Time series of catchment-level hydrological estimates of Total Water Storage Anomaly (TWSA).

`PySHBundle` provides the capability to obtain gridded TWSA from Level 2 data.
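
For reference, the level-2 Stokes coefficients relate to the geopotential through the standard fully normalized spherical-harmonic expansion (a textbook relation, independent of how PySHBundle implements it):

$$
V(r, \theta, \lambda) = \frac{GM}{r} \sum_{l=0}^{l_{max}} \left(\frac{a}{r}\right)^{l} \sum_{m=0}^{l} \bar{P}_{lm}(\cos\theta) \left[ \bar{C}_{lm} \cos(m\lambda) + \bar{S}_{lm} \sin(m\lambda) \right]
$$

where $GM$ is the geocentric gravitational constant, $a$ the reference radius of the Earth, $\theta$ the co-latitude, $\lambda$ the longitude, and $\bar{P}_{lm}$ the fully normalized associated Legendre functions of degree $l$ and order $m$.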

## Pre-Processing of Data

### Older Approach
::: pyshbundle.read_GRACE_SH_paths
::: pyshbundle.reader_replacer_jpl.reader_replacer_jpl
::: pyshbundle.reader_replacer_csr.reader_replacer_csr
::: pyshbundle.reader_replacer_itsg.reader_replacer_itsg
::: pyshbundle.reader_replacer.reader

<br>

### Newer Approach
::: pyshbundle.io.read_jpl
::: pyshbundle.io.read_csr
::: pyshbundle.io.read_tn13
::: pyshbundle.io.read_tn14
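
Below is a minimal, hedged usage sketch of the newer readers. The file names are made up and the exact signatures and return types are assumptions; the docstrings rendered above are authoritative.

```python
# Hypothetical sketch of the newer I/O workflow; file names are invented and
# the signatures/return values are assumptions, not the documented API.
from pyshbundle import io

# read monthly level-2 spherical harmonic coefficients from a JPL product file
jpl_data = io.read_jpl("GSM-2_2010001-2010031_GRAC_JPLEM_BA01_0600.gz")

# read replacement coefficients from the technical notes
deg1 = io.read_tn13("TN-13_GEOC_JPLEM_0600.txt")      # degree-1 (geocenter) terms
c20_c30 = io.read_tn14("TN-14_C30_C20_SLR_GSFC.txt")  # C20/C30 from SLR
```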


## Computing Long-Term Mean
::: pyshbundle.load_longterm_mean
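
As a hedged sketch (the argument-free call below is an assumption, not the documented signature), the long-term mean is typically computed once and subtracted from each monthly field to obtain anomalies:

```python
# Hypothetical usage; check the docstring rendered above for the real signature.
from pyshbundle import load_longterm_mean

sh_mean = load_longterm_mean()  # long-term mean SH coefficients
# anomalies are then formed by subtracting the mean from each monthly field:
# anomaly = monthly_coeffs - sh_mean
```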

8 changes: 4 additions & 4 deletions docs/pyshbundle.md
@@ -1,13 +1,13 @@

# Reference Manual - PySHBundle

The module codes can be categorized into four categories:

+ [Load Data Modules](load_data.md)
+ [convert data formats](convert_data_formats.md)
+ [core functionality](core_functionality.md)
+ [auxiliary codes](auxillary_codes.md)

![Schematic diagram of code workflow](img/01_flowchart_20240327.jpg)

Navigate the Reference Manual based on the following schematic.
