Example of running PyDDA in HPC ? #56

Open
sysumeteo opened this issue Apr 26, 2020 · 3 comments
@sysumeteo

sysumeteo commented Apr 26, 2020

The example of nested wind retrieval in the docs is based on LocalCluster. Is PyDDA designed to run on HPC systems such as Summit or TianHe-2? If so, I think it would be really helpful to have an example of the best strategy for splitting the grid and distributing the computations to Dask workers, with an eye toward maximizing CPU usage and balancing IO time, including how to set the number of jobs, n_workers, processes, etc.

@rcjackson
Collaborator

It can be run on an HPC cluster using Dask distributed! My best strategy has been to dedicate an entire node to one worker, since the optimizer will use all the cores available when doing the calculation. From there you could then use multiple nodes.
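The one-worker-per-node strategy described above can be sketched with dask-jobqueue, assuming a SLURM scheduler; the partition name, core count, memory, and node count below are placeholders for your system, not PyDDA defaults.

```python
# One Dask worker per node: a single worker process gets all of the node's
# cores, leaving the optimizer's internal threading to use them.
# These values are illustrative placeholders for your HPC system.
cluster_kwargs = {
    "queue": "batch",        # placeholder SLURM partition name
    "cores": 36,             # all cores on one node
    "processes": 1,          # a single worker process per node
    "memory": "128GB",       # full node memory
    "walltime": "02:00:00",
}

# On the login node (or in a job script) you would then do something like:
#
#   from dask_jobqueue import SLURMCluster
#   from dask.distributed import Client
#
#   cluster = SLURMCluster(**cluster_kwargs)
#   cluster.scale(jobs=4)    # request four nodes -> four workers
#   client = Client(cluster)
#
# and pass `client` to the nested retrieval in place of the LocalCluster
# client from the docs example.
```

Because `processes` is 1 while `cores` covers the whole node, each worker occupies a full node, matching the strategy above.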

@sysumeteo
Author

sysumeteo commented May 6, 2020

> It can be run on an HPC cluster using Dask distributed! My best strategy has been to dedicate an entire node to one worker, since the optimizer will use all the cores available when doing the calculation. From there you could then use multiple nodes.

Thanks! Another thing I would like to know: is there a recommended setting for 'num_split'? I think a larger value will split the whole grid into more subgrids, so we can use more nodes for the calculation, but more subgrids will also cost much more time processing the subgrid temp files and on IO between calculation nodes. Is that true? If so, is there a best 'num_split' that balances this?
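The trade-off described in this question can be illustrated with a toy back-of-envelope model; the constants below are made-up placeholders, not measured PyDDA numbers, and the model is only a sketch of the compute-vs-IO reasoning.

```python
# Toy model of the num_split trade-off: more subgrids means more
# parallelism, but also more per-subgrid file/IO overhead.
# All constants here are illustrative placeholders.

def estimated_walltime(num_split, total_work=3600.0, n_nodes=4,
                       io_overhead_per_subgrid=20.0):
    """Crude walltime estimate: parallel compute plus serial IO overhead."""
    active = min(num_split, n_nodes)          # can't use more nodes than subgrids
    compute = total_work / active             # work spread over active nodes
    io = num_split * io_overhead_per_subgrid  # overhead grows with subgrid count
    return compute + io

# Under this toy model the optimum sits near num_split == n_nodes:
times = {n: estimated_walltime(n) for n in (1, 2, 4, 8, 16)}
```

With these placeholder numbers, splitting finer than the number of nodes only adds IO overhead without adding parallelism, which is the intuition behind the question.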

@rcjackson
Collaborator

rcjackson commented May 6, 2020 via email
