# Scale prediction of episodic or not to full region #21
There are 60,829 COMIDs for the state of MN (this is not including upstream ones):

```r
library(sf)
library(tidyverse)
library(usmap)
library(nhdplusTools)  # provides get_nhdplus()

mn_sf <- usmap::us_map(include = 'MN') %>%
  st_as_sf(coords = c('x', 'y'),
           crs = usmap::usmap_crs()) %>%
  st_transform(crs = 4326)

# Takes a few minutes to run:
mn_comids <- get_nhdplus(AOI = mn_sf, realization = "flowline")

plot(st_geometry(mn_comids), col = 'cornflowerblue')
plot(st_geometry(mn_sf), border = 'black', lwd = 3, add = TRUE)
```
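A quick sanity check (a minimal sketch; assumes the chunk above has run), since the 60,829 figure comes from this query:

```r
nrow(mn_comids)  # expect 60,829 flowline features for MN
```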
I was following along with the suggestion in DOI-USGS/nhdplusTools#354 (comment). Capturing what I have done so far by describing the paths I've taken and the dead-ends I've hit:
```r
# Attempt 1: read the EPA 2022 snapshot geopackage for MN
library(sf)

local_zip <- 'nhdplus_epasnapshot2022_mn_gpkg.zip'
unzip(local_zip, exdir = '.')

local_file <- 'nhdplus_epasnapshot2022_mn.gpkg'
mn_flines <- st_read(local_file, layer = 'nhdflowline_mn')
plot(st_zm(st_geometry(mn_flines)))
head(mn_flines)
```

```r
# Attempt 2: read the NHDPlusV2 HUC07 (Upper Mississippi) snapshot geodatabase
library(sf)

local_file <- 'NHDPlusV21_MS_07_NHDSnapshotFGDB_08/NHDPlusMS/NHDPlus07/NHDSnapshot/NHDSnapshot.gdb'
huc07_flines <- st_read(local_file, layer = 'NHDFlowline')
plot(st_zm(st_geometry(huc07_flines)))
head(huc07_flines)
```
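When poking at unfamiliar geopackages and geodatabases like these, it can help to list the available layers before guessing at names for `st_read()` (a small sketch reusing the HUC07 path above):

```r
# List the layers available in a GPKG or FGDB
sf::st_layers('NHDPlusV21_MS_07_NHDSnapshotFGDB_08/NHDPlusMS/NHDPlus07/NHDSnapshot/NHDSnapshot.gdb')
```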
I asked ChatGPT about the 7z error I was getting and, based on its reply, I tried unzipping the 7z to a new folder rather than accepting the default. That worked! I had spent all morning attempting to get past this.
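For anyone hitting the same wall, here is a minimal sketch of that workaround (assuming the `archive` package, which can read 7z archives; base R's `unzip()` only handles .zip):

```r
# Extract the .7z into a NEW folder with a 7z-capable extractor
library(archive)
archive_extract(
  '6_PredictClass/tmp_nhdplus/NHDPlusV21_NationalData_Seamless_Geodatabase_Lower48_07.7z',
  dir = '6_PredictClass/out_nhdplus'
)
```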
The new, faster way to get COMIDs for a large space (without querying web services each time):

```r
# Find MN COMIDs in a faster way!
library(geos)
library(nhdplusTools)
library(sf)
library(tidyverse)

# 1. Download the national flowlines layer
nhdplus_gdb <- download_nhdplusv2("./6_PredictClass/tmp_nhdplus/")

# 2. Extract the flowlines GDB from the .7z *to a new folder*
#    (note: base unzip() only reads .zip; see the 7z workaround above)
unzip('6_PredictClass/tmp_nhdplus/NHDPlusV21_NationalData_Seamless_Geodatabase_Lower48_07.7z',
      exdir = '6_PredictClass/out_nhdplus')

# 3. Load the national layer and an sf object of MN
nat_gdb <- '6_PredictClass/out_nhdplus/NHDPlusNationalData/NHDPlusV21_National_Seamless_Flattened_Lower48.gdb'
nat_flines_sf <- st_read(nat_gdb, layer = 'NHDFlowline_Network') %>%
  st_zm()  # drop Z/M (vertical) dimensions

mn_sf <- usmap::us_map(include = 'MN') %>%
  st_as_sf(coords = c('x', 'y'),
           crs = usmap::usmap_crs()) %>%
  st_transform(crs = st_crs(nat_flines_sf))

# 4. Use geos to intersect flowlines with the state
mn_flines <- nat_flines_sf[which(geos::geos_intersects(nat_flines_sf, mn_sf)), ]

# Now just plot for visualization purposes
ggplot() +
  geom_sf(data = mn_sf, fill = 'white', color = 'black') +
  geom_sf(data = mn_flines, color = 'cornflowerblue', size = 1) +
  theme_void()
```
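An equivalent subset without the geos dependency, if that's preferred (a sketch using the same objects as above; `st_filter()` defaults to an `st_intersects` predicate):

```r
# sf-native spatial filter; same result as the geos_intersects() subset
mn_flines <- st_filter(nat_flines_sf, mn_sf)
```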
For MN, I need to query COMIDs upstream of each COMID individually. This takes ~1 second per COMID. Working on implementing this into our pipeline now in order to predict forward, but for Minnesota alone this could mean ~17 hrs of querying time (60,829 COMIDs × 1 s/COMID ÷ 3,600 s/hr ≈ 16.9 hrs). Trying to see if there is a faster way to do this; what I really need is a catchment shape per COMID that shows all upstream COMIDs. Checking out some alternatives now. Note that you need to run the code chunk from the previous comment to run this:

```r
mn_upstream <- mn_comids %>%
  rename(nhd_comid = comid) %>%
  split(.$nhd_comid) %>%
  map(~ identify_upstream_comids(comid_in = .x$nhd_comid)) %>%  # one web query per COMID
  bind_rows()

mn_upstream_flowlines <- get_nhdplus(comid = unique(mn_upstream$nhd_comid_upstream),
                                     realization = "flowline")

plot(st_geometry(mn_upstream_flowlines), col = 'cornflowerblue')
plot(st_geometry(p6_state_sf[[1]]), border = 'black', lwd = 3, add = TRUE)
```
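One possible speed-up to try (a sketch I haven't validated against this pipeline): since the national seamless table is already loaded in the comment above, the upstream-with-tributaries walk can be done locally with `nhdplusTools::get_UT()` instead of one web query per COMID. `align_nhdplus_names()` maps the seamless GDB's column names to the ones nhdplusTools expects:

```r
# Local (no web service) upstream navigation; assumes nat_flines_sf and
# mn_flines from the earlier chunk
net <- align_nhdplus_names(sf::st_drop_geometry(nat_flines_sf))

mn_upstream_local <- purrr::map_dfr(mn_flines$COMID, function(cid) {
  tibble::tibble(nhd_comid = cid,
                 nhd_comid_upstream = get_UT(net, cid))  # in-memory walk
})
```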
Start with MN to test some of the issues with expanding beyond our state borders. Then, scale up to the full region.