Data sources:
- Hydrography: NHDPlus High Resolution (NHDPlus HR)
- Elevation: Digital Elevation Model (DEM), 30 m spatial resolution
- Design storm: 5-year recurrence interval, 15-minute duration; coverage is missing for a segment of the north-western USA
- Vegetation: LANDFIRE EVT-140 CONUS (2014)
- Soils: STATSGO database
The NHD shapefiles were too large, and too numerous, to support fast visualization, so we simplified these shapes to enable fast fetching. The shape_simplification_ogr directory contains the scripts for simplifying the shapefiles.
- Run simplification_nhd_script.sh to simplify the catchment shapes.
- Run update_grid_script.sh to modify the GridCode values inside the generated JSON files so that they are unique.
To simplify the STATSGO shapes, run simplification_statsgo_script.sh.
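Under the hood, the OGR-based simplification amounts to something like the sketch below (the tolerance, file names, and GridCode copy are assumptions; the authoritative logic lives in the scripts above):

from osgeo import ogr

src = ogr.Open("catchments.shp")
layer = src.GetLayer()

driver = ogr.GetDriverByName("GeoJSON")
dst = driver.CreateDataSource("catchments_simplified.json")
out_layer = dst.CreateLayer("catchments", layer.GetSpatialRef(), layer.GetGeomType())
out_layer.CreateField(ogr.FieldDefn("GridCode", ogr.OFTInteger))

for feat in layer:
    simplified = feat.GetGeometryRef().Simplify(0.001)   # tolerance in map units (assumed)
    out = ogr.Feature(out_layer.GetLayerDefn())
    out.SetField("GridCode", feat.GetField("GridCode"))  # preserve the join key
    out.SetGeometry(simplified)
    out_layer.CreateFeature(out)

dst = None  # closing the datasource flushes the GeoJSON to disk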
NOTE TO SELF: Run simplification on lattice-77 and copy and ingest from lattice-101.
Run the following from the directory containing DEM TIFFs:
ls -n | %{py PostFireDebrisRiskAnalysis\arcpy_processing\dem_processing\dem_huc_zstats_w_intrc_chk.py $_}
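Each invocation processes one DEM tile. The per-file script presumably wraps an arcpy zonal-statistics call along the lines of this sketch (the zone shapefile path and the MEAN statistic are assumptions):

import sys
import arcpy
from arcpy.sa import ZonalStatisticsAsTable

arcpy.CheckOutExtension("Spatial")

dem_tif = sys.argv[1]                                # file name piped in by the loop above
catchments = r"C:\path\to\catchment_boundaries.shp"  # hypothetical zone shapefile
out_table = dem_tif.replace(".tif", "_zstats.dbf")

# One statistic per NHD catchment, keyed by GridCode.
ZonalStatisticsAsTable(catchments, "GridCode", dem_tif, out_table, "DATA", "MEAN")

The vegetation, soils, and design storm zonal-statistics steps below follow the same one-file-per-invocation pattern.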
Differenced Normalized Burn Ratio (DNBR).
- Import the file EVT-140_CONUS_MAIN\US_140EVT_20180618\Grid\us_140evt into ArcMap.
- Open the Attribute Table and add a field DNBR. Right-click the new field and select Field Calculator.
- Use the Python code from \vegetation\pre_logic.py as:
DNBR = get_lambda(!Value_1!)
- Extract the DNBR field into a single-band TIFF file using the Delete Raster Attribute Table function.
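The DNBR field can also be populated from standalone arcpy rather than the ArcMap UI; a minimal sketch, assuming pre_logic.py defines get_lambda:

import arcpy

evt = r"EVT-140_CONUS_MAIN\US_140EVT_20180618\Grid\us_140evt"
arcpy.AddField_management(evt, "DNBR", "DOUBLE")

# Load the code block that defines get_lambda, then evaluate it per row.
with open(r"vegetation\pre_logic.py") as f:
    code_block = f.read()
arcpy.CalculateField_management(evt, "DNBR", "get_lambda(!Value_1!)", "PYTHON_9.3", code_block)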
Run the following to perform Zonal Statistics over NHD catchments:
cd C:\Users\sapmitra\Documents\PostFireDebris\data\CatchmentBoundaries\temp_shapefiles
ls -n | %{py PostFireDebrisRiskAnalysis\arcpy_processing\vegetation_processing\dnbr_zonal_stats_single.py $_}
Run the following from the directory containing the soil shapefiles:
cd C:\Users\sapmitra\Documents\PostFireDebris\data\SOILS
ls -n | findstr "shp"| %{py C:\Users\sapmitra\PycharmProjects\FireWatcher\data_proc\soils\soils_all_hucs_zonal_stats_single.py $_}
Run the following from the directory containing Design Storm .asc files:
cd C:\Users\sapmitra\Documents\PostFireDebris\data\DesignStorm
ls -n | %{py PostFireDebrisRiskAnalysis\arcpy_processing\design_storm_processing\dstorm_huc_zonal_intrc_chk_single.py $_}
Use the PostFireDebrisRiskAnalysis\combine_boundaries\combine*.py scripts to combine the X1, X2, X3, and Design Storm attributes across overlapping/duplicate catchment boundaries.
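A minimal sketch of what the combine step amounts to, assuming one CSV per variable keyed by GridCode (the file names are hypothetical):

from functools import reduce
import pandas as pd

frames = [pd.read_csv(f) for f in
          ("X1_elevation.csv", "X2_dnbr.csv", "X3_soils.csv", "design_storm.csv")]

# Join the per-variable tables on GridCode, then collapse the
# overlapping/duplicate catchment boundaries.
combined = reduce(lambda a, b: a.merge(b, on="GridCode"), frames)
combined = combined.drop_duplicates(subset="GridCode")
combined.to_csv("combined.csv", index=False)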
Run combine_boundaries\compute_thi15.py on the lattice machines to compute THi15. DNBR (X2) must be scaled by dividing by 1000.
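If compute_thi15.py follows the USGS M1 debris-flow likelihood model (Staley et al., 2017), which the X1/X2/X3 naming suggests but this document does not confirm, THi15 is the 15-minute rainfall intensity at which the logistic model reaches a target probability:

import math

# Published M1 coefficients (Staley et al., 2017); whether compute_thi15.py
# uses exactly these values is an assumption.
B, C1, C2, C3 = -3.63, 0.41, 0.67, 0.70

def thi15(x1, dnbr, x3, p=0.5):
    """Rainfall intensity (mm/h) at which P(debris flow) reaches p."""
    x2 = dnbr / 1000.0  # DNBR (X2) scaled by dividing by 1000, as noted above
    return (math.log(p / (1.0 - p)) - B) / (C1 * x1 + C2 * x2 + C3 * x3)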
Run PostFireDebrisRiskAnalysis\thi15\compare_thi15_DS.py to compare THi15 against the design storm intensities.
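The comparison step then presumably flags catchments whose design storm intensity meets or exceeds the threshold (the column names here are assumptions; thi15.csv is the file ingested below):

import pandas as pd

df = pd.read_csv("thi15.csv")
# A catchment is at risk when the 5-yr/15-min design storm intensity
# reaches its debris-flow triggering threshold.
df["at_risk"] = df["DesignStorm"] >= df["THi15"]
df.to_csv("thi15_vs_design_storm.csv", index=False)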
Create the MongoDB collection and indexes:
db.createCollection("nhd_shapes")
db.nhd_shapes.createIndex({geometry : "2dsphere"})
db.X1_Elevation.createIndex({"GridCode": 1},{unique: true});
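The 2dsphere index serves geospatial lookups over the ingested shapes; a minimal pymongo sketch (the query point is hypothetical):

from pymongo import MongoClient

client = MongoClient("localhost", 27018)  # port used by the mongoimport step below
shapes = client["sustaindb"]["nhd_shapes"]

# Find catchment polygons intersecting a point; the 2dsphere index on
# "geometry" answers $geoIntersects queries like this one.
point = {"type": "Point", "coordinates": [-105.08, 40.57]}  # lon, lat (hypothetical)
for doc in shapes.find({"geometry": {"$geoIntersects": {"$geometry": point}}}):
    print(doc["GridCode"])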
The PostFireDebrisRiskAnalysis\shape_simplification_ogr\update_grid_script.sh script handles any shapefiles that were missed in the previous run due to incompatibility with MongoDB (excluding or re-including them as appropriate); it also updates the GridCode and copies it to the top level of each JSON file.
Following this, data ingestion is done with the script PostFireDebrisRiskAnalysis\ingestion_with_check\insert_script_nhd.sh, which logs any incompatible NHD shapes to the console.
The computed attributes for each NHD catchment are appended to the corresponding document in the nhd_shapes collection using mongoimport's merge mode:
mongoimport --port 27018 --db sustaindb --collection nhd_shapes --mode merge --upsertFields GridCode --headerline --type csv --file thi15.csv