Right now we rely on a rather crude and uncontrolled method to smooth bathymetry, e.g. from the bathymetry generation example:

```julia
h_rough = regrid_bathymetry(grid; interpolation_passes = 3)
h_smooth = regrid_bathymetry(grid; interpolation_passes = 40)
```

It doesn't really seem possible to predict what the effect of the "interpolation passes" is. Obviously more passes smooth more, but by how much? How many do we need? Is there any way to understand what the net outcome is, regardless of the grid we are on?
I think this method was implemented not because it is a good method to use, but because it was convenient to simply "keep interpolating". Convenience isn't always the best motivation...
I think diffusion or a moving average might be a better way to smooth bathymetry. This would preserve the total ocean volume. Also, rather than specifying "number of passes" I feel like it would be more useful to be able to specify something like the "maximum gradient" or "maximum curvature" and iterate until that criterion is reached.
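To make the idea concrete, here is a rough sketch of what diffusion with a maximum-gradient stopping criterion could look like. It is plain Julia on a regular grid with uniform spacing; the function name and interface are made up for illustration and are not part of the existing API, and land masking and grid metrics are ignored.

```julia
# Hypothetical sketch, not part of the existing API: smooth a bathymetry array h
# by explicit diffusion, iterating until the largest gradient magnitude falls
# below `max_gradient`. Assumes a regular grid with uniform spacings Δx and Δy;
# land masking and curvilinear metrics are ignored for simplicity.
function diffuse_until_max_gradient!(h::AbstractMatrix, Δx, Δy; max_gradient, max_iterations = 10_000)
    Nx, Ny = size(h)
    Δt = 0.2 * min(Δx, Δy)^2  # stable explicit step for unit diffusivity

    # Clamping indices at the edges mimics a no-flux boundary, so the sum of h
    # (and hence the total ocean volume) is conserved by each diffusion step.
    at(A, i, j) = A[clamp(i, 1, Nx), clamp(j, 1, Ny)]

    h_old = copy(h)

    for _ in 1:max_iterations
        # Maximum gradient magnitude using centered differences
        ∇h_max = maximum(hypot((at(h, i+1, j) - at(h, i-1, j)) / (2Δx),
                               (at(h, i, j+1) - at(h, i, j-1)) / (2Δy))
                         for i in 1:Nx, j in 1:Ny)

        ∇h_max ≤ max_gradient && break  # stopping criterion reached

        # One explicit diffusion step
        h_old .= h
        for j in 1:Ny, i in 1:Nx
            ∇²h = (at(h_old, i+1, j) - 2 * at(h_old, i, j) + at(h_old, i-1, j)) / Δx^2 +
                  (at(h_old, i, j+1) - 2 * at(h_old, i, j) + at(h_old, i, j-1)) / Δy^2
            h[i, j] += Δt * ∇²h
        end
    end

    return h
end
```

An iterated moving average (box filter) could be substituted for the diffusion step with the same stopping rule, and a "maximum curvature" criterion would just swap the gradient check for a check on the Laplacian. Hooking something like this up to `regrid_bathymetry` would of course require handling the grid metrics and the land mask properly.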