Changes from all commits
28 commits
2ba9073
first round, delete SpaceTimeRegion object and related classes/methods
j-atkins Dec 2, 2025
e9dbd78
remove space time region from _plan
j-atkins Dec 2, 2025
5a5e056
update base and instrument class logic to use waypoint extremes for s…
j-atkins Dec 2, 2025
b5eea49
clean up _plan methods
j-atkins Dec 2, 2025
ffe07fc
extend latlon buffer for drifters
j-atkins Dec 2, 2025
a5d692a
set time buffer from drifter lifetime config
j-atkins Dec 2, 2025
6ebb41b
shift testing to use waypoint space-times instead of space-time-regio…
j-atkins Dec 3, 2025
1f778ea
fix assert error messaging
j-atkins Dec 3, 2025
14b8833
remove spatial constraint on fieldset ingestion
j-atkins Dec 5, 2025
1affd50
Merge branch 'main' into delete-spacetime
j-atkins Dec 9, 2025
714b0bf
add control over whether to constrain spatial region in fieldset
j-atkins Dec 9, 2025
e820584
give argos a prescribed lifetime
j-atkins Dec 9, 2025
7369f46
add lifetime field to expedition models and _plan
j-atkins Dec 9, 2025
15848f3
update test with argo lifetime
j-atkins Dec 9, 2025
ba3793b
remove temporary debugging tools
j-atkins Dec 9, 2025
340d940
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 9, 2025
7eff191
update argo endtime and give depth lim in fieldset generation
j-atkins Dec 10, 2025
fd6a3f6
remove depth lim on fieldset generation
j-atkins Dec 10, 2025
edf8149
Merge branch 'main' into delete-spacetime
j-atkins Dec 19, 2025
b00c85c
clean up comment
j-atkins Dec 23, 2025
86fa3f6
Merge branch 'delete-spacetime' of github.com:OceanParcels/virtualshi…
j-atkins Dec 23, 2025
ff43127
change lifetime in expedition model to be in days units
j-atkins Dec 23, 2025
87de210
update _plan logic (+ more changes to expedition model) to handle lif…
j-atkins Dec 23, 2025
593e0f2
small refactor for readability
j-atkins Dec 23, 2025
6ae1317
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 23, 2025
2e8c14a
remove broken links
j-atkins Dec 19, 2025
4fd2d77
Merge branch 'delete-spacetime' of github.com:OceanParcels/virtualshi…
j-atkins Dec 23, 2025
984d441
update pre-downloaded data documentation to explain argo/drifter life…
j-atkins Dec 23, 2025
10 changes: 8 additions & 2 deletions docs/user-guide/assignments/Research_proposal_intro.ipynb
@@ -150,9 +150,15 @@
"\n",
"Finally, fit your equipment in the 20-foot container. The container will be sent to your port of departure ahead of time with a cargo boat, so make sure you are packed in time for this transfer. Remember there are no shops at sea, so think carefully and plan ahead. \n",
"\n",
"![Equipment preparation NIOZ](https://www.nioz.nl/application/files/9116/7500/3457/2023-01-16-packing.jpg) \n",
"![Equipment loading](https://www.nioz.nl/application/files/7416/7810/2265/2023-03-06-container-shifting.jpg) "
"<!-- TODO: these images are no longer hosted -->\n",
"<!-- ![Equipment preparation NIOZ](https://www.nioz.nl/application/files/9116/7500/3457/2023-01-16-packing.jpg) \n",
"![Equipment loading](https://www.nioz.nl/application/files/7416/7810/2265/2023-03-06-container-shifting.jpg) -->"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": []
}
],
"metadata": {
4 changes: 4 additions & 0 deletions docs/user-guide/documentation/pre_download_data.md
@@ -20,6 +20,10 @@ In addition, all pre-downloaded data must be split into separate files per times
**Monthly data**: when using monthly data, ensure that your final .nc file download is for the month *after* your expedition schedule end date. This is to ensure that the Parcels FieldSet generated under the hood fully covers the expedition period. For example, if your expedition runs from 1st May to 15th May, your final monthly data file should be in June. Daily data files only need to cover the expedition period exactly.
```

```{note}
**Argo and Drifter data**: if using Argo floats or Drifters in your expedition, ensure that: 1) the temporal extent of the downloaded data also accounts for the full *lifetime* of the instruments, not just the expedition period, and 2) the spatial bounds of the downloaded data also account for the likely drift distance of the instruments over their lifetimes. Otherwise, simulations will end prematurely (with out-of-bounds errors) when the data runs out.
```

Further, VirtualShip expects pre-downloaded data to be organised in a specific directory & filename structure within the specified local data directory. The expected structure is as outlined in the subsequent sections.

#### Directory structure
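As a rough illustration of the note added above: a minimal Python sketch of how the extended download window and bounding box could be worked out before fetching data. All names and values here (`expedition_start`, `lifetime_days`, `max_drift_deg`, the waypoint lists) are illustrative assumptions, not part of the VirtualShip API.

```python
from datetime import datetime, timedelta

# Illustrative inputs (assumptions, not VirtualShip code).
expedition_start = datetime(2023, 5, 1)
expedition_end = datetime(2023, 5, 15)
lifetime_days = 21               # e.g. drifter lifetime from the expedition config
max_drift_deg = 6.0              # rough allowance for drift distance, in degrees
wp_lats = [50.0, 52.5, 55.0]     # waypoint latitudes
wp_lons = [-20.0, -15.0, -12.0]  # waypoint longitudes

# Temporal extent: cover the expedition plus the full instrument lifetime.
download_start = expedition_start
download_end = expedition_end + timedelta(days=lifetime_days)

# Spatial extent: waypoint bounding box widened by the expected drift.
min_lat, max_lat = min(wp_lats) - max_drift_deg, max(wp_lats) + max_drift_deg
min_lon, max_lon = min(wp_lons) - max_drift_deg, max(wp_lons) + max_drift_deg

print(download_start, download_end)
print((min_lat, max_lat), (min_lon, max_lon))
```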
283 changes: 25 additions & 258 deletions src/virtualship/cli/_plan.py

Large diffs are not rendered by default.

29 changes: 0 additions & 29 deletions src/virtualship/cli/validator_utils.py
@@ -41,35 +41,6 @@ def is_valid_lon(value: str) -> bool:
return -180 < v < 360


@require_docstring
def is_valid_depth(value: str) -> bool:
"""Float."""
try:
v = float(value)
except ValueError:
return None

# NOTE: depth model in space_time_region.py ONLY specifies that depth must be float (and no conditions < 0)
# NOTE: therefore, this condition is carried forward here to match what currently exists
# NOTE: however, there is a TODO in space_time_region.py to add conditions as Pydantic Field
# TODO: update validator here if/when depth model is updated in space_time_region.py
return isinstance(v, float)


@require_docstring
def is_valid_timestr(value: str) -> bool:
"""Format YYYY-MM-DD hh:mm:ss."""
if (
not value.strip()
): # return as valid if blank, UI logic will auto fill on save if so
return True
try:
datetime.datetime.strptime(value, "%Y-%m-%d %H:%M:%S")
return True
except Exception:
return False


# SHIP CONFIG INPUTS VALIDATION

FIELD_CONSTRAINT_ATTRS = (
5 changes: 4 additions & 1 deletion src/virtualship/instruments/adcp.py
@@ -57,6 +57,9 @@ class ADCPInstrument(Instrument):
def __init__(self, expedition, from_data):
"""Initialize ADCPInstrument."""
variables = {"U": "uo", "V": "vo"}
limit_spec = {
"spatial": True
} # spatial limits; lat/lon constrained to waypoint locations + buffer

super().__init__(
expedition,
@@ -65,7 +68,7 @@ def __init__(self, expedition, from_data):
allow_time_extrapolation=True,
verbose_progress=False,
spacetime_buffer_size=None,
limit_spec=None,
limit_spec=limit_spec,
from_data=from_data,
)

22 changes: 9 additions & 13 deletions src/virtualship/instruments/argo_float.py
@@ -167,7 +167,11 @@ def __init__(self, expedition, from_data):
variables = {"U": "uo", "V": "vo", "S": "so", "T": "thetao"}
spacetime_buffer_size = {
"latlon": 3.0, # [degrees]
"time": 21.0, # [days]
"time": expedition.instruments_config.argo_float_config.lifetime.total_seconds()
/ (24 * 3600), # [days]
}
limit_spec = {
"spatial": True, # spatial limits; lat/lon constrained to waypoint locations + buffer
}

super().__init__(
@@ -177,15 +181,14 @@
allow_time_extrapolation=False,
verbose_progress=True,
spacetime_buffer_size=spacetime_buffer_size,
limit_spec=None,
limit_spec=limit_spec,
from_data=from_data,
)

def simulate(self, measurements, out_path) -> None:
"""Simulate Argo float measurements."""
DT = 10.0 # dt of Argo float simulation integrator
OUTPUT_DT = timedelta(minutes=5)
ENDTIME = None

if len(measurements) == 0:
print(
@@ -235,15 +238,8 @@ def simulate(self, measurements, out_path) -> None:
chunks=[len(argo_float_particleset), 100],
)

# get earliest between fieldset end time and provide end time
fieldset_endtime = fieldset.time_origin.fulltime(fieldset.U.grid.time_full[-1])
if ENDTIME is None:
actual_endtime = fieldset_endtime
elif ENDTIME > fieldset_endtime:
print("WARN: Requested end time later than fieldset end time.")
actual_endtime = fieldset_endtime
else:
actual_endtime = np.timedelta64(ENDTIME)
# endtime
endtime = fieldset.time_origin.fulltime(fieldset.U.grid.time_full[-1])

# execute simulation
argo_float_particleset.execute(
Expand All @@ -253,7 +249,7 @@ def simulate(self, measurements, out_path) -> None:
_keep_at_surface,
_check_error,
],
endtime=actual_endtime,
endtime=endtime,
dt=DT,
output_file=out_file,
verbose_progress=self.verbose_progress,
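The diff above replaces the hard-coded 21-day time buffer with one derived from the configured Argo float lifetime. A minimal sketch of that conversion, assuming `lifetime` is a `datetime.timedelta` as in the expedition config (the 30-day value is an arbitrary example):

```python
from datetime import timedelta

# Assumed example value; in the PR this comes from
# expedition.instruments_config.argo_float_config.lifetime.
lifetime = timedelta(days=30)

# Convert the timedelta to (possibly fractional) days for the "time" buffer entry.
time_buffer_days = lifetime.total_seconds() / (24 * 3600)

spacetime_buffer_size = {
    "latlon": 3.0,             # [degrees]
    "time": time_buffer_days,  # [days]
}
print(spacetime_buffer_size)   # {'latlon': 3.0, 'time': 30.0}
```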
75 changes: 46 additions & 29 deletions src/virtualship/instruments/base.py
@@ -3,6 +3,7 @@
import abc
from collections import OrderedDict
from datetime import timedelta
from itertools import pairwise
from pathlib import Path
from typing import TYPE_CHECKING

@@ -17,6 +18,7 @@
_find_files_in_timerange,
_find_nc_file_with_variable,
_get_bathy_data,
_get_waypoint_latlons,
_select_product_id,
ship_spinner,
)
@@ -56,6 +58,19 @@ def __init__(
self.spacetime_buffer_size = spacetime_buffer_size
self.limit_spec = limit_spec

wp_lats, wp_lons = _get_waypoint_latlons(expedition.schedule.waypoints)
wp_times = [
wp.time for wp in expedition.schedule.waypoints if wp.time is not None
]
assert all(earlier <= later for earlier, later in pairwise(wp_times)), (
"Waypoint times are not in ascending order"
)
self.wp_times = wp_times

self.min_time, self.max_time = wp_times[0], wp_times[-1]
self.min_lat, self.max_lat = min(wp_lats), max(wp_lats)
self.min_lon, self.max_lon = min(wp_lons), max(wp_lons)

def load_input_data(self) -> FieldSet:
"""Load and return the input data as a FieldSet for the instrument."""
try:
@@ -76,10 +91,10 @@ def load_input_data(self) -> FieldSet:
# bathymetry data
if self.add_bathymetry:
bathymetry_field = _get_bathy_data(
self.expedition.schedule.space_time_region,
latlon_buffer=self.spacetime_buffer_size.get("latlon")
if self.spacetime_buffer_size
else None,
self.min_lat,
self.max_lat,
self.min_lon,
self.max_lon,
from_data=self.from_data,
).bathymetry
bathymetry_field.data = -bathymetry_field.data
@@ -115,36 +130,39 @@ def execute(self, measurements: list, out_path: str | Path) -> None:

def _get_copernicus_ds(
self,
time_buffer: float | None,
physical: bool,
var: str,
) -> xr.Dataset:
"""Get Copernicus Marine dataset for direct ingestion."""
product_id = _select_product_id(
physical=physical,
schedule_start=self.expedition.schedule.space_time_region.time_range.start_time,
schedule_end=self.expedition.schedule.space_time_region.time_range.end_time,
schedule_start=self.min_time,
schedule_end=self.max_time,
variable=var if not physical else None,
)

latlon_buffer = self._get_spec_value("buffer", "latlon", 0.0)
time_buffer = self._get_spec_value("buffer", "time", 0.0)
latlon_buffer = self._get_spec_value(
"buffer", "latlon", 0.25
) # [degrees]; default 0.25 deg buffer to ensure coverage in field cell edge cases
depth_min = self._get_spec_value("limit", "depth_min", None)
depth_max = self._get_spec_value("limit", "depth_max", None)
spatial_constraint = self._get_spec_value("limit", "spatial", True)

min_lon_bound = self.min_lon - latlon_buffer if spatial_constraint else None
max_lon_bound = self.max_lon + latlon_buffer if spatial_constraint else None
min_lat_bound = self.min_lat - latlon_buffer if spatial_constraint else None
max_lat_bound = self.max_lat + latlon_buffer if spatial_constraint else None

return copernicusmarine.open_dataset(
dataset_id=product_id,
minimum_longitude=self.expedition.schedule.space_time_region.spatial_range.minimum_longitude
- latlon_buffer,
maximum_longitude=self.expedition.schedule.space_time_region.spatial_range.maximum_longitude
+ latlon_buffer,
minimum_latitude=self.expedition.schedule.space_time_region.spatial_range.minimum_latitude
- latlon_buffer,
maximum_latitude=self.expedition.schedule.space_time_region.spatial_range.maximum_latitude
+ latlon_buffer,
minimum_longitude=min_lon_bound,
maximum_longitude=max_lon_bound,
minimum_latitude=min_lat_bound,
maximum_latitude=max_lat_bound,
variables=[var],
start_datetime=self.expedition.schedule.space_time_region.time_range.start_time,
end_datetime=self.expedition.schedule.space_time_region.time_range.end_time
+ timedelta(days=time_buffer),
start_datetime=self.min_time,
end_datetime=self.max_time + timedelta(days=time_buffer),
minimum_depth=depth_min,
maximum_depth=depth_max,
coordinates_selection_method="outside",
@@ -159,6 +177,8 @@ def _generate_fieldset(self) -> FieldSet:
fieldsets_list = []
keys = list(self.variables.keys())

time_buffer = self._get_spec_value("buffer", "time", 0.0)

for key in keys:
var = self.variables[key]
if self.from_data is not None: # load from local data
@@ -168,17 +188,10 @@
else:
data_dir = self.from_data.joinpath("bgc")

schedule_start = (
self.expedition.schedule.space_time_region.time_range.start_time
)
schedule_end = (
self.expedition.schedule.space_time_region.time_range.end_time
)

files = _find_files_in_timerange(
data_dir,
schedule_start,
schedule_end,
self.min_time,
self.max_time + timedelta(days=time_buffer),
)

_, full_var_name = _find_nc_file_with_variable(
@@ -197,7 +210,11 @@
)
else: # stream via Copernicus Marine Service
physical = var in COPERNICUSMARINE_PHYS_VARIABLES
ds = self._get_copernicus_ds(physical=physical, var=var)
ds = self._get_copernicus_ds(
time_buffer,
physical=physical,
var=var,
)
fs = FieldSet.from_xarray_dataset(
ds, {key: var}, self.dimensions, mesh="spherical"
)
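The base-class changes above replace the deleted `SpaceTimeRegion` with extremes computed directly from the scheduled waypoints. A self-contained sketch of the same idea using only the standard library (Python 3.10+ for `itertools.pairwise`); the `Waypoint` namedtuple and the example values are stand-ins for the real schedule model, not VirtualShip code:

```python
from collections import namedtuple
from datetime import datetime
from itertools import pairwise

# Stand-in for the real waypoint model (illustrative assumption).
Waypoint = namedtuple("Waypoint", ["lat", "lon", "time"])

waypoints = [
    Waypoint(50.0, -20.0, datetime(2023, 5, 1)),
    Waypoint(52.5, -15.0, datetime(2023, 5, 7)),
    Waypoint(55.0, -12.0, datetime(2023, 5, 15)),
]

# Waypoint times must be ascending for the first/last entries to be the extremes.
times = [wp.time for wp in waypoints if wp.time is not None]
assert all(a <= b for a, b in pairwise(times)), "Waypoint times are not in ascending order"

min_time, max_time = times[0], times[-1]
min_lat, max_lat = min(wp.lat for wp in waypoints), max(wp.lat for wp in waypoints)
min_lon, max_lon = min(wp.lon for wp in waypoints), max(wp.lon for wp in waypoints)

# With limit_spec={"spatial": True}, these extremes (plus a small buffer) bound the
# data request; with {"spatial": False}, the bounds are left as None (global field).
latlon_buffer = 0.25
bounds = (min_lat - latlon_buffer, max_lat + latlon_buffer,
          min_lon - latlon_buffer, max_lon + latlon_buffer)
print(min_time, max_time, bounds)
```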
5 changes: 4 additions & 1 deletion src/virtualship/instruments/ctd.py
@@ -82,6 +82,9 @@ class CTDInstrument(Instrument):
def __init__(self, expedition, from_data):
"""Initialize CTDInstrument."""
variables = {"S": "so", "T": "thetao"}
limit_spec = {
"spatial": True
} # spatial limits; lat/lon constrained to waypoint locations + buffer

super().__init__(
expedition,
@@ -90,7 +93,7 @@ def __init__(self, expedition, from_data):
allow_time_extrapolation=True,
verbose_progress=False,
spacetime_buffer_size=None,
limit_spec=None,
limit_spec=limit_spec,
from_data=from_data,
)

6 changes: 5 additions & 1 deletion src/virtualship/instruments/ctd_bgc.py
@@ -112,14 +112,18 @@ def __init__(self, expedition, from_data):
"phyc": "phyc",
"nppv": "nppv",
}
limit_spec = {
"spatial": True
} # spatial limits; lat/lon constrained to waypoint locations + buffer

super().__init__(
expedition,
variables,
add_bathymetry=True,
allow_time_extrapolation=True,
verbose_progress=False,
spacetime_buffer_size=None,
limit_spec=None,
limit_spec=limit_spec,
from_data=from_data,
)

28 changes: 7 additions & 21 deletions src/virtualship/instruments/drifter.py
@@ -67,10 +67,12 @@ def __init__(self, expedition, from_data):
"""Initialize DrifterInstrument."""
variables = {"U": "uo", "V": "vo", "T": "thetao"}
spacetime_buffer_size = {
"latlon": 6.0, # [degrees]
"time": 21.0, # [days]
"latlon": None,
"time": expedition.instruments_config.drifter_config.lifetime.total_seconds()
/ (24 * 3600), # [days]
}
limit_spec = {
"spatial": False, # no spatial limits; generate global fieldset
"depth_min": 1.0, # [meters]
"depth_max": 1.0, # [meters]
}
@@ -90,7 +92,6 @@ def simulate(self, measurements, out_path) -> None:
"""Simulate Drifter measurements."""
OUTPUT_DT = timedelta(hours=5)
DT = timedelta(minutes=5)
ENDTIME = None

if len(measurements) == 0:
print(
@@ -132,29 +133,14 @@
chunks=[len(drifter_particleset), 100],
)

# get earliest between fieldset end time and prescribed end time
fieldset_endtime = fieldset.time_origin.fulltime(fieldset.U.grid.time_full[-1])
if ENDTIME is None:
actual_endtime = fieldset_endtime
elif ENDTIME > fieldset_endtime:
print("WARN: Requested end time later than fieldset end time.")
actual_endtime = fieldset_endtime
else:
actual_endtime = np.timedelta64(ENDTIME)
# determine end time for simulation, from fieldset (which itself is controlled by drifter lifetimes)
endtime = fieldset.time_origin.fulltime(fieldset.U.grid.time_full[-1])

# execute simulation
drifter_particleset.execute(
[AdvectionRK4, _sample_temperature, _check_lifetime],
endtime=actual_endtime,
endtime=endtime,
dt=DT,
output_file=out_file,
verbose_progress=self.verbose_progress,
)

# if there are more particles left than the number of drifters with an indefinite endtime, warn the user
if len(drifter_particleset.particledata) > len(
[d for d in measurements if d.lifetime is None]
):
print(
"WARN: Some drifters had a life time beyond the end time of the fieldset or the requested end time."
)