diff --git a/docs/source/examples/custom_saving.rst b/docs/source/examples/custom_saving.rst
index 12ea2219..4be2b3e0 100644
--- a/docs/source/examples/custom_saving.rst
+++ b/docs/source/examples/custom_saving.rst
@@ -17,11 +17,11 @@ For example, ``self._save_fig_list['active_layer'] = ['active_layer']`` will pro
 When adding variables or metadata to be initialized and subsequently saved in the output netCDF, the key-value pair relationship is as follows.
 The key added to ``self._save_var_list`` is the name of the variable as it will be recorded in the netCDF file, this *does not* have to correspond to the name of an attribute in the model.
 To add a variable to the metadata, a key must be added to ``self._save_var_list['meta']``.
-The expected value for a given key is a list containing strings indicating the model attribute to be saved, its units, the variable type, and lastly the variable dimensions (e.g., ``['active_layer', 'fraction', 'f4', ('time', 'x', 'y')]`` for the active layer).
+The expected value for a given key is a list containing strings indicating the model attribute to be saved, its units, the variable type, the variable dimensions, and lastly a descriptive long name (e.g., ``['active_layer', 'fraction', 'f4', ('seconds', 'x', 'y'), 'channel_bottom__sediment__active_layer']`` for the active layer).

 .. important::

-    The dimensions of the custom variable being specified must match *exactly* with one of the three standard dimensions: `x`, `y`, `time`.
+    Each dimension of the custom variable must match *exactly* one of the three standard dimensions: `x`, `y`, `seconds`.
     Use of an invalid dimension will result in an error.

 An example of using the hook and creating a model subclass to customize the figures, gridded variables, and metadata being saved is provided below.
@@ -48,14 +48,19 @@ An example of using the hook and creating a model subclass to customize the figu
     ...     self._save_fig_list['active_layer'] = ['active_layer']
     ...
     ...     # save the active layer grid each save_dt w/ a short name
-    ...     self._save_var_list['actlay'] = ['active_layer', 'fraction',
-    ...                                      'f4', ('time',
-    ...                                             'x', 'y')]
+    ...     self._save_var_list['actlay'] = [
+    ...         'active_layer', 'fraction',
+    ...         'f4', ('seconds', 'x', 'y'),
+    ...         'channel_bottom__sediment__active_layer'
+    ...     ]
     ...
     ...     # save number of water parcels w/ a long name
-    ...     self._save_var_list['meta']['water_parcels'] = ['Np_water',
-    ...                                                     'parcels',
-    ...                                                     'i8', ()]
+    ...     self._save_var_list['meta']['water_parcels'] = [
+    ...         'Np_water',
+    ...         'parcels',
+    ...         'i8', (),
+    ...         'model_water__number_parcels'
+    ...     ]

 Next, we instantiate the model class.
@@ -83,4 +88,4 @@ For simplicity we will just check that the appropriate parameters were added to
     {'active_layer': ['active_layer']}

     >>> print(mdl._save_var_list)
-    {'meta': {'water_parcels': ['Np_water', 'parcels', 'i8', ()]}, 'actlay': ['active_layer', 'fraction', 'f4', ('time', 'x', 'y')]}
+    {'meta': {'water_parcels': ['Np_water', 'parcels', 'i8', (), 'model_water__number_parcels']}, 'actlay': ['active_layer', 'fraction', 'f4', ('seconds', 'x', 'y'), 'channel_bottom__sediment__active_layer']}
diff --git a/docs/source/info/outputfile.rst b/docs/source/info/outputfile.rst
index 3a30b8e6..30252fd5 100644
--- a/docs/source/info/outputfile.rst
+++ b/docs/source/info/outputfile.rst
@@ -10,30 +10,35 @@ Gridded Variables
 In any given run, the saving parameters "save__grids" control whether or
 not that 2-D grid variable (e.g. velocity) is saved to the netCDF4 file. 
In
-the netCDF4 file, a 3-D array with the dimensions `time` :math:`\times`
+the netCDF4 file, a 3-D array with the dimensions `seconds` :math:`\times`
 `x` :math:`\times` `y` is created for each 2-D grid variable that is set
 to be saved. Note that `x` is the *downstream* coordinate, rather than the
 Cartesian `x` when displaying the grid. The appropriate units for all
 variables are stored: for example "meters per second" for the *velocity*
-grid.
+grid. All variables include a description via the `long_name` attribute.

-.. note::
+.. important::

-    The format of the output netCDF file coordinate changed in `v2.1.0`. The
-    old format is documented
-    in :attr:`~pyDeltaRCM.model.DeltaModel.legacy_netcdf`, and that input
-    parameter `legacy_netcdf` can be used to create on output netcdf file with
-    the old coordinate configuration.
+    The format of the output netCDF file coordinates changed in `v2.2.0`. The
+    old format (up to v2.1.9) is documented
+    in :attr:`~pyDeltaRCM.model.DeltaModel.legacy_netcdf`, and the input
+    parameter `legacy_netcdf` can be used to create an output netcdf file with
+    the old coordinate configuration. The output format for pyDeltaRCM v2.1.0
+    and earlier is deprecated and has been removed. Migrating to the new
+    format should be straightforward for most users: only the name of the
+    temporal dimension and the name of the metadata subgroup have changed.


 Grid Coordinates
 ================

-Grid coordinates are specified in the variables `time`, `x`, and `y` in the output netCDF4 file.
-These arrays are 1D arrays, which specify the location of each cell in the domain in *dimensional* coordinates (e.g., meters).
-In the downstream direction, the distance of each cell from the inlet boundary is specified in `x` in meters.
-Similarly, the cross-domain distance is specified in `y` in meters.
-Lastly, the `time` variable is stored as a 1D array with model `time` in seconds.
+Grid coordinates are specified in the variables `seconds`, `x`, and `y` in the
+output netCDF4 file. These arrays are 1D arrays, which specify the location
+of each cell in the domain in *dimensional* coordinates (e.g., meters). In
+the downstream direction, the distance of each cell from the inlet boundary
+is specified in `x` in meters. Similarly, the cross-domain distance is
+specified in `y` in meters. Lastly, the `seconds` variable is stored as a 1D
+array recording model elapsed time in seconds.


 Model Metadata
@@ -53,6 +58,7 @@ saved as metadata are the following:
 - Sediment concentration: `C0_percent`
 - Characteristic Velocity: `u0`
 - If subsidence is enabled:
+
   - Subsidence start time: `start_subsidence`
   - Subsidence rate: `sigma`

@@ -66,7 +72,7 @@ library. These libraries range from the
 to higher-level libraries such as
 `xarray `_.
 For deltas, and specifically *pyDeltaRCM*, there is also a package under
-development called
-`DeltaMetrics `_,
+development called
+`sandplover `_,
 that is being designed to help post-process and analyze *pyDeltaRCM*
 outputs.
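
A minimal sketch of reading the new-format output file described above, assuming
a default (non-legacy) run that saved the ``eta`` grid; ``pyDeltaRCM_output.nc``
is the default output file name used throughout the tests in this patch, and the
calls are standard `xarray` usage shown for illustration, not part of the patch::

    import xarray as xr

    # gridded variables live at the file root; dimensions are ('seconds', 'x', 'y')
    ds = xr.open_dataset("pyDeltaRCM_output.nc")
    print(ds["eta"].dims)                # ('seconds', 'x', 'y')
    print(ds["eta"].attrs["long_name"])  # 'channel_bottom__elevation'

    # metadata lives in the 'auxdata' subgroup ('meta' in legacy files)
    meta = xr.open_dataset("pyDeltaRCM_output.nc", group="auxdata")
    print(float(meta["h0"]))             # channel entrance depth, in meters

With `netCDF4` directly, the subgroup is reached by indexing, e.g.
``Dataset("pyDeltaRCM_output.nc")["auxdata"]["h0"]``, as in the checkpointing
tests in this patch.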
diff --git a/pyDeltaRCM/_version.py b/pyDeltaRCM/_version.py
index 86aff2ec..8e129ede 100644
--- a/pyDeltaRCM/_version.py
+++ b/pyDeltaRCM/_version.py
@@ -3,4 +3,4 @@ def __version__() -> str:
     Private version declaration, gets assigned to pyDeltaRCM.__version__
     during import
     """
-    return "2.1.10"
+    return "2.2.0"
diff --git a/pyDeltaRCM/hook_tools.py b/pyDeltaRCM/hook_tools.py
index 2ac68b22..b9321aab 100644
--- a/pyDeltaRCM/hook_tools.py
+++ b/pyDeltaRCM/hook_tools.py
@@ -237,7 +237,7 @@ def hook_init_output_file(self) -> None:
         .. note::

             For a vector of time-varying metadata, the dimension
-            should be specified as ('total_time').
+            should be specified as ('seconds'), or ('time') if `legacy_netcdf`.

         Expected format for time varying grid entries as keys within
         the `self._save_var_list` dictionary:
diff --git a/pyDeltaRCM/init_tools.py b/pyDeltaRCM/init_tools.py
index ce11dddf..21df763c 100644
--- a/pyDeltaRCM/init_tools.py
+++ b/pyDeltaRCM/init_tools.py
@@ -225,9 +225,9 @@ def import_files(self, kwargs_dict={}) -> None:
         self.out_dir = self._input_file_vars["out_dir"]
         self.verbose = self._input_file_vars["verbose"]
         if self._input_file_vars["legacy_netcdf"]:
-            self._netcdf_coords = ("total_time", "length", "width")
-        else:
             self._netcdf_coords = ("time", "x", "y")
+        else:
+            self._netcdf_coords = ("seconds", "x", "y")

     def process_input_to_model(self) -> None:
         """Process input file to model variables.
@@ -659,6 +659,22 @@ def init_output_file(self) -> None:
         file_path = os.path.join(directory, filename)
         _msg = "Target output NetCDF4 file: {file}".format(file=file_path)
         self.log_info(_msg, verbosity=2)
+        if not self._legacy_netcdf:
+            _sandsuet_version = "1.0.0"
+            _msg = "Output file in sandsuet version {ver} schema".format(
+                ver=_sandsuet_version
+            )
+        else:
+            _msg = "Output file in legacy schema"
+            warnings.warn(
+                "Creating output netcdf file in legacy schema. This format is "
+                "provided as a convenience for users who are currently "
+                "relying on workflows that use an old format of netcdf file. "
+                "It will be removed in the future. Any new workflows should "
+                "leverage the sandsuet formatted data specification "
+                "(i.e., `legacy_netcdf=False`)."
+            )
+        self.log_info(_msg, verbosity=1)

         if (os.path.exists(file_path)) and (self._clobber_netcdf is False):
             raise FileExistsError(
@@ -678,6 +694,8 @@ def init_output_file(self) -> None:
         self.output_netcdf.source = "pyDeltaRCM v{ver}".format(
             ver=self.__pyDeltaRCM_version__
         )
+        if not self._legacy_netcdf:
+            self.output_netcdf.sandsuet_version = _sandsuet_version

         # create master dimensions (pulls from `self._netcdf_coords`)
         self.output_netcdf.createDimension(self._netcdf_coords[1], self.L)
@@ -685,77 +703,158 @@ def init_output_file(self) -> None:
         self.output_netcdf.createDimension(self._netcdf_coords[0], None)

         # create master coordinates (as netCDF variables)
-        time = self.output_netcdf.createVariable(
-            "time", "f4", (self._netcdf_coords[0],)
-        )
-        time.units = "second"
         if self._legacy_netcdf:
-            # old format is 2d array x and y
-            x = self.output_netcdf.createVariable(
-                "x", "f4", self._netcdf_coords[1:]
+            time = self.output_netcdf.createVariable(
+                "time", "f4", (self._netcdf_coords[0],)
             )
-            y = self.output_netcdf.createVariable(
-                "y", "f4", self._netcdf_coords[1:]
-            )
-            x[:] = self.x
-            y[:] = self.y
         else:
-            # new output format is 1d x and y
-            x = self.output_netcdf.createVariable("x", "f4", ("x"))
-            y = self.output_netcdf.createVariable("y", "f4", ("y"))
-            x[:] = self.xc
-            y[:] = self.yc
+            time = self.output_netcdf.createVariable(
+                "seconds", "f4", (self._netcdf_coords[0],)
+            )
+        time.units = "second"
+        # new output format is 1d x and y
+        x = self.output_netcdf.createVariable("x", "f4", ("x"))
+        y = self.output_netcdf.createVariable("y", "f4", ("y"))
+        x[:] = self.xc
+        y[:] = self.yc
         x.units = "meter"
         y.units = "meter"

-        # set up variables for output data grids
-        def _create_grid_variable(varname, varunits, vartype="f4", vardims=()):
+        # set up function to output data grids
+        def _create_grid_variable(
+            varname, varunits, vartype="f4", vardims=(), varlong=None
+        ):
             _v = self.output_netcdf.createVariable(varname, vartype, vardims)
             _v.units = varunits
+            if varlong is not None:
+                # long_name is provided, record it
+                _v.long_name = varlong
+            else:
+                if not self._legacy_netcdf:
+                    raise ValueError(
+                        f"long name must be provided for all variables to create a "
+                        f"sandsuet compliant data output, "
+                        f"but was not provided for variable '{varname}'."
+                    )

+        # loop through main output data grids
         _var_list = list(self._save_var_list.keys())
-        _var_list.remove("meta")
+        _var_list.remove("meta")  # remove group from list
         for _val in _var_list:
+            if isinstance(self._save_var_list[_val], list):
+                ### for now, we silently convert to dictionary format
+                # # inputs should be specified as a dictionary
+                # warnings.warn(
+                #     f"Specification format for output data should be `dict`, "
+                #     f"but was `list`. Converting `list` for one or more variables "
+                #     f"to `dict` based on item order. This compatibility will "
+                #     f"be removed in a future version."
+ # ) + ### + # do the conversion + __inlist = self._save_var_list[_val] + __varname = _val + __varlong = __inlist[4] if len(__inlist) > 4 else None + _vardict = dict( + varname=__varname, # use dict key as varname + varunits=__inlist[1], + vartype=__inlist[2], + vardims=__inlist[3], + varlong=__varlong, + ) + else: + _vardict = self._save_var_list[_val] + _create_grid_variable( - _val, - self._save_var_list[_val][1], - self._save_var_list[_val][2], - self._save_var_list[_val][3], + varname=_vardict["varname"], + varunits=_vardict["varunits"], + vartype=_vardict["vartype"], + vardims=_vardict["vardims"], + varlong=_vardict["varlong"], ) - # set up metadata group and populate variables + # find name for subgroup data and make list + if self._legacy_netcdf: + self._subgroup_name = "meta" + else: + self._subgroup_name = "auxdata" + self.output_netcdf.createGroup(self._subgroup_name) + + # set up function to output additional data in subgroup def _create_meta_variable( - varname, varvalue, varunits, vartype="f4", vardims=() + varname, varvalue, varunits, vartype="f4", vardims=(), varlong=None ): _v = self.output_netcdf.createVariable( - "meta/" + varname, vartype, vardims + f"{self._subgroup_name}/" + varname, vartype, vardims ) _v.units = varunits + # convert string to value from model attrs + if isinstance(varvalue, str): + varvalue = getattr(self, varvalue) + # fill variable _v[:] = varvalue + if varlong is not None: + # long_name is provided, record it + _v.long_name = varlong + else: + if not self._legacy_netcdf: + raise ValueError( + f"long name must be provided for all variables to create a " + f"sandsuet compliant data output, " + f"but was not provided for variable '{varname}'." + ) - self.output_netcdf.createGroup("meta") + # loop through additional data in subgroup for _val in self._save_var_list["meta"].keys(): - # time-varying initialize w/ None value - if self._save_var_list["meta"][_val][0] is None: - _create_meta_variable( - _val, - self._save_var_list["meta"][_val][0], - self._save_var_list["meta"][_val][1], - self._save_var_list["meta"][_val][2], - self._save_var_list["meta"][_val][3], + if isinstance(self._save_var_list["meta"][_val], list): + ### for now, we silently convert to dictionary format + # inputs should be specified as a dictionary + # warnings.warn( + # f"Specification format for output subgroup data should be `dict`, " + # f"but was `list`. Converting `list` for one or more subgroup variables " + # f"to `dict` based on item order. This compatability will " + # f"be removed in a future version." + # ) + ### + # do the conversion + __inlist = self._save_var_list["meta"][_val] + __varname = _val + if __inlist[0] is None: + warnings.warn( + UserWarning( + "Specifying `None` for time varying dimensions " + "of model outputs will soon be deprecated. " + "Change to specifying the name of the " + "variable to save a string, and/or convert to " + "dictionary inputs." 
+ ) + ) + __varvalue = None + else: + __varvalue = getattr(self, __inlist[0]) + __varlong = __inlist[4] if len(__inlist) > 4 else None + _vardict = dict( + varname=__varname, + varvalue=__varvalue, + varunits=__inlist[1], + vartype=__inlist[2], + vardims=__inlist[3], + varlong=__varlong, ) - # for scalars, get the attribute and store it + else: - _create_meta_variable( - _val, - getattr(self, self._save_var_list["meta"][_val][0]), - self._save_var_list["meta"][_val][1], - self._save_var_list["meta"][_val][2], - self._save_var_list["meta"][_val][3], - ) + _vardict = self._save_var_list["meta"][_val] + + _create_meta_variable( + varname=_vardict["varname"], + varvalue=_vardict["varvalue"], + varunits=_vardict["varunits"], + vartype=_vardict["vartype"], + vardims=_vardict["vardims"], + varlong=_vardict["varlong"], + ) _msg = "Output netCDF file created" self.log_info(_msg, verbosity=2) @@ -788,17 +887,54 @@ def init_metadata_list(self) -> None: Sets up the dictionary object for the standard metadata. """ # fixed metadata - self._save_var_list["meta"]["L0"] = ["L0", "cells", "i8", ()] - self._save_var_list["meta"]["N0"] = ["N0", "cells", "i8", ()] - self._save_var_list["meta"]["CTR"] = ["CTR", "cells", "i8", ()] - self._save_var_list["meta"]["dx"] = ["dx", "meters", "f4", ()] - self._save_var_list["meta"]["h0"] = ["h0", "meters", "f4", ()] - self._save_var_list["meta"]["hb"] = ["hb", "meters", "f4", ()] + self._save_var_list["meta"]["L0"] = [ + "L0", + "cells", + "i8", + (), + "channel_entrance__length", + ] + self._save_var_list["meta"]["N0"] = [ + "N0", + "cells", + "i8", + (), + "channel_entrance__width", + ] + self._save_var_list["meta"]["CTR"] = [ + "CTR", + "cells", + "i8", + (), + "channel_entrance__y_position", + ] + self._save_var_list["meta"]["dx"] = [ + "dx", + "meters", + "f4", + (), + "model_grid_cell_edge__length", + ] + self._save_var_list["meta"]["h0"] = [ + "h0", + "meters", + "f4", + (), + "channel_entrance__depth", + ] + self._save_var_list["meta"]["hb"] = [ + "hb", + "meters", + "f4", + (), + "basin_bottom_initial__depth", + ] self._save_var_list["meta"]["cell_type"] = [ "cell_type", "type", "i8", self._netcdf_coords[1:], + "model_grid_cell__type", ] # subsidence metadata if self._toggle_subsidence: @@ -807,37 +943,43 @@ def init_metadata_list(self) -> None: "seconds", "i8", (), + "basin_bottom_vertical_rate_of_change__start_time", ] self._save_var_list["meta"]["sigma"] = [ "sigma", "meters per timestep", "f4", self._netcdf_coords[1:], + "basin_bottom__vertical_rate_of_change", ] # time-varying metadata self._save_var_list["meta"]["H_SL"] = [ - None, + "H_SL", "meters", "f4", (self._netcdf_coords[0]), + "basin_water_surface__elevation", ] self._save_var_list["meta"]["f_bedload"] = [ - None, + "f_bedload", "fraction", "f4", (self._netcdf_coords[0]), + "channel_entrance_water_sediment_sand__volume_fraction", ] self._save_var_list["meta"]["C0_percent"] = [ - None, + "C0_percent", "percent", "f4", (self._netcdf_coords[0]), + "channel_entrance__water_sediment__volume_percent", ] self._save_var_list["meta"]["u0"] = [ - None, + "u0", "meters per second", "f4", (self._netcdf_coords[0]), + "channel_entrance__speed", ] def _load_past_etas(self, checkpoint): @@ -1030,6 +1172,14 @@ def load_checkpoint(self, defer_output: bool = False) -> None: # set object attribute for model self.output_netcdf = Dataset(file_path, "r+", format="NETCDF4") + # find subgroup name, supporting legacy file format + if "meta" in self.output_netcdf.groups.keys(): + self._subgroup_name = "meta" + elif 
"auxdata" in self.output_netcdf.groups.keys(): + self._subgroup_name = "auxdata" + else: + self._subgroup_name = self.output_netcdf.groups.keys()[0] + # synch netcdf file self.output_netcdf.sync() diff --git a/pyDeltaRCM/iteration_tools.py b/pyDeltaRCM/iteration_tools.py index 8cab5897..3ba7d605 100644 --- a/pyDeltaRCM/iteration_tools.py +++ b/pyDeltaRCM/iteration_tools.py @@ -254,7 +254,10 @@ def save_grids_and_figs(self) -> None: self.log_info(_msg, verbosity=1) if self._save_metadata or self._save_any_grids: - self.output_netcdf.variables["time"][save_idx] = self._time + if self._legacy_netcdf: + self.output_netcdf.variables["time"][save_idx] = self._time + else: + self.output_netcdf.variables["seconds"][save_idx] = self._time # ------------------ Figures ------------------ if len(self._save_fig_list) > 0: @@ -302,8 +305,18 @@ def save_grids_and_figs(self) -> None: _var_list = list(self._save_var_list.keys()) _var_list.remove("meta") for _val in _var_list: + # get the inital value from either list or dict + if isinstance(self._save_var_list[_val], list): + _modelvar = self._save_var_list[_val][0] + _ncvar = _val + else: + _modelvar = self._save_var_list[_val]["varvalue"] + _ncvar = self._save_var_list[_val]["varname"] + self.save_grids( - _val, getattr(self, self._save_var_list[_val][0]), save_idx + var_name=_ncvar, + var=getattr(self, _modelvar), + save_idx=save_idx, ) # ------------------ metadata ------------------ @@ -312,9 +325,27 @@ def save_grids_and_figs(self) -> None: self.log_info(_msg, verbosity=2) for _val in self._save_var_list["meta"].keys(): + # get the values from either list or dict + if isinstance(self._save_var_list["meta"][_val], list): + _dims = len(self._save_var_list["meta"][_val][3]) + _modelvar = self._save_var_list["meta"][_val][0] + _ncvar = _val + else: + _dims = len(self._save_var_list["meta"][_val]["vardims"]) + _modelvar = self._save_var_list["meta"][_val]["varvalue"] + _ncvar = self._save_var_list["meta"][_val]["varname"] + + # safety check, if None, replace with name of key (_val) + if _modelvar is None: + _modelvar = _val + # use knowledge of time-varying values to save them - if self._save_var_list["meta"][_val][0] is None: - self.output_netcdf["meta"][_val][save_idx] = getattr(self, _val) + if _dims > 2: + self.output_netcdf[self._subgroup_name][_ncvar][save_idx] = getattr( + self, _modelvar + ) + else: + pass # do not re-save values that do not change over time # -------------------- sync -------------------- if self._save_metadata or self._save_any_grids: diff --git a/pyDeltaRCM/model.py b/pyDeltaRCM/model.py index 790883f6..d8e6d4c9 100644 --- a/pyDeltaRCM/model.py +++ b/pyDeltaRCM/model.py @@ -839,7 +839,13 @@ def save_eta_grids(self) -> bool: @save_eta_grids.setter def save_eta_grids(self, save_eta_grids: bool) -> None: if (save_eta_grids is True) and ("eta" not in self._save_var_list.keys()): - self._save_var_list["eta"] = ["eta", "meters", "f4", self._netcdf_coords] + self._save_var_list["eta"] = [ + "eta", + "meters", + "f4", + self._netcdf_coords, + "channel_bottom__elevation", + ] elif (save_eta_grids is False) and ("eta" in self._save_var_list.keys()): del self._save_var_list["eta"] self._save_eta_grids = save_eta_grids @@ -859,6 +865,7 @@ def save_stage_grids(self, save_stage_grids: bool) -> None: "meters", "f4", self._netcdf_coords, + "channel_water_surface__elevation", ] elif (save_stage_grids is False) and ("stage" in self._save_var_list.keys()): del self._save_var_list["stage"] @@ -879,6 +886,7 @@ def save_depth_grids(self, 
save_depth_grids: bool) -> None: "meters", "f4", self._netcdf_coords, + "channel_water__thickness", ] elif (save_depth_grids is False) and ("depth" in self._save_var_list.keys()): del self._save_var_list["depth"] @@ -901,6 +909,7 @@ def save_discharge_grids(self, save_discharge_grids: bool) -> None: "cubic meters per second", "f4", self._netcdf_coords, + "channel_water_flowing__volume_rate", ] elif (save_discharge_grids is False) and ( "discharge" in self._save_var_list.keys() @@ -925,6 +934,7 @@ def save_velocity_grids(self, save_velocity_grids: bool) -> None: "meters per second", "f4", self._netcdf_coords, + "channel_water_flowing__speed", ] elif (save_velocity_grids is False) and ( "velocity" in self._save_var_list.keys() @@ -949,6 +959,7 @@ def save_sedflux_grids(self, save_sedflux_grids: bool) -> None: "cubic meters per second", "f4", self._netcdf_coords, + "channel_water_sediment_flowing__volume_rate", ] elif (save_sedflux_grids is False) and ( "sedflux" in self._save_var_list.keys() @@ -974,6 +985,7 @@ def save_sandfrac_grids(self, save_sandfrac_grids: bool) -> None: "fraction", "f4", self._netcdf_coords, + "channel_bottom_sediment_sand__volume_fraction", ] elif (save_sandfrac_grids is False) and ( "sandfrac" in self._save_var_list.keys() @@ -997,6 +1009,7 @@ def save_discharge_components(self, save_discharge_components: bool) -> None: "cubic meters per second", "f4", self._netcdf_coords, + "channel_water_flowing__x_component_of_volume_rate", ] if "discharge_y" not in self._save_var_list.keys(): self._save_var_list["discharge_y"] = [ @@ -1004,6 +1017,7 @@ def save_discharge_components(self, save_discharge_components: bool) -> None: "cubic meters per second", "f4", self._netcdf_coords, + "channel_water_flowing__y_component_of_volume_rate", ] elif save_discharge_components is False: if "discharge_x" in self._save_var_list.keys(): @@ -1028,6 +1042,7 @@ def save_velocity_components(self, save_velocity_components: bool) -> None: "meters per second", "f4", self._netcdf_coords, + "channel_water_flowing__x_component_of_speed", ] if "velocity_y" not in self._save_var_list.keys(): self._save_var_list["velocity_y"] = [ @@ -1035,6 +1050,7 @@ def save_velocity_components(self, save_velocity_components: bool) -> None: "meters per second", "f4", self._netcdf_coords, + "channel_water_flowing__y_component_of_speed", ] elif save_velocity_components is False: if "velocity_x" in self._save_var_list.keys(): @@ -1330,49 +1346,41 @@ def legacy_netcdf(self) -> bool: """Enable output in legacy netCDF format. Default behavior, legacy_netcdf: False, is for the model to use the - `v2.1.0` output netCDF format. The updated format is configured - to match the input expected by `xarray`, which eases interaction with - model outputs. The change in format is from inconsistently named - dimensions and *coordinate variables*, to homogeneous definitions. - Also, the legacy format specified the variables `x` and `y` as 2d - grids, whereas the updated format uses 1d coordinate arrays. - - .. important:: - - The behavior of the legacy option, and the new format is expected - to change in version 2.2.0 With v2.2.0 the default output file - will comply with the sandsuet data specification, and the - `legacy_output=True` option will output the current - configuration. The core data will not change with v2.2, but the - names and attributes of components of the data output is expected - to change. 
-
-        +-------------+-------------------+---------------------------------+
-        |             | default           | legacy                          |
-        +=============+===================+=================================+
-        | dimensions  | `time`, `x`, `y`  | `total_time`, `length`, `width` |
-        +-------------+-------------------+---------------------------------+
-        | variables   | `time`, `x`, `y`  | `time`, `y`, `x`; x, y as 2D    |
-        +-------------+-------------------+---------------------------------+
-        | data        | `t-x-y` array     | `t-y-x` array                   |
-        +-------------+-------------------+---------------------------------+
+        output netCDF format established in `v2.2.0`.
+
+        The `v2.2.0` format is configured to match the input expected by
+        `xarray` and in compliance with the *sandsuet* data specification.
+        The legacy format (`legacy_netcdf=True`) provides the `v2.1.9`
+        and earlier specification.
+
+        +-------------+----------------------+---------------------+
+        |             | default              | legacy              |
+        +=============+======================+=====================+
+        | dimensions  | `seconds`, `x`, `y`  | `time`, `x`, `y`    |
+        +-------------+----------------------+---------------------+
+        | variables   | `seconds`, `x`, `y`  | `time`, `x`, `y`    |
+        +-------------+----------------------+---------------------+
+        | data        | `t-x-y` array        | `t-x-y` array       |
+        +-------------+----------------------+---------------------+
+
+        The major differences are in the naming of output dimensions, and
+        the requirement that the file **must** meet sandsuet specifications
+        if `legacy_netcdf=False`. This requires that all variables include
+        a description of the variable, which is saved in the `long_name`
+        attribute of the netCDF variable.

         .. hint::

-            If you are beginning a new project, use `legacy_netcdf == False`,
-            and update scripts accordingly.
+            If you are beginning a new project, use `legacy_netcdf=False`, and
+            update any old scripts or model classes accordingly. The old
+            behavior is likely to be deprecated in the future!
         """
+        # DEV NOTE: do not change legacy output behavior prior to v2.3.0,
+        # after which it can be deprecated or changed again.
         return self._legacy_netcdf

     @legacy_netcdf.setter
     def legacy_netcdf(self, legacy_netcdf: bool) -> None:
-        if legacy_netcdf:
-            warnings.warn(
-                "The legacy version of the NetCDF output is "
-                "expected to change with v2.2. The old `legagcy` "
-                "file format will no longer be available, and "
-                "will be replaced by the current file format."
-            )
         self._legacy_netcdf = legacy_netcdf

     @property
diff --git a/tests/integration/test_checkpointing.py b/tests/integration/test_checkpointing.py
index 862cfaaa..47dad4c8 100644
--- a/tests/integration/test_checkpointing.py
+++ b/tests/integration/test_checkpointing.py
@@ -18,8 +18,9 @@

 @mock.patch(
-    'pyDeltaRCM.iteration_tools.iteration_tools.solve_water_and_sediment_timestep',
-    new=utilities.FastIteratingDeltaModel.solve_water_and_sediment_timestep)
+    "pyDeltaRCM.iteration_tools.iteration_tools.solve_water_and_sediment_timestep",
+    new=utilities.FastIteratingDeltaModel.solve_water_and_sediment_timestep,
+)
 class TestCheckpointingIntegrations:
     """
     The above patch implements an augmented DeltaModel from `utilities`. In
@@ -36,10 +37,10 @@ def test_simple_checkpoint(self, tmp_path: Path) -> None:

         Also, checks resumed model against another checkpoint run.
""" # define a yaml for the longer model run - file_name = 'base_run.yaml' + file_name = "base_run.yaml" base_p, base_f = utilities.create_temporary_file(tmp_path, file_name) - utilities.write_parameter_to_file(base_f, 'out_dir', tmp_path / 'test') - utilities.write_parameter_to_file(base_f, 'save_checkpoint', True) + utilities.write_parameter_to_file(base_f, "out_dir", tmp_path / "test") + utilities.write_parameter_to_file(base_f, "save_checkpoint", True) base_f.close() longModel = DeltaModel(input_file=base_p) @@ -49,10 +50,10 @@ def test_simple_checkpoint(self, tmp_path: Path) -> None: longModel.finalize() # try defining a new model but plan to load checkpoint from longModel - file_name = 'base_run.yaml' + file_name = "base_run.yaml" base_p, base_f = utilities.create_temporary_file(tmp_path, file_name) - utilities.write_parameter_to_file(base_f, 'out_dir', tmp_path / 'test') - utilities.write_parameter_to_file(base_f, 'resume_checkpoint', True) + utilities.write_parameter_to_file(base_f, "out_dir", tmp_path / "test") + utilities.write_parameter_to_file(base_f, "resume_checkpoint", True) base_f.close() resumeModel = DeltaModel(input_file=base_p) @@ -74,10 +75,10 @@ def test_simple_checkpoint(self, tmp_path: Path) -> None: assert np.all(longModel.active_layer == resumeModel.active_layer) # define another model that loads the checkpoint - file_name = 'base_run.yaml' + file_name = "base_run.yaml" base_p, base_f = utilities.create_temporary_file(tmp_path, file_name) - utilities.write_parameter_to_file(base_f, 'out_dir', tmp_path / 'test') - utilities.write_parameter_to_file(base_f, 'resume_checkpoint', True) + utilities.write_parameter_to_file(base_f, "out_dir", tmp_path / "test") + utilities.write_parameter_to_file(base_f, "resume_checkpoint", True) base_f.close() resumeModel2 = DeltaModel(input_file=base_p) @@ -99,14 +100,14 @@ def test_simple_checkpoint(self, tmp_path: Path) -> None: def test_checkpoint_nc(self, tmp_path: Path) -> None: """Test the netCDF that is written to by the checkpointing.""" # define a yaml for the base model run - file_name = 'base_run.yaml' + file_name = "base_run.yaml" base_p, base_f = utilities.create_temporary_file(tmp_path, file_name) - utilities.write_parameter_to_file(base_f, 'out_dir', tmp_path / 'test') - utilities.write_parameter_to_file(base_f, 'save_eta_grids', True) - utilities.write_parameter_to_file(base_f, 'save_depth_grids', True) - utilities.write_parameter_to_file(base_f, 'save_discharge_grids', True) - utilities.write_parameter_to_file(base_f, 'save_sandfrac_grids', True) - utilities.write_parameter_to_file(base_f, 'save_checkpoint', True) + utilities.write_parameter_to_file(base_f, "out_dir", tmp_path / "test") + utilities.write_parameter_to_file(base_f, "save_eta_grids", True) + utilities.write_parameter_to_file(base_f, "save_depth_grids", True) + utilities.write_parameter_to_file(base_f, "save_discharge_grids", True) + utilities.write_parameter_to_file(base_f, "save_sandfrac_grids", True) + utilities.write_parameter_to_file(base_f, "save_checkpoint", True) base_f.close() baseModel = DeltaModel(input_file=base_p) @@ -117,7 +118,7 @@ def test_checkpoint_nc(self, tmp_path: Path) -> None: # force the model run to end immmediately after exporting a checkpoint nt_var = 0 - while (baseModel._save_time_since_checkpoint != 0): + while baseModel._save_time_since_checkpoint != 0: baseModel.update() nt_var += 1 @@ -128,15 +129,15 @@ def test_checkpoint_nc(self, tmp_path: Path) -> None: assert baseModel.time == baseModel._dt * (nt_base + nt_var) # try 
defining a new model but plan to load checkpoint from baseModel - file_name = 'base_run.yaml' + file_name = "base_run.yaml" base_p, base_f = utilities.create_temporary_file(tmp_path, file_name) - utilities.write_parameter_to_file(base_f, 'out_dir', tmp_path / 'test') - utilities.write_parameter_to_file(base_f, 'save_eta_grids', True) - utilities.write_parameter_to_file(base_f, 'save_depth_grids', True) - utilities.write_parameter_to_file(base_f, 'save_discharge_grids', True) - utilities.write_parameter_to_file(base_f, 'save_sandfrac_grids', True) - utilities.write_parameter_to_file(base_f, 'save_checkpoint', False) - utilities.write_parameter_to_file(base_f, 'resume_checkpoint', True) + utilities.write_parameter_to_file(base_f, "out_dir", tmp_path / "test") + utilities.write_parameter_to_file(base_f, "save_eta_grids", True) + utilities.write_parameter_to_file(base_f, "save_depth_grids", True) + utilities.write_parameter_to_file(base_f, "save_discharge_grids", True) + utilities.write_parameter_to_file(base_f, "save_sandfrac_grids", True) + utilities.write_parameter_to_file(base_f, "save_checkpoint", False) + utilities.write_parameter_to_file(base_f, "resume_checkpoint", True) base_f.close() resumeModel = DeltaModel(input_file=base_p) @@ -153,82 +154,140 @@ def test_checkpoint_nc(self, tmp_path: Path) -> None: assert resumeModel.time > baseModel.time # assert that output netCDF4 exists - exp_path_nc = os.path.join(tmp_path / 'test', 'pyDeltaRCM_output.nc') + exp_path_nc = os.path.join(tmp_path / "test", "pyDeltaRCM_output.nc") assert os.path.isfile(exp_path_nc) # load it into memory and check values in the netCDF4 - output = Dataset(exp_path_nc, 'r', allow_pickle=True) + output = Dataset(exp_path_nc, "r", allow_pickle=True) out_vars = output.variables.keys() # check that expected variables are in the file - assert 'x' in out_vars - assert 'y' in out_vars - assert 'time' in out_vars - assert 'eta' in out_vars - assert 'depth' in out_vars - assert 'discharge' in out_vars - assert 'sandfrac' in out_vars + assert "x" in out_vars + assert "y" in out_vars + assert "seconds" in out_vars + assert "eta" in out_vars + assert "depth" in out_vars + assert "discharge" in out_vars + assert "sandfrac" in out_vars # check attributes of variables - assert output['time'][0].tolist() == 0.0 - assert output['time'][-1] == resumeModel.time - assert output['time'][-1].tolist() == resumeModel._dt * \ - (nt_base + nt_var + nt_resume) - assert output['eta'][0].shape == resumeModel.eta.shape - assert output['eta'][-1].shape == resumeModel.eta.shape - assert output['depth'][-1].shape == resumeModel.eta.shape - assert output['discharge'][-1].shape == resumeModel.eta.shape - assert output['sandfrac'][-1].shape == resumeModel.eta.shape + assert output["seconds"][0].tolist() == 0.0 + assert output["seconds"][-1] == resumeModel.time + assert output["seconds"][-1].tolist() == resumeModel._dt * ( + nt_base + nt_var + nt_resume + ) + assert output["eta"][0].shape == resumeModel.eta.shape + assert output["eta"][-1].shape == resumeModel.eta.shape + assert output["depth"][-1].shape == resumeModel.eta.shape + assert output["discharge"][-1].shape == resumeModel.eta.shape + assert output["sandfrac"][-1].shape == resumeModel.eta.shape # check the metadata - assert output['meta']['L0'][:] == resumeModel.L0 - assert output['meta']['N0'][:] == resumeModel.N0 - assert output['meta']['CTR'][:] == resumeModel.CTR - assert output['meta']['dx'][:] == resumeModel.dx - assert output['meta']['h0'][:] == resumeModel.h0 - assert 
np.all(output['meta']['cell_type'][:] == resumeModel.cell_type)
-        assert output['meta']['H_SL'][-1].data == resumeModel.H_SL
-        assert output['meta']['f_bedload'][-1].data == resumeModel.f_bedload
-        C0_from_file = float(output['meta']['C0_percent'][-1].data)
+        assert output["auxdata"]["L0"][:] == resumeModel.L0
+        assert output["auxdata"]["N0"][:] == resumeModel.N0
+        assert output["auxdata"]["CTR"][:] == resumeModel.CTR
+        assert output["auxdata"]["dx"][:] == resumeModel.dx
+        assert output["auxdata"]["h0"][:] == resumeModel.h0
+        assert np.all(output["auxdata"]["cell_type"][:] == resumeModel.cell_type)
+        assert output["auxdata"]["H_SL"][-1].data == resumeModel.H_SL
+        assert output["auxdata"]["f_bedload"][-1].data == resumeModel.f_bedload
+        C0_from_file = float(output["auxdata"]["C0_percent"][-1].data)
         assert pytest.approx(C0_from_file) == resumeModel.C0_percent
-        assert output['meta']['u0'][-1].data == resumeModel.u0
+        assert output["auxdata"]["u0"][-1].data == resumeModel.u0

         # checkpoint interval aligns w/ timestep dt so these should match
-        assert output['time'][-1].tolist() == resumeModel.time
+        assert output["seconds"][-1].tolist() == resumeModel.time
+
+    def test_checkpoint_nc_legacy(self, tmp_path: Path) -> None:
+        """Test the netCDF written by checkpointing with the legacy format."""
+        # define a yaml for the base model run
+        file_name = "base_run.yaml"
+        base_p, base_f = utilities.create_temporary_file(tmp_path, file_name)
+        utilities.write_parameter_to_file(base_f, "out_dir", tmp_path / "test")
+        utilities.write_parameter_to_file(base_f, "save_eta_grids", True)
+        utilities.write_parameter_to_file(base_f, "legacy_netcdf", True)
+        utilities.write_parameter_to_file(base_f, "save_checkpoint", True)
+        base_f.close()
+        baseModel = DeltaModel(input_file=base_p)
+
+        # run for some base number of steps
+        nt_base = 50
+        for _ in range(0, 50):
+            baseModel.update()
+
+        # force the model run to end immediately after exporting a checkpoint
+        nt_var = 0
+        while baseModel._save_time_since_checkpoint != 0:
+            baseModel.update()
+            nt_var += 1
+
+        # then finalize
+        baseModel.finalize()
+
+        # try defining a new model but plan to load checkpoint from baseModel
+        file_name = "base_run.yaml"
+        base_p, base_f = utilities.create_temporary_file(tmp_path, file_name)
+        utilities.write_parameter_to_file(base_f, "out_dir", tmp_path / "test")
+        utilities.write_parameter_to_file(base_f, "save_eta_grids", True)
+        utilities.write_parameter_to_file(base_f, "legacy_netcdf", True)
+        utilities.write_parameter_to_file(base_f, "save_checkpoint", False)
+        utilities.write_parameter_to_file(base_f, "resume_checkpoint", True)
+        base_f.close()
+        resumeModel = DeltaModel(input_file=base_p)
+
+        assert resumeModel.time == baseModel.time  # same when resumed
+
+        # advance it until output_data has been called again
+        nt_resume = 0
+        while (resumeModel._save_time_since_data != 0) or (nt_resume < 50):
+            resumeModel.update()
+            nt_resume += 1
+        resumeModel.finalize()
+
+        # assert that output netCDF4 exists
+        exp_path_nc = os.path.join(tmp_path / "test", "pyDeltaRCM_output.nc")
+        assert os.path.isfile(exp_path_nc)
+
+        # load it into memory and check values in the netCDF4
+        output = Dataset(exp_path_nc, "r", allow_pickle=True)
+        out_vars = output.variables.keys()
+
+        # check only the things expected to differ in the legacy format
+        assert "time" in out_vars
+        assert "meta" in output.groups.keys()

     def test_checkpoint_diff_dt(self, tmp_path: Path) -> None:
         """Test when checkpoint_dt does not match dt or save_dt."""
         # define a yaml for the base model run
-
file_name = 'base_run.yaml' + file_name = "base_run.yaml" base_p, base_f = utilities.create_temporary_file(tmp_path, file_name) - utilities.write_parameter_to_file(base_f, 'save_eta_grids', True) - utilities.write_parameter_to_file(base_f, 'save_depth_grids', True) - utilities.write_parameter_to_file(base_f, 'save_discharge_grids', True) - utilities.write_parameter_to_file(base_f, 'save_checkpoint', True) - utilities.write_parameter_to_file(base_f, 'out_dir', tmp_path / 'test') + utilities.write_parameter_to_file(base_f, "save_eta_grids", True) + utilities.write_parameter_to_file(base_f, "save_depth_grids", True) + utilities.write_parameter_to_file(base_f, "save_discharge_grids", True) + utilities.write_parameter_to_file(base_f, "save_checkpoint", True) + utilities.write_parameter_to_file(base_f, "out_dir", tmp_path / "test") base_f.close() baseModel = DeltaModel(input_file=base_p) # modify the checkpoint dt to be different than save_dt - baseModel._checkpoint_dt = (baseModel.save_dt * 0.65) + baseModel._checkpoint_dt = baseModel.save_dt * 0.65 for _ in range(0, 50): baseModel.update() baseModel.finalize() assert baseModel.time == baseModel._dt * 50 - baseModelSavedTime = (baseModel.time - - baseModel._save_time_since_checkpoint) + baseModelSavedTime = baseModel.time - baseModel._save_time_since_checkpoint assert baseModelSavedTime > 0 # try defining a new model but plan to load checkpoint from baseModel - file_name = 'base_run.yaml' + file_name = "base_run.yaml" base_p, base_f = utilities.create_temporary_file(tmp_path, file_name) - utilities.write_parameter_to_file(base_f, 'save_eta_grids', True) - utilities.write_parameter_to_file(base_f, 'save_depth_grids', True) - utilities.write_parameter_to_file(base_f, 'save_discharge_grids', True) - utilities.write_parameter_to_file(base_f, 'save_checkpoint', False) - utilities.write_parameter_to_file(base_f, 'resume_checkpoint', True) - utilities.write_parameter_to_file(base_f, 'out_dir', tmp_path / 'test') + utilities.write_parameter_to_file(base_f, "save_eta_grids", True) + utilities.write_parameter_to_file(base_f, "save_depth_grids", True) + utilities.write_parameter_to_file(base_f, "save_discharge_grids", True) + utilities.write_parameter_to_file(base_f, "save_checkpoint", False) + utilities.write_parameter_to_file(base_f, "resume_checkpoint", True) + utilities.write_parameter_to_file(base_f, "out_dir", tmp_path / "test") base_f.close() resumeModel = DeltaModel(input_file=base_p) @@ -242,31 +301,31 @@ def test_checkpoint_diff_dt(self, tmp_path: Path) -> None: resumeModel.finalize() # assert that output netCDF4 exists - exp_path_nc = os.path.join(tmp_path / 'test', 'pyDeltaRCM_output.nc') + exp_path_nc = os.path.join(tmp_path / "test", "pyDeltaRCM_output.nc") assert os.path.isfile(exp_path_nc) # load it into memory and check values in the netCDF4 - output = Dataset(exp_path_nc, 'r', allow_pickle=True) + output = Dataset(exp_path_nc, "r", allow_pickle=True) out_vars = output.variables.keys() # check that expected variables are in the file - assert 'x' in out_vars - assert 'y' in out_vars - assert 'time' in out_vars - assert 'eta' in out_vars - assert 'depth' in out_vars - assert 'discharge' in out_vars + assert "x" in out_vars + assert "y" in out_vars + assert "seconds" in out_vars + assert "eta" in out_vars + assert "depth" in out_vars + assert "discharge" in out_vars # check attributes of variables - assert output['time'][0].tolist() == 0.0 - assert output['time'][-1].tolist() == resumeModel.time + assert output["seconds"][0].tolist() == 0.0 
+ assert output["seconds"][-1].tolist() == resumeModel.time def test_multi_checkpoints(self, tmp_path: Path) -> None: """Test using checkpoints multiple times for a given model run.""" # define a yaml for the base model run - file_name = 'base_run.yaml' + file_name = "base_run.yaml" base_p, base_f = utilities.create_temporary_file(tmp_path, file_name) - utilities.write_parameter_to_file(base_f, 'save_eta_grids', True) - utilities.write_parameter_to_file(base_f, 'save_checkpoint', True) - utilities.write_parameter_to_file(base_f, 'out_dir', tmp_path / 'test') + utilities.write_parameter_to_file(base_f, "save_eta_grids", True) + utilities.write_parameter_to_file(base_f, "save_checkpoint", True) + utilities.write_parameter_to_file(base_f, "out_dir", tmp_path / "test") base_f.close() baseModel = DeltaModel(input_file=base_p) @@ -276,12 +335,12 @@ def test_multi_checkpoints(self, tmp_path: Path) -> None: baseModel.finalize() # try defining a new model but plan to load checkpoint from baseModel - file_name = 'base_run.yaml' + file_name = "base_run.yaml" base_p, base_f = utilities.create_temporary_file(tmp_path, file_name) - utilities.write_parameter_to_file(base_f, 'save_eta_grids', True) - utilities.write_parameter_to_file(base_f, 'save_checkpoint', True) - utilities.write_parameter_to_file(base_f, 'resume_checkpoint', True) - utilities.write_parameter_to_file(base_f, 'out_dir', tmp_path / 'test') + utilities.write_parameter_to_file(base_f, "save_eta_grids", True) + utilities.write_parameter_to_file(base_f, "save_checkpoint", True) + utilities.write_parameter_to_file(base_f, "resume_checkpoint", True) + utilities.write_parameter_to_file(base_f, "out_dir", tmp_path / "test") base_f.close() resumeModel = DeltaModel(input_file=base_p) @@ -304,28 +363,28 @@ def test_multi_checkpoints(self, tmp_path: Path) -> None: nt_resume02 += 1 # assert that output netCDF4 exists - exp_path_nc = os.path.join(tmp_path / 'test', 'pyDeltaRCM_output.nc') + exp_path_nc = os.path.join(tmp_path / "test", "pyDeltaRCM_output.nc") assert os.path.isfile(exp_path_nc) # load it into memory and check values in the netCDF4 - output = Dataset(exp_path_nc, 'r', allow_pickle=True) + output = Dataset(exp_path_nc, "r", allow_pickle=True) out_vars = output.variables.keys() # check that expected variables are in the file - assert 'x' in out_vars - assert 'y' in out_vars - assert 'time' in out_vars - assert 'eta' in out_vars + assert "x" in out_vars + assert "y" in out_vars + assert "seconds" in out_vars + assert "eta" in out_vars # check attributes of variables - assert output['time'][0].tolist() == 0.0 - assert output['time'][-1].tolist() == resumeModel02.time + assert output["seconds"][0].tolist() == 0.0 + assert output["seconds"][-1].tolist() == resumeModel02.time def test_load_nocheckpoint(self, tmp_path: Path) -> None: """Try loading a checkpoint file when one doesn't exist.""" # define a yaml - file_name = 'trial_run.yaml' + file_name = "trial_run.yaml" base_p, base_f = utilities.create_temporary_file(tmp_path, file_name) - utilities.write_parameter_to_file(base_f, 'resume_checkpoint', True) - utilities.write_parameter_to_file(base_f, 'out_dir', tmp_path / 'test') + utilities.write_parameter_to_file(base_f, "resume_checkpoint", True) + utilities.write_parameter_to_file(base_f, "out_dir", tmp_path / "test") base_f.close() # try loading the model yaml despite no checkpoint existing @@ -333,17 +392,17 @@ def test_load_nocheckpoint(self, tmp_path: Path) -> None: _ = DeltaModel(input_file=base_p) @pytest.mark.skipif( - 
platform.system() != 'Linux', - reason='Parallel support only on Linux OS.') + platform.system() != "Linux", reason="Parallel support only on Linux OS." + ) def test_py_hlvl_parallel_checkpoint(self, tmp_path: Path) -> None: """Test checkpointing in parallel.""" - file_name = 'user_parameters.yaml' + file_name = "user_parameters.yaml" p, f = utilities.create_temporary_file(tmp_path, file_name) - utilities.write_parameter_to_file(f, 'ensemble', 2) - utilities.write_parameter_to_file(f, 'out_dir', tmp_path / 'test') - utilities.write_parameter_to_file(f, 'parallel', 2) - utilities.write_parameter_to_file(f, 'save_checkpoint', True) - utilities.write_parameter_to_file(f, 'save_eta_grids', True) + utilities.write_parameter_to_file(f, "ensemble", 2) + utilities.write_parameter_to_file(f, "out_dir", tmp_path / "test") + utilities.write_parameter_to_file(f, "parallel", 2) + utilities.write_parameter_to_file(f, "save_checkpoint", True) + utilities.write_parameter_to_file(f, "save_eta_grids", True) f.close() pp = preprocessor.Preprocessor(input_file=p, timesteps=50) # assertions for job creation @@ -356,43 +415,44 @@ def test_py_hlvl_parallel_checkpoint(self, tmp_path: Path) -> None: # compute the expected final time recorded _dt = pp.job_list[1].deltamodel._dt _checkpoint_dt = pp.job_list[1].deltamodel._checkpoint_dt - expected_save_interval = (((_checkpoint_dt // _dt) + 1) * _dt) - expected_last_save_time = (((50 * _dt) // expected_save_interval) * - expected_save_interval) + expected_save_interval = ((_checkpoint_dt // _dt) + 1) * _dt + expected_last_save_time = ( + (50 * _dt) // expected_save_interval + ) * expected_save_interval # assertions after running jobs assert isinstance(pp.job_list[0], preprocessor._ParallelJob) assert pp._is_completed is True exp_path_nc0 = os.path.join( - tmp_path / 'test', 'job_000', 'pyDeltaRCM_output.nc') + tmp_path / "test", "job_000", "pyDeltaRCM_output.nc" + ) exp_path_nc1 = os.path.join( - tmp_path / 'test', 'job_001', 'pyDeltaRCM_output.nc') + tmp_path / "test", "job_001", "pyDeltaRCM_output.nc" + ) assert os.path.isfile(exp_path_nc0) assert os.path.isfile(exp_path_nc1) # check that checkpoint files exist - exp_path_ckpt0 = os.path.join( - tmp_path / 'test', 'job_000', 'checkpoint.npz') - exp_path_ckpt1 = os.path.join( - tmp_path / 'test', 'job_001', 'checkpoint.npz') + exp_path_ckpt0 = os.path.join(tmp_path / "test", "job_000", "checkpoint.npz") + exp_path_ckpt1 = os.path.join(tmp_path / "test", "job_001", "checkpoint.npz") assert os.path.isfile(exp_path_ckpt0) assert os.path.isfile(exp_path_ckpt1) # load one output files and check values out_old = Dataset(exp_path_nc1) - assert 'meta' in out_old.groups.keys() - assert out_old['time'][0].tolist() == 0.0 - assert out_old['time'][-1].tolist() == expected_last_save_time + assert "auxdata" in out_old.groups.keys() + assert out_old["seconds"][0].tolist() == 0.0 + assert out_old["seconds"][-1].tolist() == expected_last_save_time # close netCDF file out_old.close() # try to resume jobs - file_name = 'user_parameters.yaml' + file_name = "user_parameters.yaml" p, f = utilities.create_temporary_file(tmp_path, file_name) - utilities.write_parameter_to_file(f, 'ensemble', 2) - utilities.write_parameter_to_file(f, 'out_dir', tmp_path / 'test') - utilities.write_parameter_to_file(f, 'parallel', 2) - utilities.write_parameter_to_file(f, 'resume_checkpoint', True) - utilities.write_parameter_to_file(f, 'save_eta_grids', True) + utilities.write_parameter_to_file(f, "ensemble", 2) + utilities.write_parameter_to_file(f, 
"out_dir", tmp_path / "test") + utilities.write_parameter_to_file(f, "parallel", 2) + utilities.write_parameter_to_file(f, "resume_checkpoint", True) + utilities.write_parameter_to_file(f, "save_eta_grids", True) f.close() pp = preprocessor.Preprocessor(input_file=p, timesteps=50) # assertions for job creation @@ -406,36 +466,34 @@ def test_py_hlvl_parallel_checkpoint(self, tmp_path: Path) -> None: assert isinstance(pp.job_list[0], preprocessor._ParallelJob) assert pp._is_completed is True exp_path_nc0 = os.path.join( - tmp_path / 'test', 'job_000', 'pyDeltaRCM_output.nc') + tmp_path / "test", "job_000", "pyDeltaRCM_output.nc" + ) exp_path_nc1 = os.path.join( - tmp_path / 'test', 'job_001', 'pyDeltaRCM_output.nc') + tmp_path / "test", "job_001", "pyDeltaRCM_output.nc" + ) assert os.path.isfile(exp_path_nc0) assert os.path.isfile(exp_path_nc1) # check that checkpoint files still exist - exp_path_ckpt0 = os.path.join( - tmp_path / 'test', 'job_000', 'checkpoint.npz') - exp_path_ckpt1 = os.path.join( - tmp_path / 'test', 'job_001', 'checkpoint.npz') + exp_path_ckpt0 = os.path.join(tmp_path / "test", "job_000", "checkpoint.npz") + exp_path_ckpt1 = os.path.join(tmp_path / "test", "job_001", "checkpoint.npz") assert os.path.isfile(exp_path_ckpt0) assert os.path.isfile(exp_path_ckpt1) # load one output file to check it out out_fin = Dataset(exp_path_nc1) - assert 'meta' in out_old.groups.keys() - assert out_fin['time'][0].tolist() == 0 - assert out_fin['time'][-1].tolist() == expected_last_save_time * 2 + assert "auxdata" in out_old.groups.keys() + assert out_fin["seconds"][0].tolist() == 0 + assert out_fin["seconds"][-1].tolist() == expected_last_save_time * 2 # close netcdf file out_fin.close() class TestCheckpointingCreatingLoading: - def test_load_checkpoint_with_netcdf(self, tmp_path: Path) -> None: - """Test that a run can be resumed when there are outputs. 
- """ + """Test that a run can be resumed when there are outputs.""" # define a yaml with outputs (defaults will output strata) - p = utilities.yaml_from_dict(tmp_path, 'input.yaml', - {'save_checkpoint': True, - 'save_eta_grids': True}) + p = utilities.yaml_from_dict( + tmp_path, "input.yaml", {"save_checkpoint": True, "save_eta_grids": True} + ) _delta = DeltaModel(input_file=p) # replace eta with a random field for checkpointing success check @@ -447,27 +505,23 @@ def test_load_checkpoint_with_netcdf(self, tmp_path: Path) -> None: _delta.finalize() # paths exists - assert os.path.isfile(os.path.join( - _delta.prefix, 'pyDeltaRCM_output.nc')) - assert os.path.isfile(os.path.join( - _delta.prefix, 'checkpoint.npz')) + assert os.path.isfile(os.path.join(_delta.prefix, "pyDeltaRCM_output.nc")) + assert os.path.isfile(os.path.join(_delta.prefix, "checkpoint.npz")) _delta = [] # clear # can be resumed - p = utilities.yaml_from_dict(tmp_path, 'input.yaml', - {'save_checkpoint': True, - 'resume_checkpoint': True}) + p = utilities.yaml_from_dict( + tmp_path, "input.yaml", {"save_checkpoint": True, "resume_checkpoint": True} + ) _delta = DeltaModel(input_file=p) # check that fields match assert np.all(_delta.eta == _rand_field) def test_create_checkpoint_without_netcdf(self, tmp_path: Path) -> None: - """Test that a checkpoint can be created when there are no outputs - """ + """Test that a checkpoint can be created when there are no outputs""" # define a yaml with NO outputs, but checkpoint - p = utilities.yaml_from_dict(tmp_path, 'input.yaml', - {'save_checkpoint': True}) + p = utilities.yaml_from_dict(tmp_path, "input.yaml", {"save_checkpoint": True}) _delta = DeltaModel(input_file=p) @@ -480,24 +534,21 @@ def test_create_checkpoint_without_netcdf(self, tmp_path: Path) -> None: _delta.finalize() # should be no file - assert not os.path.isfile(os.path.join( - _delta.prefix, 'pyDeltaRCM_output.nc')) + assert not os.path.isfile(os.path.join(_delta.prefix, "pyDeltaRCM_output.nc")) # can be resumed - p = utilities.yaml_from_dict(tmp_path, 'input.yaml', - {'save_checkpoint': True, - 'resume_checkpoint': True}) + p = utilities.yaml_from_dict( + tmp_path, "input.yaml", {"save_checkpoint": True, "resume_checkpoint": True} + ) _delta = DeltaModel(input_file=p) # check that fields match assert np.all(_delta.eta == _rand_field) def test_load_checkpoint_without_netcdf(self, tmp_path: Path) -> None: - """Test that a run can be resumed when there are outputs. 
- """ + """Test that a run can be resumed when there are outputs.""" # define a yaml with NO outputs, but checkpoint - p = utilities.yaml_from_dict(tmp_path, 'input.yaml', - {'save_checkpoint': True}) + p = utilities.yaml_from_dict(tmp_path, "input.yaml", {"save_checkpoint": True}) _delta = DeltaModel(input_file=p) # replace eta with a random field for checkpointing success check @@ -509,43 +560,47 @@ def test_load_checkpoint_without_netcdf(self, tmp_path: Path) -> None: _delta.finalize() # should be no nc file but should be a checkpoint file - assert not os.path.isfile(os.path.join( - _delta.prefix, 'pyDeltaRCM_output.nc')) - assert os.path.isfile(os.path.join( - _delta.prefix, 'checkpoint.npz')) + assert not os.path.isfile(os.path.join(_delta.prefix, "pyDeltaRCM_output.nc")) + assert os.path.isfile(os.path.join(_delta.prefix, "checkpoint.npz")) # now try to resume, will WARN on not finding netcdf - p = utilities.yaml_from_dict(tmp_path, 'input.yaml', - {'save_checkpoint': True, - 'save_eta_grids': True, - 'resume_checkpoint': True}) - with pytest.warns(UserWarning, match=r'NetCDF4 output *.'): + p = utilities.yaml_from_dict( + tmp_path, + "input.yaml", + { + "save_checkpoint": True, + "save_eta_grids": True, + "resume_checkpoint": True, + }, + ) + with pytest.warns(UserWarning, match=r"NetCDF4 output *."): _delta = DeltaModel(input_file=p) # assert that a new output file exists file exists - assert os.path.isfile(os.path.join( - _delta.prefix, 'pyDeltaRCM_output.nc')) - assert os.path.isfile(os.path.join( - _delta.prefix, 'checkpoint.npz')) + assert os.path.isfile(os.path.join(_delta.prefix, "pyDeltaRCM_output.nc")) + assert os.path.isfile(os.path.join(_delta.prefix, "checkpoint.npz")) # check that fields match assert np.all(_delta.eta == _rand_field) @mock.patch( - 'pyDeltaRCM.iteration_tools.iteration_tools.solve_water_and_sediment_timestep', - new=utilities.FastIteratingDeltaModel.solve_water_and_sediment_timestep) + "pyDeltaRCM.iteration_tools.iteration_tools.solve_water_and_sediment_timestep", + new=utilities.FastIteratingDeltaModel.solve_water_and_sediment_timestep, + ) @pytest.mark.skipif( - platform.system() != 'Linux', - reason='Parallel support only on Linux OS.') - def test_load_ckpt_wo_netcdf_parallel_spinup_to_matrix(self, tmp_path: Path) -> None: + platform.system() != "Linux", reason="Parallel support only on Linux OS." + ) + def test_load_ckpt_wo_netcdf_parallel_spinup_to_matrix( + self, tmp_path: Path + ) -> None: """ Test that multiple matrix runs can be resumed from a single checkpoint file, and take advantage of the preprocessor parallel infrastructure. 
""" # define a yaml with an output and checkpoint - p = utilities.yaml_from_dict(tmp_path, 'input.yaml', - {'save_checkpoint': True, - 'save_eta_grids': True}) + p = utilities.yaml_from_dict( + tmp_path, "input.yaml", {"save_checkpoint": True, "save_eta_grids": True} + ) baseModel = DeltaModel(input_file=p) # run base for 2 timesteps @@ -558,39 +613,37 @@ def test_load_ckpt_wo_netcdf_parallel_spinup_to_matrix(self, tmp_path: Path) -> assert bmsi > 0 # check that files exist, and then delete nc - assert os.path.isfile(os.path.join( - baseModel.prefix, 'pyDeltaRCM_output.nc')) - assert os.path.isfile(os.path.join( - baseModel.prefix, 'checkpoint.npz')) + assert os.path.isfile(os.path.join(baseModel.prefix, "pyDeltaRCM_output.nc")) + assert os.path.isfile(os.path.join(baseModel.prefix, "checkpoint.npz")) # open the file and check dimensions - exp_nc_path = os.path.join(baseModel.prefix, 'pyDeltaRCM_output.nc') - output = Dataset(exp_nc_path, 'r', allow_pickle=True) + exp_nc_path = os.path.join(baseModel.prefix, "pyDeltaRCM_output.nc") + output = Dataset(exp_nc_path, "r", allow_pickle=True) out_vars = output.variables.keys() # check that expected variables are in the file - assert 'x' in out_vars - assert 'y' in out_vars - assert 'time' in out_vars - assert 'eta' in out_vars + assert "x" in out_vars + assert "y" in out_vars + assert "seconds" in out_vars + assert "eta" in out_vars # check attributes of variables - assert output['time'].shape[0] == bmsi + assert output["seconds"].shape[0] == bmsi ######################## # set up a matrix of runs - resume_dict = {'save_checkpoint': False, - 'resume_checkpoint': True, - 'save_eta_grids': True, - 'out_dir': os.path.join(tmp_path, 'matrix'), - 'parallel': 4} - _matrix = {'f_bedload': [0.1, 0.2, 0.5, 1]} - resume_dict['matrix'] = _matrix + resume_dict = { + "save_checkpoint": False, + "resume_checkpoint": True, + "save_eta_grids": True, + "out_dir": os.path.join(tmp_path, "matrix"), + "parallel": 4, + } + _matrix = {"f_bedload": [0.1, 0.2, 0.5, 1]} + resume_dict["matrix"] = _matrix # let the preprocessor write the initial matrix and # create a new output netcdf file - pp = preprocessor.Preprocessor( - resume_dict, - timesteps=25) # 2000 + pp = preprocessor.Preprocessor(resume_dict, timesteps=25) # 2000 # now copy the checkpoint for j, j_file in enumerate(pp.file_list): @@ -599,8 +652,9 @@ def test_load_ckpt_wo_netcdf_parallel_spinup_to_matrix(self, tmp_path: Path) -> # copy the spinup checkpoint to each of the folders shutil.copy( - src=os.path.join(baseModel.prefix, 'checkpoint.npz'), - dst=os.path.join(tmp_path, 'matrix', j_folder)) + src=os.path.join(baseModel.prefix, "checkpoint.npz"), + dst=os.path.join(tmp_path, "matrix", j_folder), + ) pp.run_jobs() @@ -610,7 +664,8 @@ def test_load_ckpt_wo_netcdf_parallel_spinup_to_matrix(self, tmp_path: Path) -> # proceed with assertions for j, j_job in enumerate(pp.job_list): exp_nc_path = os.path.join( - tmp_path, 'matrix', j_folder, 'pyDeltaRCM_output.nc') + tmp_path, "matrix", j_folder, "pyDeltaRCM_output.nc" + ) assert os.path.isfile(exp_nc_path) # check all jobs have same dimensionality @@ -618,23 +673,24 @@ def test_load_ckpt_wo_netcdf_parallel_spinup_to_matrix(self, tmp_path: Path) -> assert jsi == fjsi # open the file and check dimensions - output = Dataset(exp_nc_path, 'r', allow_pickle=True) + output = Dataset(exp_nc_path, "r", allow_pickle=True) out_vars = output.variables.keys() # check that expected variables are in the file - assert 'x' in out_vars - assert 'y' in out_vars - assert 'time' in 
     @pytest.mark.skipif(
-        platform.system() == 'Windows',
-        reason='OS differences regarding netCDF permissions.')
+        platform.system() == "Windows",
+        reason="OS differences regarding netCDF permissions.",
+    )
     def test_load_checkpoint_with_open_netcdf(self, tmp_path: Path) -> None:
         """Test what happens if output netCDF file is actually open.
@@ -642,9 +698,9 @@ def test_load_checkpoint_with_open_netcdf(self, tmp_path: Path) -> None:
         process. That situation raises an error for all OS.
         """
         # define a yaml with outputs (defaults will output strata)
-        p = utilities.yaml_from_dict(tmp_path, 'input.yaml',
-                                     {'save_checkpoint': True,
-                                      'save_eta_grids': True})
+        p = utilities.yaml_from_dict(
+            tmp_path, "input.yaml", {"save_checkpoint": True, "save_eta_grids": True}
+        )
         _delta = DeltaModel(input_file=p)

         # replace eta with a random field for checkpointing success check
@@ -656,26 +712,30 @@ def test_load_checkpoint_with_open_netcdf(self, tmp_path: Path) -> None:
         _delta.finalize()

         # paths exist
-        assert os.path.isfile(os.path.join(
-            _delta.prefix, 'pyDeltaRCM_output.nc'))
-        assert os.path.isfile(os.path.join(
-            _delta.prefix, 'checkpoint.npz'))
+        assert os.path.isfile(os.path.join(_delta.prefix, "pyDeltaRCM_output.nc"))
+        assert os.path.isfile(os.path.join(_delta.prefix, "checkpoint.npz"))
         _prefix = _delta.prefix
         _delta = []  # clear

         # open the netCDF file
-        _opened = Dataset(os.path.join(_prefix, 'pyDeltaRCM_output.nc'),
-                          'r+', format='NETCDF4')
+        _opened = Dataset(
+            os.path.join(_prefix, "pyDeltaRCM_output.nc"), "r+", format="NETCDF4"
+        )
         assert type(_opened) == Dataset
-        assert 'eta' in _opened.variables.keys()
+        assert "eta" in _opened.variables.keys()
         # saved grid is the initial one, before random field was assigned
-        assert np.all(_opened.variables['eta'][:].data != _rand_field)
+        assert np.all(_opened.variables["eta"][:].data != _rand_field)

         # can be resumed
-        p = utilities.yaml_from_dict(tmp_path, 'input.yaml',
-                                     {'save_checkpoint': True,
-                                      'resume_checkpoint': True,
-                                      'save_eta_grids': True})
+        p = utilities.yaml_from_dict(
+            tmp_path,
+            "input.yaml",
+            {
+                "save_checkpoint": True,
+                "resume_checkpoint": True,
+                "save_eta_grids": True,
+            },
+        )
         _delta = DeltaModel(input_file=p)
         # force save grids/figs
         _delta.save_grids_and_figs()
@@ -684,24 +744,26 @@ def test_load_checkpoint_with_open_netcdf(self, tmp_path: Path) -> None:
         assert np.all(_delta.eta == _rand_field)
         # assert that old netCDF object is still around and hasn't changed
         assert type(_opened) == Dataset
-        assert 'eta' in _opened.variables.keys()
+        assert "eta" in _opened.variables.keys()
         # grid from old netCDF is initial one, before random field was assigned
-        assert np.all(_opened.variables['eta'][:].data != _rand_field)
+        assert np.all(_opened.variables["eta"][:].data != _rand_field)

         # clear delta
         _delta = []

         # open the new netCDF file
-        _new = Dataset(os.path.join(_prefix, 'pyDeltaRCM_output.nc'),
-                       'r+', format='NETCDF4')
+        _new = Dataset(
+            os.path.join(_prefix, "pyDeltaRCM_output.nc"), "r+", format="NETCDF4"
+        )
         # first grid should be the original one
-        assert np.all(_opened['eta'][:].data == _new['eta'][0, :, :].data)
+        assert np.all(_opened["eta"][:].data == _new["eta"][0, :, :].data)
         # random field should be saved in the new netCDF file
         # some rounding/truncation happens in the netCDF so we use approx
-        assert pytest.approx(_rand_field) == _new['eta'][1, :, :].data
+        assert pytest.approx(_rand_field) == _new["eta"][1, :, :].data

     @pytest.mark.skipif(
-        platform.system() != 'Windows',
-        reason='OS differences regarding netCDF permissions.')
+        platform.system() != "Windows",
+        reason="OS differences regarding netCDF permissions.",
+    )
     def test_load_checkpoint_with_open_netcdf_win(self, tmp_path: Path) -> None:
         """Test what happens if output netCDF file is actually open.
@@ -709,9 +771,9 @@ def test_load_checkpoint_with_open_netcdf_win(self, tmp_path: Path) -> None:
         process. That situation raises an error for all OS.
         """
         # define a yaml with outputs (defaults will output strata)
-        p = utilities.yaml_from_dict(tmp_path, 'input.yaml',
-                                     {'save_checkpoint': True,
-                                      'save_eta_grids': True})
+        p = utilities.yaml_from_dict(
+            tmp_path, "input.yaml", {"save_checkpoint": True, "save_eta_grids": True}
+        )
         _delta = DeltaModel(input_file=p)

         # replace eta with a random field for checkpointing success check
@@ -723,26 +785,30 @@ def test_load_checkpoint_with_open_netcdf_win(self, tmp_path: Path) -> None:
         _delta.finalize()

         # paths exist
-        assert os.path.isfile(os.path.join(
-            _delta.prefix, 'pyDeltaRCM_output.nc'))
-        assert os.path.isfile(os.path.join(
-            _delta.prefix, 'checkpoint.npz'))
+        assert os.path.isfile(os.path.join(_delta.prefix, "pyDeltaRCM_output.nc"))
+        assert os.path.isfile(os.path.join(_delta.prefix, "checkpoint.npz"))
         _prefix = _delta.prefix
         _delta = []  # clear

         # open the netCDF file
-        _opened = Dataset(os.path.join(_prefix, 'pyDeltaRCM_output.nc'),
-                          'r+', format='NETCDF4')
+        _opened = Dataset(
+            os.path.join(_prefix, "pyDeltaRCM_output.nc"), "r+", format="NETCDF4"
+        )
         assert type(_opened) == Dataset
-        assert 'eta' in _opened.variables.keys()
+        assert "eta" in _opened.variables.keys()
         # saved grid is the initial one, before random field was assigned
-        assert np.all(_opened.variables['eta'][:].data != _rand_field)
+        assert np.all(_opened.variables["eta"][:].data != _rand_field)

         # can be resumed
-        p = utilities.yaml_from_dict(tmp_path, 'input.yaml',
-                                     {'save_checkpoint': True,
-                                      'resume_checkpoint': True,
-                                      'save_eta_grids': True})
+        p = utilities.yaml_from_dict(
+            tmp_path,
+            "input.yaml",
+            {
+                "save_checkpoint": True,
+                "resume_checkpoint": True,
+                "save_eta_grids": True,
+            },
+        )
         # raises a permissions error on Windows
         with pytest.raises(PermissionError):
             _ = DeltaModel(input_file=p)
diff --git a/tests/integration/test_consistent_outputs.py b/tests/integration/test_consistent_outputs.py
index 8ad769a3..1ddca620 100644
--- a/tests/integration/test_consistent_outputs.py
+++ b/tests/integration/test_consistent_outputs.py
@@ -341,10 +341,10 @@ def test_same_models_diff_save_dt_singlesave(self, tmp_path: Path) -> None:
         assert ModelA["eta"][-1, :, :].shape == ModelB["eta"][-1, :, :].shape
         assert ModelA.variables.keys() == ModelB.variables.keys()
         # check a few pieces of metadata
-        assert ModelA["meta"]["L0"][:] == ModelB["meta"]["L0"][:]
-        assert ModelA["meta"].variables.keys() == ModelB["meta"].variables.keys()
+        assert ModelA["auxdata"]["L0"][:] == ModelB["auxdata"]["L0"][:]
+        assert ModelA["auxdata"].variables.keys() == ModelB["auxdata"].variables.keys()
         # final time should NOT be the same (because only initial time is saved in ModelB)
-        assert ModelA["time"][-1].data > ModelB["time"][-1].data
+        assert ModelA["seconds"][-1].data > ModelB["seconds"][-1].data
         # final eta grids should NOT be the same (because only initial eta is saved in ModelB)
         assert np.any(ModelA["eta"][-1, :, :].data != ModelB["eta"][-1, :, :].data)
         # this is seen in the difference in shape of the eta variable
@@ -396,10 +396,10 @@ def test_same_models_diff_save_dt_saveend(self, tmp_path: Path) -> None:
         assert ModelA["eta"].shape == ModelB["eta"].shape
         assert ModelA.variables.keys() == ModelB.variables.keys()
         # check a few pieces of metadata
-        assert ModelA["meta"]["L0"][:] == ModelB["meta"]["L0"][:]
-        assert ModelA["meta"].variables.keys() == ModelB["meta"].variables.keys()
+        assert ModelA["auxdata"]["L0"][:] == ModelB["auxdata"]["L0"][:]
+        assert ModelA["auxdata"].variables.keys() == ModelB["auxdata"].variables.keys()
         # final time should be the same
-        assert ModelA["time"][-1].data == ModelB["time"][-1].data
+        assert ModelA["seconds"][-1].data == ModelB["seconds"][-1].data
         # final eta grids should be the same (because the final eta is saved in ModelB too)
         assert np.all(ModelA["eta"][-1, :, :].data == ModelB["eta"][-1, :, :].data)
         # first eta grids should be the same (because the initial eta is the same in both)
@@ -475,9 +475,9 @@ def test_same_models_in_serial_or_parallel(self, tmp_path: Path) -> None:
         assert ModelA_ser["eta"][-1, :, :].shape == ModelB_par["eta"][-1, :, :].shape

         # final time should be the same
-        assert ModelA_ser["time"][-1].data == ModelA_par["time"][-1].data
-        assert ModelB_par["time"][-1].data == ModelA_par["time"][-1].data
-        assert ModelB_par["time"][-1].data == ModelA_ser["time"][-1].data
+        assert ModelA_ser["seconds"][-1].data == ModelA_par["seconds"][-1].data
+        assert ModelB_par["seconds"][-1].data == ModelA_par["seconds"][-1].data
+        assert ModelB_par["seconds"][-1].data == ModelA_ser["seconds"][-1].data
         # final eta grids should be the same (because the serial and parallel runs are identical)
         assert np.all(
             ModelA_ser["eta"][-1, :, :].data == ModelA_par["eta"][-1, :, :].data
diff --git a/tests/integration/test_timing_triggers.py b/tests/integration/test_timing_triggers.py
index 8650688d..fcd0b55e 100644
--- a/tests/integration/test_timing_triggers.py
+++ b/tests/integration/test_timing_triggers.py
@@ -412,8 +412,8 @@ def test_save_metadata_no_grids(self, tmp_path: Path) -> None:
         ds = netCDF4.Dataset(exp_path_nc, "r", format="NETCDF4")

         assert not ("eta" in ds.variables)
-        assert ds["meta"]["H_SL"].shape[0] == 3
-        assert ds["meta"]["L0"][:] == 3
+        assert ds["auxdata"]["H_SL"].shape[0] == 3
+        assert ds["auxdata"]["L0"][:] == 3

     def test_save_subsidence_metadata_no_grids(self, tmp_path: Path) -> None:
         p = utilities.yaml_from_dict(
@@ -449,11 +449,11 @@ def test_save_subsidence_metadata_no_grids(self, tmp_path: Path) -> None:
         ds = netCDF4.Dataset(exp_path_nc, "r", format="NETCDF4")

         assert not ("eta" in ds.variables)
-        assert ds["meta"]["H_SL"].shape[0] == 3
-        assert ds["meta"]["L0"][:] == 3
-        assert ds["meta"]["sigma"].shape == _delta.sigma.shape
-        assert np.all(ds["meta"]["sigma"] == _delta.sigma)
-        assert ds["meta"]["start_subsidence"][:] == 0
+        assert ds["auxdata"]["H_SL"].shape[0] == 3
+        assert ds["auxdata"]["L0"][:] == 3
+        assert ds["auxdata"]["sigma"].shape == _delta.sigma.shape
+        assert np.all(ds["auxdata"]["sigma"] == _delta.sigma)
+        assert ds["auxdata"]["start_subsidence"][:] == 0

     def test_save_one_grid_metadata_by_default(self, tmp_path: Path) -> None:
         p = utilities.yaml_from_dict(
@@ -489,10 +489,10 @@ def test_save_one_grid_metadata_by_default(self, tmp_path: Path) -> None:
         _arr = ds.variables["eta"]
         assert _arr.shape[1] == _delta.eta.shape[0]
         assert _arr.shape[2] == _delta.eta.shape[1]
-        assert "meta" in ds.groups  # if any grids, save meta too
-        assert ds.groups["meta"]["H_SL"].shape[0] == _arr.shape[0]
-        assert np.all(ds.groups["meta"]["C0_percent"][:].data == 0.2)
-        assert np.all(ds.groups["meta"]["f_bedload"][:].data == 0.5)
+        assert "auxdata" in ds.groups  # if any grids, save meta too
+        assert ds.groups["auxdata"]["H_SL"].shape[0] == _arr.shape[0]
+        assert np.all(ds.groups["auxdata"]["C0_percent"][:].data == 0.2)
+        assert np.all(ds.groups["auxdata"]["f_bedload"][:].data == 0.5)


 class TestTimingStops:
diff --git a/tests/test_init_tools.py b/tests/test_init_tools.py
index 3ac43312..1ea93cfa 100644
--- a/tests/test_init_tools.py
+++ b/tests/test_init_tools.py
@@ -833,17 +833,17 @@ def test_U_ero_mud(self, tmp_path: Path) -> None:

     def test_L0(self, tmp_path: Path) -> None:
         p = utilities.yaml_from_dict(
-            tmp_path, "input.yaml", {"L0_meters": 100, "Length": 6000, "dx": 5}
+            tmp_path, "input.yaml", {"L0_meters": 100, "Length": 500, "dx": 5}
         )
         _delta = DeltaModel(input_file=p)
         assert _delta.L0 == 20

     def test_N0(self, tmp_path: Path) -> None:
         p = utilities.yaml_from_dict(
-            tmp_path, "input.yaml", {"N0_meters": 500, "Width": 6000, "dx": 5}
+            tmp_path, "input.yaml", {"N0_meters": 30, "Width": 100, "dx": 5}
         )
         _delta = DeltaModel(input_file=p)
-        assert _delta.N0 == 100
+        assert _delta.N0 == 5

     def test_L(self, tmp_path: Path) -> None:
         p = utilities.yaml_from_dict(tmp_path, "input.yaml", {"Length": 1600, "dx": 20})
@@ -894,7 +894,8 @@ def test_gamma(self, tmp_path: Path) -> None:
         p = utilities.yaml_from_dict(
             tmp_path, "input.yaml", {"S0": 0.01, "dx": 10, "u0": 3}
         )
-        _delta = DeltaModel(input_file=p)
+        with pytest.warns(UserWarning, match=r"Gamma.*greater than.*"):
+            _delta = DeltaModel(input_file=p)
         assert _delta.gamma == pytest.approx(0.10900000)

     def test_V0(self, tmp_path: Path) -> None:
@@ -913,9 +914,9 @@ def test_Qw0(self, tmp_path: Path) -> None:
         assert _delta.Qw0 == 800

     def test_qw0(self, tmp_path: Path) -> None:
-        p = utilities.yaml_from_dict(tmp_path, "input.yaml", {"u0": 0.8, "h0": 3})
+        p = utilities.yaml_from_dict(tmp_path, "input.yaml", {"u0": 3, "h0": 5})
         _delta = DeltaModel(input_file=p)
-        assert _delta.qw0 == pytest.approx(2.4)
+        assert _delta.qw0 == pytest.approx(15)

     def test_Qp_water(self, tmp_path: Path) -> None:
         p = utilities.yaml_from_dict(
@@ -1172,9 +1173,21 @@ def test_default_meta_list(self, tmp_path: Path) -> None:
         assert "meta" in delta._save_var_list.keys()
         # save meta on, so check that some expected values are there
         assert "L0" in delta._save_var_list["meta"].keys()
-        assert delta._save_var_list["meta"]["L0"] == ["L0", "cells", "i8", ()]
+        assert delta._save_var_list["meta"]["L0"] == [
+            "L0",
+            "cells",
+            "i8",
+            (),
+            "channel_entrance__length",
+        ]
         assert "H_SL" in delta._save_var_list["meta"].keys()
-        assert delta._save_var_list["meta"]["H_SL"] == [None, "meters", "f4", "time"]
+        assert delta._save_var_list["meta"]["H_SL"] == [
+            "H_SL",
+            "meters",
+            "f4",
+            "seconds",
+            "basin_water_surface__elevation",
+        ]

     def test_default_meta_list_legacy(self, tmp_path: Path) -> None:
         file_name = "user_parameters.yaml"
@@ -1183,20 +1196,28 @@ def test_default_meta_list_legacy(self, tmp_path: Path) -> None:
         utilities.write_parameter_to_file(f, "save_metadata", True)
         utilities.write_parameter_to_file(f, "legacy_netcdf", True)
         f.close()
-        delta = DeltaModel(input_file=p)
+        with pytest.warns(UserWarning, match=r".*netcdf file in legacy schema.*"):
+            delta = DeltaModel(input_file=p)
         # check things about the metadata
         assert hasattr(delta, "_save_var_list")
         assert type(delta._save_var_list) == dict
         assert "meta" in delta._save_var_list.keys()
         # save meta on, so check that some expected values are there
         assert "L0" in delta._save_var_list["meta"].keys()
-        assert delta._save_var_list["meta"]["L0"] == ["L0", "cells", "i8", ()]
+        assert delta._save_var_list["meta"]["L0"] == [
+            "L0",
+            "cells",
+            "i8",
+            (),
+            "channel_entrance__length",
+        ]
         assert "H_SL" in delta._save_var_list["meta"].keys()
         assert delta._save_var_list["meta"]["H_SL"] == [
-            None,
+            "H_SL",
             "meters",
             "f4",
-            "total_time",
+            "time",
+            "basin_water_surface__elevation",
         ]

     def test_netcdf_vars(self, tmp_path: Path) -> None:
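For reference, the updated assertions encode the new five-field layout of a `_save_var_list` entry; sketched below with the values asserted above:

    # the five fields of a _save_var_list entry:
    #   [model attribute, units, netCDF dtype, dimension(s), long_name]
    entry = [
        "H_SL",     # model attribute recorded at each save
        "meters",   # units written to the variable
        "f4",       # netCDF data type
        "seconds",  # dimensions: () for scalars, "seconds" for per-save vectors
        "basin_water_surface__elevation",  # description stored as long_name
    ]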
assert delta._save_var_list["meta"]["L0"] == [ + "L0", + "cells", + "i8", + (), + "channel_entrance__length", + ] assert "H_SL" in delta._save_var_list["meta"].keys() assert delta._save_var_list["meta"]["H_SL"] == [ - None, + "H_SL", "meters", "f4", - "total_time", + "time", + "basin_water_surface__elevation", ] # check save var list for eta assert "eta" in delta._save_var_list.keys() @@ -1279,7 +1321,8 @@ def test_netcdf_vars_legacy(self, tmp_path: Path) -> None: "eta", "meters", "f4", - ("total_time", "length", "width"), + ("time", "x", "y"), + "channel_bottom__elevation", ] # force save to netcdf delta.save_grids_and_figs() @@ -1296,10 +1339,205 @@ def test_netcdf_vars_legacy(self, tmp_path: Path) -> None: assert data["meta"]["L0"][0].data == delta.L0 # check H_SL a vector of metadata assert "H_SL" in data["meta"].variables - assert data["meta"]["H_SL"].dimensions == ("total_time",) + assert data["meta"]["H_SL"].dimensions == ("time",) assert data["time"].shape == data["meta"]["H_SL"].shape # check on the eta grid assert "eta" in data.variables assert data["eta"].shape[0] == data["time"].shape[0] assert data["eta"].shape[1] == delta.L assert data["eta"].shape[2] == delta.W + + +class TestCustomOutputs: + def test_custom_model_output__meta_scalar(self, tmp_path: Path) -> None: + # test that input metadata can be a dict + file_name = "user_parameters.yaml" + p, f = utilities.create_temporary_file(tmp_path, file_name) + utilities.write_parameter_to_file(f, "out_dir", tmp_path / "out_dir") + utilities.write_parameter_to_file(f, "save_eta_grids", True) + utilities.write_parameter_to_file(f, "save_metadata", True) + f.close() + + class MSCustomSaveModel(DeltaModel): + def __init__(self, input_file=None, **kwargs): + # inherit base DeltaModel methods + super().__init__(input_file, **kwargs) + + def hook_init_output_file(self): + # save number of water parcels w/ a long name AS DICT + self._save_var_list["meta"]["Np_water"] = dict( # key ignored + varname="number_water_parcels", # name in the netcdf file + varvalue="Np_water", # name in the model + varunits="parcels", + vartype="i8", + vardims=(), + varlong="testname", + ) + # save number of sed parcels with long name AS LIST + self._save_var_list["meta"]["sed_parcels"] = [ # key is name in netcdf + "Np_sed", # model var + "parcels", + "i8", + (), + "longname", + ] + + delta = MSCustomSaveModel(input_file=p) + # force save to netcdf + delta.save_grids_and_figs() + # close netcdf + delta.output_netcdf.close() + # check out the netcdf + data = Dataset( + os.path.join(delta.prefix, "pyDeltaRCM_output.nc"), "r+", format="NETCDF4" + ) + # check for custom outputs + assert "meta" in delta._save_var_list.keys() # internal list still called meta + assert "number_water_parcels" in data["auxdata"].variables + assert data["auxdata"]["number_water_parcels"][0].data == delta.Np_water + + assert "sed_parcels" in data["auxdata"].variables + assert data["auxdata"]["sed_parcels"][0].data == delta.Np_sed + + def test_custom_model_output__meta_timeseries(self, tmp_path: Path) -> None: + # test that input metadata can be a dict + file_name = "user_parameters.yaml" + p, f = utilities.create_temporary_file(tmp_path, file_name) + utilities.write_parameter_to_file(f, "out_dir", tmp_path / "out_dir") + utilities.write_parameter_to_file(f, "save_eta_grids", True) + utilities.write_parameter_to_file(f, "save_metadata", True) + f.close() + + class MTCustomSaveModel(DeltaModel): + def __init__(self, input_file=None, **kwargs): + self.input_timevaryingvar_list = 10 + 
self.input_timevaryingvar_dict = np.zeros( + (100, 200) + ) # HARDCODED TO DEFAULT DIMS!! + # inherit base DeltaModel methods + super().__init__(input_file, **kwargs) + + def hook_init_output_file(self): + # save one as a list with None as arg (slated to deprecate!) + self._save_var_list["meta"]["input_timevaryingvar_list"] = [ + "input_timevaryingvar_list", + "meters", + "f4", + (self._netcdf_coords[0]), + "basin_water_surface__elevation", + ] + + self._save_var_list["meta"]["ignored"] = dict( # name should be ignored + varname="timevaryingnc", # name in the netcdf file + varvalue="input_timevaryingvar_dict", # name in the model + varunits="parcels", + vartype="i8", + vardims=self._netcdf_coords, + varlong="testname", + ) + + delta = MTCustomSaveModel(input_file=p) + # force save to netcdf + delta.save_grids_and_figs() + # close netcdf + delta.output_netcdf.close() + # check out the netcdf + data = Dataset( + os.path.join(delta.prefix, "pyDeltaRCM_output.nc"), "r+", format="NETCDF4" + ) + # check for custom outputs + assert "meta" in delta._save_var_list.keys() # internal list still called meta + assert "input_timevaryingvar_list" in data["auxdata"].variables + assert np.all( + data["auxdata"]["input_timevaryingvar_list"][0].data + == delta.input_timevaryingvar_list + ) + + assert "timevaryingnc" in data["auxdata"].variables + assert np.all( + data["auxdata"]["timevaryingnc"][0].data == delta.input_timevaryingvar_dict + ) + + def test_custom_model_output__meta_timeseries_None_warning( + self, tmp_path: Path + ) -> None: + # test that input metadata can be a dict + file_name = "user_parameters.yaml" + p, f = utilities.create_temporary_file(tmp_path, file_name) + utilities.write_parameter_to_file(f, "out_dir", tmp_path / "out_dir") + utilities.write_parameter_to_file(f, "save_eta_grids", True) + utilities.write_parameter_to_file(f, "save_metadata", True) + f.close() + + class MTNWCustomSaveModel(DeltaModel): + def __init__(self, input_file=None, **kwargs): + self.input_timevaryingvar_list = 10 + self.input_timevaryingvar_dict = np.zeros( + (100, 200) + ) # HARDCODED TO DEFAULT DIMS!! + # inherit base DeltaModel methods + super().__init__(input_file, **kwargs) + + def hook_init_output_file(self): + # save one as a list with None as arg (slated to deprecate!) 
+ self._save_var_list["meta"]["input_timevaryingvar_list"] = [ + None, + "meters", + "f4", + (self._netcdf_coords[0]), + "basin_water_surface__elevation", + ] + + with pytest.warns(UserWarning, match=r"Specifying `None`.*"): + delta = MTNWCustomSaveModel(input_file=p) + # force save to netcdf + + def test_custom_model_output__base_timeseries(self, tmp_path: Path) -> None: + # test that input metadata can be a dict + file_name = "user_parameters.yaml" + p, f = utilities.create_temporary_file(tmp_path, file_name) + utilities.write_parameter_to_file(f, "out_dir", tmp_path / "out_dir") + utilities.write_parameter_to_file(f, "save_eta_grids", True) + utilities.write_parameter_to_file(f, "save_metadata", True) + f.close() + + class BTCustomSaveModel(DeltaModel): + def __init__(self, input_file=None, **kwargs): + # inherit base DeltaModel methods + super().__init__(input_file, **kwargs) + + def hook_init_output_file(self): + # save the active layer grid each save_dt w/ a short name + self._save_var_list["actlay"] = dict( # should be ignored + varname="saved_active_layer", # name in the netcdf file + varvalue="active_layer", # name in the model + varunits="fraction", + vartype="f4", + vardims=("seconds", "x", "y"), + varlong="testname", + ) + + # save one as a list + self._save_var_list["normalized_discharge_x"] = [ # name in netcdf + "qxn", # name in the model + "", + "f4", + ("seconds", "x", "y"), + "normalized discharge in x", + ] + + delta = BTCustomSaveModel(input_file=p) + # force save to netcdf + delta.save_grids_and_figs() + # close netcdf + delta.output_netcdf.close() + # check out the netcdf + data = Dataset( + os.path.join(delta.prefix, "pyDeltaRCM_output.nc"), "r+", format="NETCDF4" + ) + # check for vars + assert "saved_active_layer" in data.variables + assert np.all(data["saved_active_layer"][0].data == delta.active_layer) + + assert "normalized_discharge_x" in data.variables + assert np.all(data["normalized_discharge_x"][0].data == delta.qxn) diff --git a/tests/test_iteration_tools.py b/tests/test_iteration_tools.py index 0f2b244b..9739c082 100644 --- a/tests/test_iteration_tools.py +++ b/tests/test_iteration_tools.py @@ -423,6 +423,39 @@ def test_save_metadata_no_grids(self, tmp_path: Path) -> None: # assertions assert not ("eta" in ds.variables) + assert ds["auxdata"]["H_SL"].shape[0] == 4 # init + 3 + assert ds["auxdata"]["L0"][:] == 3 + + def test_save_metadata_no_grids_legacy(self, tmp_path: Path) -> None: + p = utilities.yaml_from_dict( + tmp_path, + "input.yaml", + {"save_dt": 1, "save_metadata": True, "legacy_netcdf": True}, + ) + with pytest.warns(UserWarning, match=r".*netcdf file in legacy schema.*"): + _delta = DeltaModel(input_file=p) + + # mock the log_info + _delta.log_info = mock.MagicMock() + + # mock the actual output routines + _delta.make_figure = mock.MagicMock() + _delta.save_figure = mock.MagicMock() + _delta.save_grids = mock.MagicMock() + + exp_path_nc = os.path.join(tmp_path / "out_dir", "pyDeltaRCM_output.nc") + assert os.path.isfile(exp_path_nc) + + for _t in range(0, 3): + _delta.save_grids_and_figs() + _delta._save_iter += 1 + + # close the file and connect + _delta.output_netcdf.close() + ds = netCDF4.Dataset(exp_path_nc, "r", format="NETCDF4") + + # assertions LEGACY SUBGROUP SHOULD BE CALLED "meta" + assert not ("eta" in ds.variables) assert ds["meta"]["H_SL"].shape[0] == 4 # init + 3 assert ds["meta"]["L0"][:] == 3 @@ -462,9 +495,9 @@ def test_save_metadata_and_grids(self, tmp_path: Path) -> None: # assertions assert "eta" in ds.variables assert 
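Outside the test harness, the same `hook_init_output_file` hook is how a user adds outputs to the file; a minimal subclass sketch, with the variable names here chosen purely for illustration:

    from pyDeltaRCM import DeltaModel

    class CustomSaveModel(DeltaModel):
        def hook_init_output_file(self):
            # record the water surface grid under a custom name each save_dt;
            # the dict keys follow the spec exercised in TestCustomOutputs above
            self._save_var_list["stage_custom"] = dict(
                varname="water_surface",  # name in the netCDF file
                varvalue="stage",         # model attribute to save
                varunits="meters",
                vartype="f4",
                vardims=("seconds", "x", "y"),
                varlong="water_surface__elevation",
            )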
"velocity" in ds.variables - assert ds["meta"]["H_SL"].shape[0] == 4 # init + 3 - assert ds["meta"]["L0"][:] == 3 - assert np.all(ds["meta"]["f_bedload"][:] == 0.25) + assert ds["auxdata"]["H_SL"].shape[0] == 4 # init + 3 + assert ds["auxdata"]["L0"][:] == 3 + assert np.all(ds["auxdata"]["f_bedload"][:] == 0.25) def test_save_one_grid_metadata_by_default(self, tmp_path: Path) -> None: p = utilities.yaml_from_dict( @@ -498,6 +531,49 @@ def test_save_one_grid_metadata_by_default(self, tmp_path: Path) -> None: _delta.output_netcdf.close() ds = netCDF4.Dataset(exp_path_nc, "r", format="NETCDF4") + # assertions + _arr = ds.variables["eta"] + assert _arr.shape[1] == _delta.eta.shape[0] + assert _arr.shape[2] == _delta.eta.shape[1] + assert "auxdata" in ds.groups # if any grids, save meta too + assert ds.groups["auxdata"]["H_SL"].shape[0] == _arr.shape[0] + assert np.all(ds.groups["auxdata"]["C0_percent"][:].data == 0.2) + assert np.all(ds.groups["auxdata"]["f_bedload"][:].data == 0.5) + + def test_save_one_grid_metadata_by_default_legacy(self, tmp_path: Path) -> None: + p = utilities.yaml_from_dict( + tmp_path, + "input.yaml", + { + "save_dt": 1, + "save_metadata": False, + "save_eta_grids": True, + "C0_percent": 0.2, + "legacy_netcdf": True, + }, + ) + with pytest.warns(UserWarning, match=r".*netcdf file in legacy schema.*"): + _delta = DeltaModel(input_file=p) + + # mock the log_info + _delta.log_info = mock.MagicMock() + + # mock the actual output routines + _delta.make_figure = mock.MagicMock() + _delta.save_figure = mock.MagicMock() + _delta.save_grids = mock.MagicMock() + + exp_path_nc = os.path.join(tmp_path / "out_dir", "pyDeltaRCM_output.nc") + assert os.path.isfile(exp_path_nc) + + for _t in range(0, 6): + _delta.save_grids_and_figs() + _delta._save_iter += 1 + + # close the file and connect + _delta.output_netcdf.close() + ds = netCDF4.Dataset(exp_path_nc, "r", format="NETCDF4") + # assertions _arr = ds.variables["eta"] assert _arr.shape[1] == _delta.eta.shape[0] @@ -587,7 +663,7 @@ def test_save_eta_grids(self, tmp_path: Path) -> None: _arr = ds.variables["eta"] assert _arr.shape[1] == _delta.eta.shape[0] assert _arr.shape[2] == _delta.eta.shape[1] - assert "meta" in ds.groups # if any grids, save meta too + assert "auxdata" in ds.groups # if any grids, save meta too def test_save_depth_grids(self, tmp_path: Path) -> None: p = utilities.yaml_from_dict( @@ -614,7 +690,7 @@ def test_save_depth_grids(self, tmp_path: Path) -> None: _arr = ds.variables["depth"] assert _arr.shape[1] == _delta.depth.shape[0] assert _arr.shape[2] == _delta.depth.shape[1] - assert "meta" in ds.groups # if any grids, save meta too + assert "auxdata" in ds.groups # if any grids, save meta too def test_save_velocity_grids(self, tmp_path: Path) -> None: p = utilities.yaml_from_dict( @@ -641,7 +717,7 @@ def test_save_velocity_grids(self, tmp_path: Path) -> None: _arr = ds.variables["velocity"] assert _arr.shape[1] == _delta.uw.shape[0] assert _arr.shape[2] == _delta.uw.shape[1] - assert "meta" in ds.groups # if any grids, save meta too + assert "auxdata" in ds.groups # if any grids, save meta too def test_save_stage_grids(self, tmp_path: Path) -> None: p = utilities.yaml_from_dict( @@ -668,7 +744,7 @@ def test_save_stage_grids(self, tmp_path: Path) -> None: _arr = ds.variables["stage"] assert _arr.shape[1] == _delta.stage.shape[0] assert _arr.shape[2] == _delta.stage.shape[1] - assert "meta" in ds.groups # if any grids, save meta too + assert "auxdata" in ds.groups # if any grids, save meta too def 
test_save_discharge_grids(self, tmp_path: Path) -> None: p = utilities.yaml_from_dict( @@ -695,7 +771,7 @@ def test_save_discharge_grids(self, tmp_path: Path) -> None: _arr = ds.variables["discharge"] assert _arr.shape[1] == _delta.qw.shape[0] assert _arr.shape[2] == _delta.qw.shape[1] - assert "meta" in ds.groups # if any grids, save meta too + assert "auxdata" in ds.groups # if any grids, save meta too def test_save_sedflux_grids(self, tmp_path: Path) -> None: p = utilities.yaml_from_dict( @@ -722,7 +798,7 @@ def test_save_sedflux_grids(self, tmp_path: Path) -> None: _arr = ds.variables["sedflux"] assert _arr.shape[1] == _delta.qs.shape[0] assert _arr.shape[2] == _delta.qs.shape[1] - assert "meta" in ds.groups # if any grids, save meta too + assert "auxdata" in ds.groups # if any grids, save meta too def test_save_grids_exception(self, tmp_path: Path) -> None: p = utilities.yaml_from_dict(tmp_path, "input.yaml", {"save_dt": 1})
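Finally, a short sketch of reading an output file written in the new schema these tests pin down; it assumes a completed run wrote `pyDeltaRCM_output.nc` into the folder named here:

    import os

    import numpy as np
    from netCDF4 import Dataset

    out_dir = "out_dir"  # hypothetical output folder from a finished run
    with Dataset(os.path.join(out_dir, "pyDeltaRCM_output.nc"), "r") as ds:
        # gridded variables are dimensioned (seconds, x, y) in the new schema
        eta = ds["eta"]
        assert eta.dimensions == ("seconds", "x", "y")

        # the temporal coordinate records elapsed model time, in seconds
        elapsed = ds["seconds"][:]
        print(f"{elapsed.shape[0]} saves, final t = {elapsed[-1]} s")

        # metadata lives in the auxdata subgroup; scalars like L0 sit
        # alongside per-save vectors like H_SL
        aux = ds.groups["auxdata"]
        print("inlet length (cells):", int(aux["L0"][0]))
        assert aux["H_SL"].shape == elapsed.shape

        # every variable carries a description via long_name
        print(eta.long_name)
        final_eta = np.asarray(eta[-1, :, :])  # final bed elevation grid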