Commit ebbc64b0 authored by Lukas Riedel

Merge branch '133-implement-a-class-for-initial-conditions-from-hdf5-data' into 'master'

Resolve "Implement a class for initial conditions from HDF5 data"

Closes #133

See merge request !130
parents a1afde48 53608873
......@@ -38,6 +38,7 @@
* Initial conditions expressed as analytic functions using
[muparser](http://beltoforion.de/article.php?a=muparser) !131
* Coupling between transient models for water flow and solute transport !96
* Initial conditions generated from H5 input data !130
### Changed
* `Simulation` is renamed `RichardsSimulation` and moved to
......
......@@ -92,6 +92,9 @@ exclude_patterns = []
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = None
# Include todo-directives into the output
todo_include_todos = True
# -- Options for HTML output ----------------------------------------------
......
Restart
=======
.. warning:: This document is work in progress (WIP).
Currently, DORiE does not have an option to restart simulations. However, it
is possible to perform a pseudo-restart from a past simulation using
:doc:`initial conditions<initial>` from data files.
Pseudo-restart
--------------
Since this is not a proper restart, the simulation must be set up such that it
starts a new run from the data of the last one. This includes modifying the
start time and the initial condition.
* ``time.start``:
Set this to the last time written by the simulation to be restarted.
* Initial condition:
It is not possible to use VTK files directly as initial condition. However,
such data can be transferred into an HDF5 file, which is accepted by DORiE.
A sketch of a possible transfer script is given at the end of this section.
.. todo:: create python script to write vtk data into h5 files.
* ``output.outputPath`` and ``output.fileName``:
Output path and file name must differ from those of the last simulation;
otherwise, its data will be overwritten.
* Merge pvd files:
The old simulation and the pseudo-restarted simulation will output separate
pvd files because they are independent. However, merging the pvd files so
that they are shown in ParaView as one simulation is an easy task.
1. Open both pvd files and identify the data sets corresponding to each
time step.
2. Copy the data sets into a new pvd file.
Output file from the first simulation:
.. code-block:: xml
<?xml version="1.0"?>
<VTKFile type="Collection" version="0.1" byte_order="LittleEndian">
<Collection>
<DataSet timestep="0" group="" part="0" name="" file="old_simulation/old_simulation-00000.vtu"/>
</Collection>
</VTKFile>
Output file from the second simulation:
.. code-block:: xml
<?xml version="1.0"?>
<VTKFile type="Collection" version="0.1" byte_order="LittleEndian">
<Collection>
<DataSet timestep="100" group="" part="0" name="" file="restarted/restarted_simulation-00000.vtu"/>
</Collection>
</VTKFile>
Merged file:
.. code-block:: xml
<?xml version="1.0"?>
<VTKFile type="Collection" version="0.1" byte_order="LittleEndian">
<Collection>
<DataSet timestep="0" group="" part="0" name="" file="old_simulation/old_simulation-00000.vtu"/>
<DataSet timestep="100" group="" part="0" name="" file="restarted/restarted_simulation-00000.vtu"/>
</Collection>
</VTKFile>
Notice that this qualifies only as a pseudo-restart because the degrees of
freedom of the last simulation are not completely recovered. Indeed, they are
degraded. Hence, strictly speaking, a pseudo-restart will lead to results
that differ from those of a single, uninterrupted simulation.
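As mentioned above, a minimal sketch of such a transfer script follows. It is
not an official DORiE tool: it assumes the VTK point data lies on a
structured, equidistant grid and that the Python packages ``meshio`` and
``h5py`` are installed. The file names, the field name, and the grid shape
are hypothetical and must be adapted to the actual simulation output.

.. code-block:: python

   # vtk_to_h5.py -- minimal sketch, not an official DORiE tool
   import meshio
   import h5py
   import numpy as np

   # read the last VTK output of the old simulation (hypothetical file name)
   mesh = meshio.read("old_simulation/old_simulation-00042.vtu")

   # extract the solution field to restart from (hypothetical field name)
   field = np.asarray(mesh.point_data["matric_head"])

   # reshape the flat point data to the structured grid shape;
   # adapt this to the grid resolution of the old simulation
   data = field.reshape((101, 101))

   # write the dataset into an H5 file accepted by DORiE
   with h5py.File("initial.h5", "w") as f:
       f.create_dataset("matric_head", data=data)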
......@@ -113,7 +113,9 @@ adding an empty line, make text **bold** or ``monospaced``.
<category name="initial">
<parameter name="type">
<definition> There are currently two possible initial conditions to choose from: A HDF datafile (``data``), or analytic equations (``analytic``).</definition>
<definition> The data type representing the initial condition. Either an
HDF datafile (``data``), or analytic equations (``analytic``).
</definition>
<values> data, analytic </values>
<suggestion> analytic </suggestion>
<comment> Choose initial condition type: data, analytic </comment>
......@@ -134,18 +136,26 @@ adding an empty line, make text **bold** or ``monospaced``.
<suggestion> -h </suggestion>
</parameter>
<parameter name="datafile">
<definition> Path to the initial condition data file.
(``data`` type only)
<parameter name="file">
<definition> Path to the initial condition data file
(``data`` type only). DORiE currently only supports H5 files with
file extension ``.h5``.
</definition>
<values> path </values>
</parameter>
<parameter name="dataset">
<definition> Dataset to use as initial condition. (``data`` type only)
<definition> Dataset to use as initial condition (``data`` type only).
</definition>
<values> string </values>
</parameter>
<parameter name="interpolation">
<definition> Interpolation type used for the data (``data`` type only).
</definition>
<values> nearest </values>
<suggestion> nearest </suggestion>
</parameter>
</category>
<category name="time">
......
......@@ -113,7 +113,9 @@ adding an empty line, make text **bold** or ``monospaced``.
<category name="initial">
<parameter name="type">
<definition> There are currently two possible initial conditions to choose from: A HDF datafile (``data``), or analytic equations (``analytic``).</definition>
<definition> The data type representing the initial condition. Either an
HDF datafile (``data``), or analytic equations (``analytic``).
</definition>
<values> data, analytic </values>
<suggestion> analytic </suggestion>
<comment> Choose initial condition type: data, analytic </comment>
......@@ -134,18 +136,26 @@ adding an empty line, make text **bold** or ``monospaced``.
<suggestion> 0 </suggestion>
</parameter>
<parameter name="datafile">
<definition> Path to the initial condition data file.
(``data`` type only)
<parameter name="file">
<definition> Path to the initial condition data file
(``data`` type only). DORiE currently only supports H5 files with
file extension ``.h5``.
</definition>
<values> path </values>
</parameter>
<parameter name="dataset">
<definition> Dataset to use as initial condition. (``data`` type only)
<definition> Dataset to use as initial condition (``data`` type only).
</definition>
<values> string </values>
</parameter>
<parameter name="interpolation">
<definition> Interpolation type used for the data (``data`` type only).
</definition>
<values> nearest </values>
<suggestion> nearest </suggestion>
</parameter>
</category>
<category name="time">
......
......@@ -40,6 +40,7 @@ assignment and increment are based on the :doc:`public-api`.
:caption: Cook Book
cookbook/index
cookbook/restart
.. toctree::
:maxdepth: 1
......
......@@ -16,6 +16,12 @@ solution quantity.
Initial condition input is controlled entirely via the
:doc:`Configuration File <man-config-file>`.
.. note::
The initial condition is projected onto the actual solution function space.
Depending on grid shape and resolution, function space (order), and
interpolator (if applicable), the resulting solution may vary greatly from
the actual input data.
Input Types
-----------
This is an overview of all input types for initial conditions.
......@@ -57,6 +63,22 @@ They are controlled by the ``initial.type`` key and available for every model.
initial.equation = exp(-sqrt((x-0.5)^2+(y-0.5)^2)/(4.*0.002))/(4*pi*0.002)^(2/dim).
.. object:: Dataset
* ``type = data``
Load the initial condition from the data file ``initial.file`` by opening the
dataset ``initial.dataset`` inside this file. The data is interpreted as a
function :math:`f(\mathbf{p})` of the physical position :math:`\mathbf{p}`
using one of the available :ref:`interpolators <man-interpolators>`, which
can be chosen using the setting ``initial.interpolation``. The input data
is automatically stretched to match the grid extensions.
Supported file extensions:
* ``.h5``: H5_ data file. ``initial.dataset`` may be a file-internal path
to the target dataset.
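For illustration, a ``data``-type initial condition for the Richards solver
might look like this in the configuration file. The file path and dataset
name are placeholders:

.. code-block:: ini

   [richards.initial]
   type = data
   quantity = matricHead
   file = path/to/initial.h5
   dataset = matric_head
   interpolation = nearest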
.. _initial-transformation:
Transformation Types
......@@ -72,11 +94,11 @@ Initial condition tranformations for the Richards solver.
* ``quantity = matricHead``
The input data is directly interpreted as matric head,
The input data is directly interpreted as matric head
:math:`f = h_m [\mathrm{m}]`.
Transport
^^^^^^^^
^^^^^^^^^
Initial condition transformations for the Transport solver.
.. object:: No Transformation
......@@ -86,5 +108,5 @@ Initial condition tranformations for the Transport solver.
The input data is directly interpreted as solute concentration,
:math:`f = c_w [\mathrm{kg}/\mathrm{m}^3]`.
.. _H5: https://www.h5py.org/
.. _muparser: http://beltoforion.de/article.php?a=muparser&p=features
......@@ -203,6 +203,8 @@ coordinates in the respective spatial dimensions.
scale_miller:
# ...
.. _man-interpolators:
Interpolators
^^^^^^^^^^^^^
......
......@@ -8,6 +8,7 @@
#include <dune/dorie/common/initial_condition/analytic.hh>
#include <dune/dorie/common/logging.hh>
#include <dune/dorie/common/initial_condition/h5.hh>
namespace Dune{
namespace Dorie{
......@@ -30,7 +31,7 @@ protected:
static auto create(
const Dune::ParameterTree& ini_file,
const typename T::GV& grid_view,
const std::shared_ptr<spdlog::logger> log=get_logger(log_base)
std::shared_ptr<spdlog::logger> log=get_logger(log_base)
)
{
std::unique_ptr<InitialCondition<T>> ic;
......@@ -40,14 +41,13 @@ protected:
if (ic_type == "data")
{
const auto ic_datafile = ini_file.get<std::string>("initial.datafile");
const auto ic_dataset = ini_file.get<std::string>("initial.dataset");
const auto ic_datafile = ini_file.get<std::string>("initial.file");
// determine data type
const auto ext_start = ic_datafile.find_last_of(".");
if (ext_start == std::string::npos) {
log->error("Cannot determine extension of initial condition "
"datafile: {}",
"data file: {}",
ic_datafile);
DUNE_THROW(IOError, "Initial condition datafile has no apparent "
"extension");
......@@ -55,8 +55,9 @@ protected:
const auto file_type = ic_datafile.substr(ext_start + 1);
if (file_type == "h5") {
DUNE_THROW(NotImplemented,
"Initial condition from HDF data files is not implemented yet!");
using ICH5 = InitialConditionH5<T>;
ic = std::make_unique<ICH5>(grid_view, ini_file, log);
} else if (file_type == "vtu") {
DUNE_THROW(NotImplemented,
"Initial condition from VTK data files is not implemented yet!");
......@@ -64,7 +65,7 @@ protected:
log->error("Unsupported initial condition datafile type: .{}",
file_type);
DUNE_THROW(NotImplemented,
"Unsupported initial condition datafile type");
"Unsupported initial condition data file type");
}
} // ic_type == "data"
else if (ic_type == "analytic")
......
// -*- tab-width: 4; indent-tabs-mode: nil -*-
#ifndef DUNE_DORIE_INITIAL_CONDITION_H5_HH
#define DUNE_DORIE_INITIAL_CONDITION_H5_HH
#include <string>
#include <dune/common/exceptions.hh>
#include <dune/common/parametertree.hh>
#include <dune/dorie/common/interpolator.hh>
#include <dune/dorie/common/initial_condition/initial_condition.hh>
namespace Dune{
namespace Dorie{
/*-------------------------------------------------------------------------*//**
* @brief Class for initial conditions from HDF5 data files.
*
* @ingroup InitialConditions
* @author Santiago Ospina
* @date 2019
*
* @tparam T BaseTraits
*/
template<class T>
class InitialConditionH5
: public InitialCondition<T>
{
using Base = InitialCondition<T>;
using RF = typename T::RF;
using Domain = typename T::Domain;
using GV = typename T::GV;
static constexpr int dim = T::dim;
public:
using Traits = typename Base::Traits;
InitialConditionH5( const GV& grid_view,
const Dune::ParameterTree& config,
std::shared_ptr<spdlog::logger> log)
: Base(grid_view),
_interpolator(InterpolatorFactory<T>::create(config.sub("initial"),
grid_view,
log))
{ }
/// Evaluate the initial condition at a certain position
/** \param e Element entity
 * \param x Position in local element coordinates
 * \param y Resulting value of the initial condition
 */
void evaluate ( const typename Traits::ElementType& e,
const typename Traits::DomainType& x,
typename Traits::RangeType& y) const override
{
auto x_global = e.geometry().global(x);
y = _interpolator->evaluate(x_global);
}
private:
std::shared_ptr<Interpolator<RF, T>> _interpolator;
};
} // namespace Dorie
} // namespace Dune
#endif // DUNE_DORIE_INITIAL_CONDITION_H5_HH
......@@ -10,9 +10,13 @@
#include <dune/common/fvector.hh>
#include <dune/common/exceptions.hh>
#include <dune/common/parametertree.hh>
#include <dune/dorie/common/logging.hh>
#include <dune/dorie/common/utility.hh>
#include <dune/dorie/common/h5file.hh>
#include <yaml-cpp/yaml.h>
namespace Dune {
namespace Dorie {
......@@ -33,7 +37,7 @@ protected:
const std::vector<T> _data; //!< contiguously stored data of the array
const std::vector<size_t> _shape; //!< inverted shape of the original array
//! phsical extensions of the dataset
//! physical extensions of the dataset
const Domain _extensions;
//! origin offset of the dataset
const Domain _offset;
......@@ -175,6 +179,9 @@ template<typename Traits>
struct InterpolatorFactory
{
using RF = typename Traits::RF;
using Domain = typename Traits::Domain;
/// Create the interpolator.
/** Use perfect forwarding for the data.
* \param type The type of interpolator to use
......@@ -254,6 +261,135 @@ static auto create (
log);
}
/// Create an interpolator from a ini file config
/** \param config Dune config file tree
* \param grid_view The grid view to use for determining extensions
* \param log The logger for the created object
*/
template<typename GridView>
static auto create (
const Dune::ParameterTree& config,
const GridView& grid_view,
const std::shared_ptr<spdlog::logger> log=get_logger(log_base)
)
-> std::shared_ptr<Interpolator<RF, Traits>>
{
const auto filename = config.get<std::string>("file");
const auto dataset = config.get<std::string>("dataset");
const auto [data, shape] = read_h5_dataset(filename, dataset, log);
const auto interpolation = config.get<std::string>("interpolation");
return create(interpolation,
data,
shape,
grid_view,
log);
}
/// Create an interpolator from a YAML config node
/** \param cfg YAML config tree node
* \param node_name Name of the YAML config node for better error messages
* \param grid_view The grid view to use for determining extensions
* \param log The logger for the created object
*/
template<typename GridView>
static auto create (
const YAML::Node& cfg,
const std::string& node_name,
const GridView& grid_view,
const std::shared_ptr<spdlog::logger> log=get_logger(log_base)
)
-> std::shared_ptr<Interpolator<RF, Traits>>
{
log->trace("Creating interpolator from YAML cfg node: {}", node_name);
const auto dim = GridView::Grid::dimension;
// open the H5 file
const auto filename = cfg["file"].as<std::string>();
const auto dataset = cfg["dataset"].as<std::string>();
const auto [data, shape] = read_h5_dataset(filename, dataset, log);
// read extensions and offset
const auto cfg_ext = cfg["extensions"];
const auto cfg_off = cfg["offset"];
if (cfg_ext and cfg_off)
{
const auto ext = cfg_ext.as<std::vector<RF>>();
const auto off = cfg_off.as<std::vector<RF>>();
if (shape.size() != dim) {
log->error("Expected {}-dimensional dataset. File: {}, "
"Dataset: {}, Dimensions: {}",
dim, filename, dataset, shape.size());
DUNE_THROW(Dune::IOError, "Invalid dataset for interpolator");
}
if (ext.size() != dim) {
log->error("Expected {}-dimensional sequence as "
"'extensions' at scaling section node: {}",
dim, node_name);
DUNE_THROW(Dune::IOError, "Invalid interpolator input");
}
if (off.size() != dim) {
log->error("Expected {}-dimensional sequence as 'offset' "
"at scaling section node: {}",
dim, node_name);
DUNE_THROW(Dune::IOError, "Invalid interpolator input");
}
// copy into correct data structure
Domain extensions;
std::copy(begin(ext), end(ext), extensions.begin());
Domain offset;
std::copy(begin(off), end(off), offset.begin());
// call other factory function
return create(
cfg["interpolation"].template as<std::string>(),
data,
shape,
extensions,
offset,
log
);
}
// extensions or offset not given
else
{
// call other factory function
return create(
cfg["interpolation"].template as<std::string>(),
data,
shape,
grid_view,
log
);
}
}
private:
/// Read a dataset from an H5 file
/** \param file_path Path to the H5 file
* \param dataset_name Name of the dataset
* \return Pair: 1D data, dataset shape.
*/
static std::pair<std::vector<RF>, std::vector<size_t>> read_h5_dataset (
const std::string& file_path,
const std::string& dataset_name,
const std::shared_ptr<spdlog::logger> log
)
{
// open file
H5File h5file(file_path, log);
// read the dataset
std::vector<RF> data;
std::vector<size_t> shape;
h5file.read_dataset(dataset_name, H5T_NATIVE_DOUBLE, data, shape);
std::reverse(begin(shape), end(shape));
return std::make_pair(data, shape);
}
};
} // namespace Dorie
......
......@@ -90,78 +90,11 @@ protected:
*/
void add_interpolator (const YAML::Node& cfg, const std::string node_name)
{
_log->trace("Adding interpolator from data node: {}", node_name);
const auto dim = Traits::dim;
// open the H5 file
const auto filename = cfg["file"].as<std::string>();
const auto dataset = cfg["dataset"].as<std::string>();
H5File h5file(filename, _log);
// read the dataset
std::vector<RF> data;
std::vector<size_t> shape;
h5file.read_dataset(dataset, H5T_NATIVE_DOUBLE, data, shape);
std::reverse(begin(shape), end(shape));
// read extensions and offset
const auto cfg_ext = cfg["extensions"];
const auto cfg_off = cfg["offset"];
if (cfg_ext and cfg_off)
{
const auto ext = cfg_ext.as<std::vector<RF>>();
const auto off = cfg_off.as<std::vector<RF>>();
if (shape.size() != dim) {
_log->error("Expected {}-dimensional dataset. File: {}, "
"Dataset: {}, Dimensions: {}",
dim, filename, dataset, shape.size());
DUNE_THROW(Dune::IOError, "Invalid dataset for interpolator");
}
if (ext.size() != dim) {
_log->error("Expected {}-dimensional sequence as "
"'extensions' at scaling section node: {}",
dim, node_name);
DUNE_THROW(Dune::IOError, "Invalid interpolator input");
}
if (off.size() != dim) {
_log->error("Expected {}-dimensional sequence as 'offset' "
"at scaling section node: {}",
dim, node_name);
DUNE_THROW(Dune::IOError, "Invalid interpolator input");
}
// copy into correct data structure
Domain extensions;
std::copy(begin(ext), end(ext), extensions.begin());
Domain offset;
std::copy(begin(off), end(off), offset.begin());
// build interpolator
this->_interpolators.emplace_back(
InterpolatorFactory<Traits>::create(
cfg["interpolation"].template as<std::string>(),
data,
shape,
extensions,
offset,
_log
)
);
}
// extensions or offset not given
else
{
// build interpolator
this->_interpolators.emplace_back(
InterpolatorFactory<Traits>::create(
cfg["interpolation"].template as<std::string>(),
data,
shape,
_grid_view,
_log
)
);
}
this->_interpolators.emplace_back(
InterpolatorFactory<Traits>::create(cfg,
node_name,
_grid_view,
_log));
}
};
......
......@@ -75,6 +75,15 @@ void test_ic_base (const std::shared_ptr<typename T::Grid> grid,
};
test_ic_against_function(grid, ic, test_f);
}
else if (type == "file") {
// define testing function: the "twos" dataset is expected to hold
// the constant value 2.0
auto test_f = [](const auto e, const auto x){
using Range = typename T::Scalar;
Range y = 2.0;
return y;
};
test_ic_against_function(grid, ic, test_f);
}
}
int main (int argc, char** argv)
......
include ${CMAKE_BINARY_DIR}/doc/default_files/config.ini
__name = ic
_initial_type = analytic, data | expand
__name = {_initial_type}
_asset_path = "${PROJECT_SOURCE_DIR}/test"
[grid]
......@@ -14,11 +15,21 @@ extensions = 1 1
file = none
[richards.initial]
type = analytic
type = {_initial_type}
quantity = matricHead
file = ${CMAKE_CURRENT_LIST_DIR}/scaling.h5
dataset = twos
interpolation = nearest
equation = -1+2*y
[transport.initial]
type = analytic
type = {_initial_type}
quantity = soluteConcentration
equation = -1+2*y
\ No newline at end of file
equation = -1+2*y
file = ${CMAKE_CURRENT_LIST_DIR}/scaling.h5
dataset = twos
interpolation = nearest
......@@ -21,6 +21,7 @@ struct InterpolatorTraits
static constexpr int dim = dimension;
using Domain = std::vector<double>;
using DF = double;
using RF = double;
};
/// Test the nearest neighbor interpolator.
......