Commit a81e1c9c authored by Lukas Riedel

Merge branch 'master' into 127-add-data-assimilation-interface-to-richardssimulation

parents 0785d781 99e71e5d
......@@ -98,6 +98,7 @@ build:system-tests: &build-tests
script:
- CMAKE_FLAGS="$CMAKE_FLAGS"
$DUNECONTROL --only=dorie configure
- $DUNECONTROL --only=dorie make $MAKE_FLAGS dorie-rfg
- $DUNECONTROL --only=dorie make $MAKE_FLAGS build_system_tests
- $DUNECONTROL --only=dorie make doc
artifacts:
......@@ -128,6 +129,7 @@ build:debug: &debug
-DCMAKE_BUILD_TYPE=Debug
-DCMAKE_CXX_FLAGS_DEBUG='-Werror'"
$DUNECONTROL --only=dorie configure
- $DUNECONTROL --only=dorie make $MAKE_FLAGS dorie-rfg
- $DUNECONTROL --only=dorie make $MAKE_FLAGS build_unit_tests
build:debug-clang:
......@@ -141,6 +143,7 @@ build:debug-clang:
-DCMAKE_CXX_COMPILER=clang++
-DCMAKE_CXX_FLAGS_DEBUG='-Werror'"
$DUNECONTROL --only=dorie configure
- $DUNECONTROL --only=dorie make $MAKE_FLAGS dorie-rfg
- $DUNECONTROL --only=dorie make $MAKE_FLAGS build_unit_tests
......
......@@ -40,6 +40,7 @@
* Coupling between transient models for water flow and solute transport !96
* Initial conditions generated from H5 input data !130
* Generic Python VTK file reader !143
* Linear interpolator for initial conditions and scaling fields !145
* Parameterizations for hydrodynamic dispersion in solute transport !141
* Generic Python VTK file reader !143, !150
......@@ -145,7 +146,8 @@
* Specifying scaling field `extensions` and `offset` is now optional !133
* Generalized initial condition specification in config file !129
* Structure and setup of Sphinx user docs !126
* Switch to stable `dune-randomfield` release branch !151
* Switch to stable `dune-randomfield` release branch !151, !153
* System tests for executing `dorie pfg` module !153
### Fixed
* Solver in `RichardsSimulation` was using the wrong time variable.
......
......@@ -153,8 +153,8 @@ adding an empty line, make text **bold** or ``monospaced``.
<parameter name="interpolation">
<definition> Interpolation type used for the data (``data`` type only).
</definition>
<values> nearest </values>
<suggestion> nearest </suggestion>
<values> nearest, linear </values>
<suggestion> linear </suggestion>
</parameter>
</category>
......
......@@ -170,7 +170,7 @@ adding an empty line, make text **bold** or ``monospaced``.
<parameter name="interpolation">
<definition> Interpolation type used for the data (``data`` type only).
</definition>
<values> nearest </values>
<values> nearest, linear </values>
<suggestion> nearest </suggestion>
</parameter>
</category>
......
......@@ -33,6 +33,7 @@ assignment and increment are based on the :doc:`public-api`.
manual/bcfile
manual/initial
manual/fluxes
manual/interpolators
public-api
.. toctree::
......
......@@ -14,7 +14,7 @@ long as the respective quantity has a unique transformation to the solver
solution quantity.
Initial condition input is controlled entirely via the
:doc:`Configuration File <man-config-file>`.
:doc:`Configuration File <config-file>`.
.. note::
The initial condition is projected onto the actual solution function space.
......@@ -74,6 +74,8 @@ They are controlled by the ``initial.type`` key and available for every model.
can be chosen using the setting ``initial.interpolation``. The input data
is automatically stretched to match the grid extensions.
.. note:: For ``FEorder > 0``, linear interpolation is recommended.
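As an illustration (not a complete configuration), these keys might be combined
as follows, assuming the usual ini-style key layout of the configuration file;
the dataset name ``initial_condition`` is purely hypothetical and the key for
the H5 file itself is omitted here:

.. code-block:: ini

    [initial]
    type = data
    dataset = initial_condition
    interpolation = linear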
Supported file extensions:
* ``.h5``: H5_ data file. ``initial.dataset`` may be a file-internal path
......
.. _man-interpolators:
Interpolators
=============
Interpolators are used at several points in DORiE to evaluate discrete input
data at any physical location on the grid.
Currently, they are used for the scaling fields of the
:doc:`Parameter File <parameter-file>` and when loading an initial condition
from data with the options given in the
:doc:`Configuration File <config-file>`.
.. note:: The type of interpolator may change how the input data is
interpreted. In particular, different interpolators require
different input dataset shapes for the same grid configuration.
Interpolator Types
------------------
Every interpolator assumes that the input data spans a rectangle (2D) or cuboid
(3D). In the case of initial conditions, the respective volume is stretched to
cover the entire grid. For scaling fields, users may specify the extensions and
offset of the volume, or opt for it to cover the entire grid by omitting the
respective keys in the parameter file.
.. object:: Nearest neighbor interpolator
* ``interpolation: nearest``
Interprets dataset values as *cell values*. No extrapolation is applied.
The lowest supported dataset dimensions are ``(1, 1)`` (single value
everywhere).
Use this interpolator for cell-centered data like
:ref:`scaling field values <man-parameter_scaling>` and initial conditions
for a finite volume solver.
.. tip:: To provide the exact scaling factors for each cell of a grid
of :math:`1 \times 10` cells, use the ``nearest`` interpolator
with a dataset of shape ``(10, 1)`` (dimensions are inverted).
The dataset values are the function values at the cell barycenters.
.. object:: Linear interpolator
* ``interpolation: linear``
Interprets dataset values as *vertex values* and linearly interpolates
between them. No extrapolation is applied. The lowest supported dataset
dimensions are ``(2, 2)`` (one value in each corner).
Use this interpolator if the input data should be interpreted as a
continuous function, such as initial conditions for a DG solver.
.. tip:: To provide the initial condition for a DG solver with finite
element polynomials of order :math:`k = 1` on a grid of
:math:`1 \times 10` cells, use the ``linear`` interpolator
with a dataset of shape ``(11, 2)`` (dimensions are inverted).
The dataset values are the function values at the grid vertices.
......@@ -54,12 +54,11 @@ of scaling.
Every ``scaling_section`` has the same keys: The H5 filepath is given by
``file``, and the internal dataset path by ``dataset``. You can then choose an
``interpolation`` method for the dataset, which requires you to insert values
``interpolation`` method for the dataset. You may optionally insert values
for the physical ``extensions`` of the dataset and the ``offset`` of its
(front) lower left corner from the origin. The latter two keys are optional.
If at least one of them is omitted, the inserted field is automatically
scaled to match the maximum grid extensions. This also works for irregular grid
shapes.
(front) lower left corner from the origin. If at least one of them is omitted,
the inserted field is automatically scaled to match the maximum grid
extensions. This also works for irregular grid shapes.
.. code-block:: yaml
......@@ -108,9 +107,10 @@ One or more datasets in possibly multiple HDF5_ files are required for creating
a medium with small-scale heterogeneities. The datasets must consist of
floating-point values and must have the same dimensionality as the grid.
Other than that, there is no restriction on their shape because they are
inserted into interpolators. The interpolators supply scaling factor values
based on positions on the grid and thus create a grid-independent
representation. Scaling field input works the same for any supported grid type.
inserted into :doc:`Interpolators <interpolators>`. The interpolators supply
scaling factor values based on positions on the grid and thus create a
grid-independent representation. Scaling field input works the same for any
supported grid type.
During setup, the program reads the interpolator values at the **barycenter**
of every grid cell on the **coarsest** grid configuration. These values are
......@@ -120,8 +120,7 @@ scaling factors apply in all its child cells.
Supported Parameter File Types
------------------------------
This section lists the available types for parameterizations, scalings, and
interpolators in a understandable format. You can also dig through the code
documentation below, or the code itself, to retrieve this information.
interpolators in an understandable format.
Richards Parameterizations
~~~~~~~~~~~~~~~~~~~~~~~~~~
......@@ -544,17 +543,6 @@ Miller Scaling
scale_miller:
# ...
.. _man-interpolators:
Interpolators
-------------
.. object:: Nearest neighbor interpolator
Interprets dataset values as cell values.
* ``interpolation: nearest``
.. _YAML: https://gettaurus.org/docs/YAMLTutorial/
.. _HDF5: https://www.h5py.org/
// -*- tab-width: 4; indent-tabs-mode: nil -*-
/** \file
*/
#ifdef HAVE_CONFIG_H
#include "config.h"
#include "config.h"
#endif
// common includes
#include <random>
#include <fstream>
#include <vector>
#include <fftw3.h>
#include <fftw3-mpi.h>
#include <time.h>
#include <hdf5.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <unistd.h>
#include <assert.h>
#include <sstream>
#include <cstdio>
#include <cerrno>
#include <cstring> // std::strerror
// DUNE includes
// Do not treat DUNE warnings as errors
#pragma GCC diagnostic push
#pragma GCC diagnostic warning "-Wall"
#include <dune/common/parallel/mpihelper.hh>
#include <dune/common/parametertree.hh>
#include <dune/common/parametertreeparser.hh>
#include <dune/common/fvector.hh>
#include <dune/geometry/type.hh>
#include <dune/grid/common/gridenums.hh>
#include <dune/pdelab/common/geometrywrapper.hh>
#pragma GCC diagnostic pop
#include <dune/common/exceptions.hh>
#include <dune/common/parallel/mpihelper.hh>
// dorie-rfg includes
#include <dune/randomfield/randomfield.hh>
/// Traits for the Random Field
template<unsigned int dimension>
/// Set up dummy traits required by RandomField (usually supplied by grid)
template<int dimension>
struct GridTraits
{
enum {dim = dimension};
using RangeField = double;
using Scalar = Dune::FieldVector<RangeField,1>;
using DomainField = double;
using Domain = Dune::FieldVector<DomainField,dim>;
enum {dim = dimension};
using RangeField = double;
using Scalar = Dune::FieldVector<RangeField, 1>;
using DomainField = double;
using Domain = Dune::FieldVector<DomainField, dim>;
};
int main(int argc, char** argv)
{
try{
//Initialize Mpi
Dune::MPIHelper::instance(argc, argv);
try{
//Initialize Mpi
Dune::MPIHelper::instance(argc, argv);
if (argc!=2)
DUNE_THROW(Dune::IOError,"No parameter file specified!");
const std::string inifilename = argv[1];
if (argc!=2)
DUNE_THROW(Dune::IOError,"No parameter file specified!");
const std::string inifilename = argv[1];
// Read ini file
Dune::ParameterTree inifile;
Dune::ParameterTreeParser ptreeparser;
ptreeparser.readINITree(inifilename, inifile);
const unsigned int dim = inifile.get<unsigned int>("grid.dimensions");
// Read ini file
Dune::ParameterTree inifile;
Dune::ParameterTreeParser ptreeparser;
ptreeparser.readINITree(inifilename, inifile);
const unsigned int dim = inifile.get<unsigned int>("grid.dimensions");
// Attempt to create output directory
const std::string outputPath
= inifile.get<std::string>("general.tempDir");
mkdir(outputPath.c_str(), S_IRWXU | S_IRWXG | S_IROTH | S_IXOTH);
int result = access(outputPath.c_str(), W_OK);
if (result != 0)
DUNE_THROW(Dune::IOError,"Output folder " << outputPath << " not writable");
// Attempt to create output directory
const std::string outputPath
= inifile.get<std::string>("general.tempDir");
const int status = mkdir(outputPath.c_str(),
S_IRWXU | S_IRWXG | S_IROTH | S_IXOTH);
// allow failure because directory exists
if (status != 0 && errno != EEXIST)
DUNE_THROW(Dune::IOError,
"Output folder " << outputPath << " not writable. "
"Error by system: " << std::strerror(errno));
const std::string outputFile = outputPath + "/field";
// fix output filename to ensure the file is found by Python frontend
const std::string outputFile = outputPath + "/field";
// standard values
inifile["stochastic.anisotropy"] = "axiparallel";
// standard values
inifile["stochastic.anisotropy"] = "axiparallel";
// extract seed
const unsigned int seed = inifile.get<unsigned int>("stochastic.seed");
// extract seed
const unsigned int seed = inifile.get<unsigned int>("stochastic.seed");
// Create RFG objects
switch(dim){
case 2:
{
using Traits = GridTraits<2>;
Dune::RandomField::RandomField<Traits,false,false> field(inifile);
field.generate(seed);
field.writeToFile(outputFile);
}
break;
case 3:
{
using Traits = GridTraits<3>;
Dune::RandomField::RandomField<Traits,false,false> field(inifile);
field.generate(seed);
field.writeToFile(outputFile);
}
break;
default:
DUNE_THROW(Dune::NotImplemented,"Only 2 and 3-dimensional fields are supported");
}
// Create RFG objects
if (dim == 2) {
using Traits = GridTraits<2>;
Dune::RandomField::RandomField<Traits> field(inifile);
field.generate(seed);
field.writeToFile(outputFile);
}
else if (dim == 3) {
using Traits = GridTraits<3>;
Dune::RandomField::RandomField<Traits> field(inifile);
field.generate(seed);
field.writeToFile(outputFile);
}
else {
DUNE_THROW(Dune::NotImplemented,
"Only 2 and 3-dimensional fields are supported");
}
return 0;
}
return 0;
}
catch (Dune::Exception &e)
{
std::cerr << "Dune reported error: " << e << std::endl;
return 1;
}
catch (...)
{
std::cerr << "Unknown exception thrown!" << std::endl;
throw;
return 1;
}
catch (Dune::Exception &e)
{
std::cerr << "Dune reported error: " << e << std::endl;
return 1;
}
catch (std::exception& e)
{
std::cerr << "Exception occurred: " << e.what() << std::endl;
return 1;
}
catch (...)
{
std::cerr << "Unknown exception thrown!" << std::endl;
throw;
return 1;
}
}
......@@ -61,7 +61,7 @@ public:
}
private:
std::shared_ptr<Interpolator<RF, T>> _interpolator;
std::shared_ptr<Interpolator<RF, dim>> _interpolator;
};
} // namespace Dorie
......
......@@ -7,6 +7,8 @@
#include <utility>
#include <numeric>
#include <algorithm>
#include <string>
#include <string_view>
#include <dune/common/fvector.hh>
#include <dune/common/exceptions.hh>
......@@ -28,14 +30,16 @@ namespace Dorie {
* \author Lukas Riedel
* \date 2018
*/
template<typename T, typename Traits>
template<typename DataType, int dim>
class Interpolator
{
protected:
using Domain = typename Traits::Domain;
static constexpr int dim = Traits::dim;
/// Coordinate Type
using DF = double;
/// Physical vector type
using Domain = Dune::FieldVector<DF, dim>;
const std::vector<T> _data; //!< contiguously stored data of the array
const std::vector<DataType> _data; //!< contiguously stored data of the array
const std::vector<size_t> _shape; //!< inverted shape of the original array
//! physical extensions of the dataset
const Domain _extensions;
......@@ -100,7 +104,7 @@ public:
/** \param pos Position to evaluate
* \return Value of the interpolated function
*/
virtual T evaluate (const Domain& pos) const = 0;
virtual DataType evaluate (const Domain& pos) const = 0;
};
/// A nearest-neighbor interpolator in 2D and 3D
......@@ -110,34 +114,25 @@ public:
*
* \ingroup Interpolators
*/
template<typename T, typename Traits>
class NearestNeighborInterpolator: public Interpolator<T, Traits>
template<typename DataType, int dim>
class NearestNeighborInterpolator: public Interpolator<DataType, dim>
{
private:
using Base = Interpolator<T, Traits>;
using Base = Interpolator<DataType, dim>;
using Domain = typename Base::Domain;
static constexpr int dim = Base::dim;
using DF = typename Traits::DF;
using DF = typename Base::DF;
static constexpr DF eps = 1e-9;
public:
template<typename Data, typename Shape, typename Domain1, typename Domain2>
NearestNeighborInterpolator (Data&& data,
Shape&& shape,
Domain1&& extensions,
Domain2&& offset) :
Base(std::forward<Data>(data),
std::forward<Shape>(shape),
std::forward<Domain1>(extensions),
std::forward<Domain2>(offset))
{ }
/// Re-use the base class constructor
using Base::Base;
/// Export the type of this interpolator
static inline std::string type = "nearest";
/// Evaluate the interpolator
T evaluate (const Domain& x) const override
DataType evaluate (const Domain& x) const override
{
const auto pos = this->sub_offset(x);
if (not this->inside_extensions(pos)) {
......@@ -168,6 +163,158 @@ public:
~NearestNeighborInterpolator() override = default;
};
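/* Usage sketch (an illustration, not part of the original header): assuming
 * the base class constructor takes the dataset, its inverted shape, the
 * physical extensions, and the offset, as the forwarding constructor removed
 * above suggests:
 *
 *   std::vector<double> data = {1.0, 2.0, 3.0, 4.0};  // 2x2 dataset
 *   std::vector<size_t> shape = {2, 2};               // inverted dataset shape
 *   Dune::FieldVector<double, 2> extensions(1.0);     // dataset spans the unit square
 *   Dune::FieldVector<double, 2> offset(0.0);         // no shift from the origin
 *   NearestNeighborInterpolator<double, 2> nn(data, shape, extensions, offset);
 *   const double value = nn.evaluate({0.25, 0.75});   // value of the nearest knot
 *
 * evaluate() throws if the queried position lies outside the dataset extensions.
 */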
/// Linear interpolator for 2D and 3D.
/** The implementation is based on articles
* https://en.wikipedia.org/wiki/Bilinear_interpolation
* and https://en.wikipedia.org/wiki/Trilinear_interpolation
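* In 2D, the value at local coordinates \f$(x_d, y_d) \in [0,1]^2\f$ between the
* four surrounding knots \f$c_{ij}\f$ is
* \f$c = c_{00}(1-x_d)(1-y_d) + c_{10}\,x_d(1-y_d) + c_{01}(1-x_d)\,y_d + c_{11}\,x_d y_d\f$.
* In 3D, the eight knot values are first reduced to four by interpolating along
* one axis, followed by the bilinear step above.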
*
* \ingroup Interpolators
* \author Lukas Riedel
* \date 2019
*/
template<typename DataType, int dim>
class LinearInterpolator : public Interpolator<DataType, dim>
{
private:
using Base = Interpolator<DataType, dim>;
using Domain = typename Base::Domain;
using DF = typename Base::DF;
using MultiIdx = std::array<size_t, dim>;
/// Get the position vector in mesh units
/** \param pos Global position vector
* \return Position vector in mesh units
*/
Domain get_pos_unit (const Domain& pos) const
{
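// scale so that the dataset knots lie at integer positions 0 .. _shape[i]-1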
Domain pos_unit;
for (size_t i = 0; i < pos.size(); ++i) {
pos_unit[i] = pos[i] * (this->_shape[i] - 1)
/ this->_extensions[i];
}
return pos_unit;
}
/// Return the position difference vector in mesh units
/** The difference is calculated by using the (front) lower left corner
* as origin.
* \param pos_unit Position vector in mesh units, see get_pos_unit()
* \param indices The multi index of the (front) lower left corner
* \return Position difference vector in mesh units
*/
Domain get_pos_unit_diff (const Domain& pos_unit, const MultiIdx& indices)
const
{
Domain pos_diff;
for (size_t i = 0; i < pos_unit.size(); ++i) {
pos_diff[i] = pos_unit[i] - indices[i];
}
return pos_diff;
}
/// Get the indices for accessing the data knots
/** \param pos_unit The position in mesh units
* \return Indices in every dimension of the mesh
*/
std::pair<MultiIdx, MultiIdx> get_knot_indices (
const Domain& pos_unit
) const
{
// round mesh unit positions up and down
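// clamping keeps positions exactly on the dataset boundary inside the
// valid knot range [0, shape-1]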
MultiIdx idx_lower, idx_upper;
for (size_t i = 0; i < idx_lower.size(); ++i) {
idx_lower[i] = std::max(std::floor(pos_unit[i]), 0.0);
idx_upper[i] = std::min<size_t>(std::ceil(pos_unit[i]),
this->_shape[i]-1);
}
return std::make_pair(idx_lower, idx_upper);
}
/// Transform a multi index into a index for accessing the data array
/** \param indices The multi index to transform
* \return The index to query the data set #_data
*/
size_t to_index (const MultiIdx& indices) const
{
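// flatten the multi index; the first entry of the (inverted) shape runs fastest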
size_t index = indices[0] + indices[1] * this->_shape[0];
if constexpr (dim == 3) {
index += indices[2] * this->_shape[1] * this->_shape[0];
}
return index;
}
public:
/// Export the type of this interpolator
static inline constexpr std::string_view type = "linear";
/// Re-use the base class constructor
using Base::Base;
/// Delete this interpolator
~LinearInterpolator () override = default;
/// Evaluate the data at a global position
DataType evaluate (const Domain& x) const override
{
const auto pos = this->sub_offset(x);
if (not this->inside_extensions(pos)) {
DUNE_THROW(Exception, "Querying interpolator data outside its "
"extensions!");
}
// compute indices and position vectors in mesh units
const auto pos_unit = get_pos_unit(pos);
const auto [idx_lower, idx_upper] = get_knot_indices(pos_unit);
const auto pos_unit_diff = get_pos_unit_diff(pos_unit, idx_lower);
const auto& data = this->_data;
DataType c_00, c_01, c_10, c_11;
DF y_d, z_d;
// actual computation
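// strategy: in 3D, interpolate along x first to reduce the eight knot values
// to four; the remaining bilinear step below is then shared with the 2D case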
if constexpr (dim == 2) {
c_00 = data[to_index(idx_lower)];
c_01 = data[to_index({idx_lower[0], idx_upper[1]})];
c_10 = data[to_index({idx_upper[0], idx_lower[1]})];
c_11 = data[to_index(idx_upper)];
// use y, z here for common computation after if-clause
y_d = pos_unit_diff[0]; // actually x
z_d = pos_unit_diff[1]; // actually y
}
else {
DataType c_000, c_001, c_010, c_100, c_011, c_110, c_101, c_111;
c_000 = data[to_index(idx_lower)];
c_100 = data[to_index({idx_upper[0], idx_lower[1], idx_lower[2]})];
c_010 = data[to_index({idx_lower[0], idx_upper[1], idx_lower[2]})];
c_001 = data[to_index({idx_lower[0], idx_lower[1], idx_upper[2]})];
c_011 = data[to_index({idx_lower[0], idx_upper[1], idx_upper[2]})];
c_110 = data[to_index({idx_upper[0], idx_upper[1], idx_lower[2]})];
c_101 = data[to_index({idx_upper[0], idx_lower[1], idx_upper[2]})];
c_111 = data[to_index(idx_upper)];
DF x_d = pos_unit_diff[0];
y_d = pos_unit_diff[1];
z_d = pos_unit_diff[2];
c_00 = c_000 * (1 - x_d) + c_100 * x_d;
c_01 = c_001 * (1 - x_d) + c_101 * x_d;
c_10 = c_010 * (1 - x_d) + c_110 * x_d;
c_11 = c_011 * (1 - x_d) + c_111 * x_d;
}
const DataType c_0 = c_00 * (1 - y_d) + c_10