Commit fbb7f2fa authored by Lukas Riedel

Merge branch '92-add-a-logging-framework-for-command-line-output' into 'master'

Resolve "Add a logging framework for command line output"

Closes #92

See merge request dorie/dorie!106
parents 5819493f 00b67f30
......@@ -10,6 +10,8 @@ variables:
CMAKE_FLAGS:
-DDUNE_PYTHON_VIRTUALENV_SETUP=True
-DDUNE_PYTHON_ALLOW_GET_PIP=True
# Recursively load submodules by default
GIT_SUBMODULE_STRATEGY: recursive
MAKE_FLAGS:
-j $CPUS_MULTICORE
RUN_IN_DUNE_ENV: $CI_PROJECT_DIR/build-cmake/run-in-dune-env
......@@ -111,7 +113,7 @@ build:debug: &debug
script:
- CMAKE_FLAGS="
$CMAKE_FLAGS
-DCMAKE_BUILD_TYPE=Debug
-DCMAKE_BUILD_TYPE=Debug
-DCMAKE_CXX_FLAGS_DEBUG='-Werror'"
$DUNECONTROL --only=dorie configure
- $DUNECONTROL --only=dorie make $MAKE_FLAGS build_unit_tests
......@@ -122,7 +124,7 @@ build:debug-clang:
script:
- CMAKE_FLAGS="
$CMAKE_FLAGS
-DCMAKE_BUILD_TYPE=Debug
-DCMAKE_BUILD_TYPE=Debug
-DCMAKE_C_COMPILER=clang
-DCMAKE_CXX_COMPILER=clang++
-DCMAKE_CXX_FLAGS_DEBUG='-Werror'"
......@@ -161,12 +163,14 @@ test:unit-tests:
- ./run-in-dune-env gcovr
--root ../
--exclude ../dune/dorie/test
--exclude ../plugins
--html --html-details
-o dune/dorie/test/coverage/coverage.html
# run again for coverage report in GitLab
- ./run-in-dune-env gcovr
--root ../
--exclude ../dune/dorie/test
--exclude ../plugins
coverage: '/^TOTAL.*\s+(\d+\%)$/'
artifacts:
name: "$CI_JOB_NAME"
......@@ -226,4 +230,4 @@ deploy:sphinx-docs:
-P $CI_PROJECT_DIR/build-cmake/doc/html
environment:
name: sphinx-docs
url: https://dorie-doc.netlify.com/
\ No newline at end of file
url: https://dorie-doc.netlify.com/
[submodule "plugins/vendor/spdlog"]
path = plugins/vendor/spdlog
url = https://github.com/gabime/spdlog.git
......@@ -24,6 +24,7 @@
which solves the transport equation with a finite volume scheme for given
grid functions of water content and velocity.
* Infrastructure for the input of Miller scaling fields. !110
* Add logging framework 'spdlog' as submodule !106
### Changed
* `Simulation` is renamed `RichardsSimulation` and moved to
......@@ -115,6 +116,8 @@
It dumps its data into a YAML file which is then loaded for writing
the default configuration files and the cheat sheet.
* Use logging framework 'spdlog' for all terminal output !106
### Fixed
* Solver in `RichardsSimulation` was using the wrong time variable.
[!116](https://ts-gitlab.iup.uni-heidelberg.de/dorie/dorie/merge_requests/116)
......
......@@ -41,6 +41,7 @@ dune_enable_all_packages()
dune_require_cxx_standard(MODULE "dorie" VERSION 14)
# add subdirectories
add_subdirectory(plugins/vendor)
add_subdirectory("m4")
add_subdirectory("cmake/modules")
add_subdirectory("python")
......
......@@ -73,6 +73,7 @@ by CI tests.
| MPI | | Tested with OpenMPI 2.1.1
| SuperLU | 5.2 |
| [yaml-cpp](https://github.com/jbeder/yaml-cpp) | >= 5.2.0 |
| [spdlog](https://github.com/gabime/spdlog) | 1.1.0 | Included as Git Submodule
| [dune-common](https://gitlab.dune-project.org/core/dune-common) | releases/2.6
| [dune-geometry](https://gitlab.dune-project.org/core/dune-geometry) | releases/2.6
| [dune-grid](https://gitlab.dune-project.org/core/dune-grid) | releases/2.6
......@@ -132,7 +133,17 @@ If you installed [Anaconda](https://conda.io/docs/user-guide/install/download.ht
**Parallel runs without these two packages are possible but not supported!**
4. Clone the DUNE modules and DORiE into a suitable folder on your machine. Use `git checkout` to switch to the correct branches (see above). Enter the parent folder, and call
4. Clone the DUNE modules into a suitable folder on your machine.
Use `git checkout` to switch to the correct branches (see above).
5. Clone DORiE into the same folder.
DORiE includes
[Git Submodules](https://git-scm.com/book/en/v2/Git-Tools-Submodules),
which require you to add the `--recurse-submodules` option to the
`git clone` command. Switch to the desired branch or tag.
6. Enter the parent folder, and call
CMAKE_FLAGS="-DDUNE_PYTHON_VIRTUALENV_SETUP=True -DDUNE_PYTHON_ALLOW_GET_PIP=True" ./dune-common/bin/dunecontrol all
......
......@@ -103,9 +103,9 @@ adding an empty line, make text **bold** or ``monospaced``.
<category name="adaptivity">
<parameter name="policy">
<definition> Switches the target policy to do adaptive grid refinement
(h-adaptivity). If enabled, an unstructured grid manager with higher
computational cost is used when using rectangular / cubic grids.
<definition> Switches the target policy to do adaptive grid refinement
(h-adaptivity). If enabled, an unstructured grid manager with higher
computational cost is used when using rectangular / cubic grids.
</definition>
<values> none, waterFlux </values>
<suggestion> none </suggestion>
......@@ -175,10 +175,10 @@ adding an empty line, make text **bold** or ``monospaced``.
</category>
<category name="output">
<parameter name="verbose">
<definition> Overall verbosity of the program </definition>
<suggestion> 0 </suggestion>
<values> 0, 1, 2, 3 </values>
<parameter name="logLevel">
<definition> Logging level of the core functions. </definition>
<suggestion> info </suggestion>
<values> trace, debug, info, warning, error, critical </values>
</parameter>
</category>
......
......@@ -32,10 +32,10 @@ adding an empty line, make text **bold** or ``monospaced``.
<dorie>
<category name="output">
<parameter name="verbose">
<definition> Verbosity of the Richards simulation </definition>
<suggestion> 0 </suggestion>
<values> 0, 1, 2, 3 </values>
<parameter name="logLevel">
<definition> Logging level of the Richards solver. </definition>
<suggestion> info </suggestion>
<values> trace, debug, info, warning, error, critical </values>
</parameter>
<parameter name="outputPath">
......
......@@ -32,6 +32,12 @@ adding an empty line, make text **bold** or ``monospaced``.
<dorie>
<category name="output">
<parameter name="logLevel">
<definition> Logging level of the Richards solver. </definition>
<suggestion> info </suggestion>
<values> trace, debug, info, warning, error, critical </values>
</parameter>
<parameter name="verbose">
<definition> Overall verbosity of the program </definition>
<suggestion> 0 </suggestion>
......
......@@ -5,9 +5,9 @@ if(dune-testtools_FOUND)
endif()
add_executable("dorie" dorie.cc)
dune_target_link_libraries(dorie dorie-impl ${DUNE_LIBS})
dune_target_link_libraries(dorie dorie-impl ${DUNE_LIBS} spdlog)
# enable setting operator scheme from config file
if(EXPERIMENTAL_DG_FEATURES)
target_compile_definitions("dorie" PUBLIC -DEXPERIMENTAL_DG_FEATURES)
endif()
\ No newline at end of file
endif()
......@@ -54,7 +54,7 @@ private:
//! Config file tree
const Dune::ParameterTree& _config;
//! Verbosity of this class
const int _verbose;
const std::shared_ptr<spdlog::logger> _log;
//! The mode for reading the grid.
GridMode _grid_mode;
//! Typedef for the created grid mapper
......@@ -65,15 +65,17 @@ public:
const Dune::MPIHelper& helper):
_helper(helper),
_config(config),
_verbose(_config.get<int>("output.verbose"))
_log(Dorie::get_logger(log_base))
{
const auto grid_type = _config.get<std::string>("grid.gridType");
if (grid_type == "rectangular")
_grid_mode = GridMode::rectangular;
else if (grid_type == "gmsh")
_grid_mode = GridMode::gmsh;
else
else {
_log->error("Unsupported grid type: {}", grid_type);
DUNE_THROW(IOError, "Unsupported grid type!");
}
check_parallel_allowed();
}
......@@ -99,6 +101,8 @@ public:
grid, element_index_map, boundary_index_map);
}
else {
_log->error("DORiE expects to work on UGGrid for "
"grid type: gmsh");
DUNE_THROW(IOError, "Incompatible GridTypes and grid creation "
<< "algorithms");
}
......@@ -123,9 +127,9 @@ public:
{
if (_helper.size() > 1
&& not std::is_same_v<YaspGrid<Grid::dimension>, Grid>) {
std::cout << "WARNING! DORiE's solver does not support parallel "
<< "execution on unstructured grids. This will likely result "
<< "in a fatal exception." << std::endl;
_log->warn("DORiE's solver does not support parallel "
"execution on unstructured grids. This will likely "
"result in a fatal exception (later on)");
return false;
}
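The condition guarding parallel runs above can be sketched in isolation. This is a minimal, hypothetical stand-in: `YaspLike` and `UnstructuredLike` are placeholders for `YaspGrid` and `UGGrid`, and `parallel_allowed` mirrors the check in `check_parallel_allowed`:

```cpp
#include <type_traits>

// Hypothetical placeholders for YaspGrid (structured) and UGGrid
struct YaspLike {};
struct UnstructuredLike {};

// Mirrors check_parallel_allowed: parallel execution is only
// supported on the structured grid type
template<typename Grid>
bool parallel_allowed (const int world_size)
{
    return world_size <= 1 || std::is_same_v<Grid, YaspLike>;
}
```

As in the diff, the check only warns and returns `false` rather than aborting, since a single-process run on an unstructured grid remains valid.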
......@@ -142,19 +146,11 @@ private:
std::tuple<std::shared_ptr<Grid>, std::vector<int>, std::vector<int>>
create_gmsh_grid () const
{
Dune::Timer timer;
ios_base_all_saver restore(std::cout);
const int level = _config.get<int>("grid.initialLevel");
const std::string meshfilename = _config.get<std::string>("grid.gridFile");
if (_verbose > 0 && _helper.rank()==0){
std::cout << "GRID SETUP:" << std::endl;
std::cout << " Reading Gmsh file " << meshfilename;
if (_verbose < 3)
std::cout << " quietly" << std::endl;
else
std::cout << std::endl << "--- GMSH READER OUTPUT ---" << std::endl;
}
_log->info("Creating an unstructured grid from GMSH input file: {}",
meshfilename);
Dune::GridFactory<Grid> factory;
std::vector<int> boundary_index_map;
......@@ -164,22 +160,13 @@ private:
meshfilename,
boundary_index_map,
element_index_map,
_verbose > 2 ? true : false
false // verbose
);
}
auto grid = std::shared_ptr<Grid>(factory.createGrid());
grid->globalRefine(level);
if(_verbose>0 && _helper.rank()==0){
if (_verbose > 2)
std::cout << "--- GMSH READER OUTPUT END ---" << std::endl;
std::cout << " Global Refinement level: " << level << std::endl;
if (_verbose > 1) {
std::cout << std::setprecision(4) << std::scientific;
std::cout << "::: grid setup time" << std::setw(12) << timer.elapsed() << std::endl;
}
}
_log->debug(" Applying global refinement of level: {}", level);
grid->globalRefine(level);
return std::make_tuple(grid, element_index_map, boundary_index_map);
}
......@@ -193,29 +180,18 @@ private:
create_rect_grid () const
{
Dune::Timer timer;
ios_base_all_saver restore(std::cout);
const auto level = _config.get<int>("grid.initialLevel");
const auto upperRight = _config.get<Dune::FieldVector<double, _dim>>("grid.extensions");
const auto elements = _config.get<std::array<unsigned int, _dim>>("grid.cells");
_log->info("Creating a rectangular grid in {}D", Grid::dimension);
_log->debug(" Grid extensions: {}", to_string(upperRight));
_log->debug(" Grid cells: {}", to_string(elements));
auto grid = grid_cube_construction<Grid>(upperRight, elements);
grid->globalRefine(level);
if(_verbose > 0 && _helper.rank()==0){
std::cout << "GRID SETUP:" << std::endl;
std::cout << " Creating Rectangular Grid in " << _dim << " Dimensions." << std::endl;
std::cout << " Grid Extensions: ";
for (unsigned int i=0; i<_dim; i++)
std::cout << upperRight[i] << " ";
std::cout << std::endl;
std::cout << " Grid Cells: ";
for (unsigned int i=0; i<_dim; i++)
std::cout << elements[i] << " ";
std::cout << std::endl;
std::cout << " Global Refinement level: " << level << std::endl;
std::cout << std::setprecision(4) << std::scientific;
std::cout << "::: grid setup time" << std::setw(12) << timer.elapsed() << std::endl;
}
_log->debug(" Applying global refinement of level: {}", level);
grid->globalRefine(level);
// get the element index map from an H5 file
auto [element_index_map, cells] = read_element_index_map_from_file();
......@@ -295,7 +271,10 @@ private:
1,
std::multiplies<int>());
// fill index map with the global id and copy over the cells
_log->debug("Mapping global medium index '{}' to entire grid",
global_id);
element_index_map.resize(total_size, global_id);
std::copy(elements.begin(), elements.end(), cells.begin());
......@@ -304,9 +283,12 @@ private:
// load heterogeneous mapping
auto dataset_name = _config.get<std::string>("grid.mappingFileDataset");
_log->debug("Loading medium mapping from file. "
"Filename: {}, Dataset: {}",
file_path, dataset_name);
// create H5 file reader
H5File file(file_path, _verbose);
H5File file(file_path);
// Read the data
file.read_dataset(dataset_name,
......@@ -320,12 +302,18 @@ private:
// check if cells has correct size and extensions
if (cells.size() != Grid::dimension) {
_log->error("Mapping dataset dimensions do not match "
"grid dimensions. Expected: {}, Read: {}",
Grid::dimension, cells.size());
DUNE_THROW(IOError, "Mapping dataset has wrong dimensions!");
}
for (size_t i = 0; i < _dim; ++i) {
if (cells[i] != elements[i]) {
DUNE_THROW(IOError, "Mapping dataset extensions don't match"
<< " the grid!");
_log->error("Mapping dataset extensions do not match grid "
"extensions. Expected: {}, Read: {}",
to_string(elements), to_string(cells));
DUNE_THROW(IOError, "Mapping dataset extensions don't match "
"the grid");
}
}
......
......@@ -9,6 +9,8 @@
#include <hdf5.h>
#include <dune/dorie/common/logging.hh>
namespace Dune {
namespace Dorie {
......@@ -17,8 +19,10 @@ class H5File
{
private:
//! Verbosity of this class
const int _verbose;
//! Type of the logger for brevity
using Logger = std::shared_ptr<spdlog::logger>;
//! The logger of this instance
const Logger _log;
//! ID of the H5 file
hid_t _file_id;
//! ID of the currently opened group
......@@ -26,14 +30,19 @@ private:
public:
/// Open the H5 file and its base group.
/** \param file_path Path to the H5 file
* \param verbose Verbosity of the output
/**
* \param file_path Path to the H5 file
* \param log The logger to use for this instance,
* defaults to retrieving the base logger.
*/
explicit H5File(const std::string& file_path, const int verbose):
_verbose(verbose),
explicit H5File(const std::string& file_path,
const Logger log=get_logger(log_base)):
_log(log),
_file_id(-1),
_group_id(-1)
{
_log->info("Opening H5 file: {}", file_path);
// set up property list for collective I/O
hid_t h5_plist_id = H5Pcreate(H5P_FILE_ACCESS);
H5Pset_fapl_mpio(h5_plist_id, MPI_COMM_WORLD, MPI_INFO_NULL);
......@@ -49,7 +58,8 @@ public:
// check for errors regardless of build type
if (_file_id < 0) {
throw std::runtime_error("Cannot open H5 file " + file_path);
_log->error("Failed to open H5 file: {}", file_path);
throw std::runtime_error("Cannot open H5 file");
}
// open the base group
......@@ -59,6 +69,8 @@ public:
/// Close the H5 file when this object is destroyed
~H5File ()
{
_log->trace("Closing H5 file");
// close the opened group
herr_t status = H5Gclose(_group_id);
assert(status > -1);
......@@ -81,9 +93,11 @@ public:
}
// open the new group
_log->trace("Opening H5 group: {}", group_path);
_group_id = H5Gopen(_file_id, group_path.c_str(), H5P_DEFAULT);
if (_group_id < 0) {
throw std::runtime_error("Cannot open H5 group " + group_path);
_log->error("Failed to open H5 group: {}", group_path);
throw std::runtime_error("Error opening H5 group");
}
}
......@@ -113,20 +127,19 @@ public:
std::vector<ext_t>& shape
)
{
if(_verbose > 1) {
std::cout << "Reading H5 parameter field " << dataset_path
<< std::endl;
}
_log->info("Reading H5 dataset: {}", dataset_path);
// open a subgroup if the dataset path is nested
const auto dataset_name = open_nested_dataset(dataset_path);
// open the dataset
_log->trace("Opening H5 dataset: {}", dataset_name);
hid_t dataset_id = H5Dopen(_group_id,
dataset_name.c_str(),
H5P_DEFAULT);
if (dataset_id < 0) {
throw std::runtime_error("Cannot open dataset " + dataset_path);
_log->error("Failed to open H5 dataset: {}", dataset_path);
throw std::runtime_error("Error opening H5 dataset");
}
// get the dataspace
......@@ -135,6 +148,7 @@ public:
// get the dimension (2-d or 3-d)
const hsize_t arr_dim = H5Sget_simple_extent_ndims(dataspace_id);
_log->debug(" Dataset dimensions: {}", arr_dim);
// get the size of the problem
std::vector<hsize_t> dims(arr_dim, 0);
......@@ -142,6 +156,9 @@ public:
dims.data(),
0);
assert(status > -1);
// log dataset extensions as read
_log->debug(" Dataset extensions: {}", to_string(dims));
const auto local_size = std::accumulate(dims.begin(),
dims.end(),
......@@ -178,8 +195,9 @@ public:
);
if (status < 0) {
throw std::runtime_error("Error reading data from dataset "
+ dataset_path);
_log->error("Failed to read data from H5 dataset: {}",
dataset_path);
throw std::runtime_error("Error reading H5 dataset");
}
// close the identifiers
......
......@@ -12,6 +12,8 @@
#include <dune/common/fvector.hh>
#include <dune/common/exceptions.hh>
#include <dune/dorie/common/logging.hh>
namespace Dune {
namespace Dorie {
......@@ -174,23 +176,44 @@ struct InterpolatorFactory
{
/// Create the interpolator.
/** \param type The type of interpolator to use
* \param args Arguments passed to the interpolator constructor
/** Use perfect forwarding for the data.
* \param type The type of interpolator to use
* \param data The data array for the interpolator
* \param shape The shape of the original dataset
* \param extensions The physical extensions to map the data to
* \param offset The offset of the field with respect to the origin
* \param log The logger for the created object, defaults to
* Dune::Dorie::log_base
* \return Shared pointer to the interpolator
*/
template<typename DataType, typename... Args>
static auto create (const std::string& type, Args&&... args)
-> std::shared_ptr<Interpolator<DataType, Traits>>
template<typename Data, typename Shape, typename Domain>
static auto create (
const std::string& type,
Data&& data,
Shape&& shape,
Domain&& extensions,
Domain&& offset,
const std::shared_ptr<spdlog::logger> log=get_logger(log_base)
)
-> std::shared_ptr<Interpolator<
typename std::remove_reference_t<Data>::value_type,
Traits>>
{
using NNInterp = NearestNeighborInterpolator<DataType, Traits>;
log->debug("Creating interpolator of type: {}", type);
using data_t = typename std::remove_reference_t<Data>::value_type;
using NNInterp = NearestNeighborInterpolator<data_t, Traits>;
if (type == NNInterp::type) {
return std::make_shared<NNInterp>(
std::forward<Args>(args)...
std::forward<Data>(data),
std::forward<Shape>(shape),
std::forward<Domain>(extensions),
std::forward<Domain>(offset)
);
}
else {
DUNE_THROW(Dune::NotImplemented, "Interpolator " + type + "is not "
"available!");
log->error("Unknown interpolator type: {}", type);
DUNE_THROW(Dune::NotImplemented, "Unknown interpolator type");
}
}
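The string-dispatch with perfect forwarding used by `InterpolatorFactory::create` can be sketched without the DORiE types. The class names (`InterpolatorBase`, `NearestNeighbor`) and the factory function below are illustrative stand-ins, not the actual DORiE API; only the deduction of the element type from the forwarded container follows the factory above:

```cpp
#include <memory>
#include <stdexcept>
#include <string>
#include <type_traits>
#include <utility>
#include <vector>

// Hypothetical stand-ins for the interpolator hierarchy
template<typename T>
struct InterpolatorBase { virtual ~InterpolatorBase () = default; };

template<typename T>
struct NearestNeighbor : InterpolatorBase<T>
{
    static constexpr const char* type = "nearestNeighbor";
    std::vector<T> data;
    explicit NearestNeighbor (std::vector<T> d) : data(std::move(d)) {}
};

// Dispatch on a type string and forward the data; the element type is
// deduced from the container, as in the factory above
template<typename Data>
auto create_interpolator (const std::string& type, Data&& data)
    -> std::shared_ptr<InterpolatorBase<
           typename std::remove_reference_t<Data>::value_type>>
{
    using data_t = typename std::remove_reference_t<Data>::value_type;
    if (type == NearestNeighbor<data_t>::type)
        return std::make_shared<NearestNeighbor<data_t>>(
            std::forward<Data>(data));
    throw std::invalid_argument("Unknown interpolator type: " + type);
}
```

Deducing the data type from the forwarded argument, instead of requiring an explicit `DataType` template parameter, is what allows the call site to stay unchanged when the stored value type changes.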
......
#ifndef DUNE_DORIE_LOGGING_HH
#define DUNE_DORIE_LOGGING_HH
#include <memory>
#include <string>
#include <spdlog/spdlog.h>
#include <spdlog/sinks/stdout_color_sinks.h>
#include <dune/common/exceptions.hh>
#include <dune/common/parallel/mpihelper.hh>
namespace Dune {
namespace Dorie {
/**
* \addtogroup Logging
* @{
*/
/// The name of the basic/common logger
inline const std::string log_base = "base";
/// The name of the Richards model logger
inline const std::string log_richards = "RIC";
/// The name of the Transport model logger
inline const std::string log_transport = "SOL";
/// Wrapper around the function spdlog::get, which throws exceptions
/** \param name The name of the logger to retrieve
* \throw InvalidStateException The logger does not exist
*/
inline std::shared_ptr<spdlog::logger> get_logger (const std::string name)
{
auto logger = spdlog::get(name);
if (not logger) {
DUNE_THROW(InvalidStateException, "Logger " << name <<
" does not exist");
}
return logger;
}
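The behavior of this wrapper can be sketched without spdlog as a map-backed registry. The names `Logger`, `registry`, and `get_logger_sketch` below are hypothetical; spdlog maintains such a global registry internally behind `spdlog::get`:

```cpp
#include <map>
#include <memory>
#include <stdexcept>
#include <string>

// Illustrative stand-in for spdlog's global logger registry
struct Logger { std::string name; };

std::map<std::string, std::shared_ptr<Logger>> registry;

// Mimics Dune::Dorie::get_logger: look up by name, throw if absent
std::shared_ptr<Logger> get_logger_sketch (const std::string& name)
{
    const auto it = registry.find(name);
    if (it == registry.end())
        throw std::runtime_error("Logger " + name + " does not exist");
    return it->second;
}
```

Throwing here instead of returning a null pointer turns a misconfigured logger name into an immediate, descriptive error rather than a later segfault on `_log->info(...)`.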
/// Restrict the following helper to this translation unit
namespace
{
/// Create a logger or retrieve it if it already exists
/** \param name The name of the logger (also used for lookup)
* \param warn_on_exist Issue a warning if the logger already exists.
*/
std::shared_ptr<spdlog::logger> create_or_get (
const std::string name,
const bool warn_on_exist=true)
{
auto logger = spdlog::get(name);
if (not logger) {
logger = spdlog::stdout_color_mt(name);
}
else if (warn_on_exist) {
logger->warn("Skipping creation of logger '{}' because "
"it already exists.", name);
}
return logger;
}
}
/// Create a logger with a certain name and log level.
/** If created in parallel, reduce log level for loggers that are not
* on the primary process (rank 0). Sets new logging levels if the
* logger(s) already exist(s).
* \param name Name of the logger
* \param helper The Dune MPIHelper
* \param level The log level of the (primary) logger.
* \param level_par The log level of the secondary loggers, if the
* program runs in parallel.
*/
inline std::shared_ptr<spdlog::logger> create_logger (
const std::string name,
const Dune::MPIHelper& helper,
const spdlog::level::level_enum level = spdlog::level::warn,
const spdlog::level::level_enum level_par = spdlog::level::err
)
{
// create the logger if it does not already exist
auto logger = create_or_get(name);
// Set to secondary level if logger runs on secondary process
if (helper.rank() == 0) {
logger->set_level(level);
}
else {
logger->set_level(level_par);
}
// If run in parallel, add process ID to pattern
if (helper.size() > 1) {
// "[HH:MM:SS.mmm level] (Proc ID) [logger] <message>"
logger->set_pattern("%^[%T.%e %L]%$ (%P) [%n] %v");
}
else {
// "[HH:MM:SS.mmm level] [logger] <message>"
logger->set_pattern("%^[%T.%e %L]%$ [%n] %v");
}
return logger;
}
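The rank-dependent setup in `create_logger` can be factored into two pure functions for illustration. This is a sketch without spdlog; the pattern strings are copied verbatim from the function above, while the function names are hypothetical:

```cpp
#include <string>

// Only the primary process (rank 0) uses the full log level; all
// other ranks fall back to the quieter parallel level
std::string select_level (const int rank,
                          const std::string& level,
                          const std::string& level_par)
{
    return rank == 0 ? level : level_par;
}

// Parallel runs add the process ID "(%P)" to each message so output
// from different ranks can be told apart
std::string select_pattern (const int world_size)
{
    if (world_size > 1)
        return "%^[%T.%e %L]%$ (%P) [%n] %v";
    return "%^[%T.%e %L]%$ [%n] %v";
}
```

Silencing secondary ranks by default keeps duplicate messages out of the terminal while still surfacing errors from every process.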
/// Create the basic DORiE logger with a certain log level and fixed name.
/** If created in parallel, reduce log level for loggers that are not
* on the primary process (rank 0). Sets new logging levels if the
* logger(s) already exist(s). The logger automatically receives the name
* Dune::Dorie::log_base and can be retrieved with its value.
* \param helper The Dune MPIHelper
* \param level The log level of the (primary) logger.
* \param level_par The log level of the secondary loggers, if the
* program runs in parallel.
*/
inline std::shared_ptr<spdlog::logger> create_base_logger (
const Dune::MPIHelper& helper,
const spdlog::level::level_enum level = spdlog::level::warn,
const spdlog::level::level_enum level_par = spdlog::level::err
)
{
// retrieve the usual logger
auto logger = create_logger(log_base,
helper,
level,
level_par);
// Set different pattern without logger name
// If run in parallel, add process ID to pattern
if (helper.size() > 1) {