Publications

Publications last year

2024

Assessing and Improving the Robustness of Bayesian Evidential Learning in One Dimension for Inverting Time-Domain Electromagnetic Data: Introducing a New Threshold Procedure

Authors: Arsalan Ahmed, Lukas Aigner, Hadrien Michel, Wouter Deleersnyder, David Dudal, Adrian Flores Orozco, Thomas Hermans

Type: A1 Journal Article in Water

Full text

Abstract:

Imagine you’re trying to figure out what’s hidden underneath a giant blanket without lifting it. Instead of guessing based on a few peeks under the blanket (like drilling holes in the ground), scientists have ways to ‘feel’ what’s beneath from the surface using geophysical methods like “time-domain electromagnetic data”, which are much cheaper and less harmful to the environment. But the catch is, these ‘feels’ can be interpreted in many ways, making it tricky to know exactly what’s under there.

This study talks about a smart method, kind of like using an insightful guide, to make better guesses about what’s hidden under our ‘blanket’—the Earth’s surface. This guide, called Bayesian evidential learning, helps us to understand the uncertainties in our guesses, especially when we’re looking at how things might change over time and space underground, like tracking unwanted saltwater getting into freshwater areas in Vietnam.

The cool part? The study shows that with some clever tricks (“the introduction of a new threshold”), we can avoid super expensive or complex computations and still get trustworthy results. This means we can take better care of our environment and resources without breaking the bank or causing unnecessary damage.


Quantifying salinity in heterogeneous coastal aquifers through ERT and IP: Insights from laboratory and field investigations

Authors: Diep Cong-Thi, Linh Pham Dieu, David Caterina, Xavier De Pauw, Huyen Dang Thi, Huu Hieu Ho, Frédéric Nguyen, Thomas Hermans

Type: A1 Journal Article in Journal of Contaminant Hydrology

Full text

Abstract:

Electrical resistivity tomography (ERT) and induced polarization (IP) are considered sensitive methods for distinguishing saltwater-bearing and clay-rich sediments in coastal aquifers. To interpret the ERT data, we rely on a petrophysical relationship that accounts for the contribution of surface conductivity to the total bulk conductivity.
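For reference, the petrophysical relationship alluded to here is commonly written as follows (a standard form with a linear surface-conductivity term; the exact formulation used in the paper may differ):

```latex
% Bulk conductivity as measured by ERT, split into an electrolytic and a surface term
\sigma_b = \frac{\sigma_w}{F} + \sigma_s
% \sigma_b : bulk conductivity of the sediment
% \sigma_w : pore-water conductivity (the salinity proxy of interest)
% F        : formation factor
% \sigma_s : surface conductivity (dominated by clay content)
```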

Firstly, sedimentary samples collected from co-located boreholes were classified according to their particle size distribution and analyzed in the laboratory using spectral induced polarization (SIP) under controlled salinity conditions to derive their formation factors, surface conductivity, and normalized chargeability. Secondly, the deduced thresholds were applied in the field to distinguish clay-bearing sediments from brackish sandy sediments.

Application of this approach in the Luy River catchment revealed variations in formation factors across the various unconsolidated sedimentary classes, ranging from 4.0 to 8.9 for coarse-grained sand and clay-bearing mixtures, respectively. Clay content is detected where the normalized chargeability exceeds 1.0 mS/m. Clay-bearing sediments were found to be predominantly distributed in discontinuous small lenses, intercalating with sandy layers, which highlights the heterogeneous nature of these coastal aquifers (Diep Cong-Thi et al., 2024).
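To illustrate how such laboratory-derived thresholds translate into a field classification, here is a minimal sketch; the 1.0 mS/m threshold comes from the abstract, while the function and variable names are hypothetical and not the authors' code.

```python
import numpy as np

# Hypothetical sketch: classify inverted cells as clay-bearing vs. brackish sandy
# sediments using a normalized-chargeability threshold (1.0 mS/m, from the abstract).
MN_THRESHOLD_mS_PER_M = 1.0

def classify_cells(normalized_chargeability_mS_per_m):
    """Return 'clay-bearing' where normalized chargeability exceeds the threshold,
    'sand' otherwise (a simplification of the workflow described in the abstract)."""
    mn = np.asarray(normalized_chargeability_mS_per_m)
    return np.where(mn > MN_THRESHOLD_mS_PER_M, "clay-bearing", "sand")

# Example with made-up inverted values (mS/m):
print(classify_cells([0.3, 1.4, 0.8, 2.1]))  # ['sand' 'clay-bearing' 'sand' 'clay-bearing']
```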

We hope that these methods can be widely applied to heterogeneous clay-bearing aquifers in other coastal regions.

This research is funded by VLIR-UOS and the Belgian development cooperation through the grant VN2019TEA494A103, and the Special Research Fund (BOF), Ghent University.


Response to soil compaction of the electrical resistivity tomography, induced polarisation, and electromagnetic induction methods: a case study in Belgium

Authors: Danial Mansourian, Adriaan Vanderhasselt, Wim Cornelis and Thomas Hermans

Type: A1 Journal Article in Soil Research

Full text

Abstract:

Context: Soil compaction acts at different scales and is challenging to measure at the field scale.

Aims: To evaluate soil compaction under a controlled traffic experiment using three different geophysical methods.

Methods: Electrical resistivity tomography (ERT), electromagnetic induction (EMI), and induced polarization (IP) were selected to map soil compaction. Two different ERT arrays and EMI geometries were selected with different spacings. The influences of configuration, electrode spacing, and the depth of investigation index (DOI) were evaluated. Soil physical properties were measured in the laboratory and in the field. Error models were developed to assess the accuracy of the ERT profiles and were later correlated with the EMI and soil physical results.
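As context for the error models mentioned above, a common way to build them for ERT (assumed here for illustration, not necessarily the exact procedure of this study) is to fit a linear relation between normal-reciprocal misfits and resistance magnitude:

```python
import numpy as np

# Sketch of a linear ERT error model |R_normal - R_reciprocal| ~ a + b*|R|,
# a common practice for weighting data in the inversion (an assumption, not
# necessarily the authors' exact procedure).
def fit_linear_error_model(r_normal, r_reciprocal):
    r_mean = 0.5 * (np.abs(r_normal) + np.abs(r_reciprocal))
    misfit = np.abs(r_normal - r_reciprocal)
    b, a = np.polyfit(r_mean, misfit, 1)  # slope b (relative part), intercept a (absolute part)
    return a, b

# Example with synthetic resistances (ohm):
rng = np.random.default_rng(0)
r_n = rng.uniform(1.0, 100.0, 200)
r_r = r_n * (1 + 0.03 * rng.standard_normal(200)) + 0.05 * rng.standard_normal(200)
a, b = fit_linear_error_model(r_n, r_r)
print(f"absolute error a = {a:.3f} ohm, relative error b = {b:.3%}")
```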

Key results: Penetration resistance measurements identified a compacted layer at 25 to 35 cm depth, with a maximum value of 5 MPa under fixed tracks and a bulk density of 1.52 Mg m⁻³, while the lowest values were 1.4 MPa and 1.36 Mg m⁻³. The dipole-dipole array with 10 cm spacing was more accurate, both with respect to the soil properties and in locating the zones of high resistivity. The IP method identified chargeability anomalies at the same depth as the resistivity anomalies, possibly indicating a similar origin. The EMI method was less successful in accurately determining the locations of the conductive areas.

Conclusions: A clear relationship between the absolute value of the resistivity/conductivity signals and the level of compaction was not found, yet patterns of lateral variations in resistivity were identified.

Implications: Further studies are needed to establish a robust relationship between soil compaction and geophysical signals.


Integrated methodology to link geochemical and geophysical-lab data in a geophysical investigation of a slag heap for resource quantification

Authors: Itzel Isunza Manrique, Thomas Hermans, David Caterina, Damien Jougnot, Benoît Mignon, Antoine Masse and Frédéric Nguyen

Type: A1 Journal Article in Journal of Environmental Management

Full text

Abstract:

The increasing need to find alternative stocks of critical raw materials drives us to revisit the residues generated during the former production of mineral and metallic raw materials. Geophysical methods contribute to the sustainable characterization of metallurgical residues by providing inferences on their composition, zonation and volume. Nevertheless, more quantitative approaches are needed to link geochemical or mineralogical analyses with the geophysical data. In this contribution, we describe a methodology that integrates geochemical and geophysical laboratory measurements to interpret geophysical field data by solving a classification problem. The final aim is to estimate the volumes of different types of materials to assess the potential resource recovery. We illustrate this methodology with a slag heap composed of residues from a former iron and steel factory. First, we carried out a 3D field acquisition using electrical resistivity tomography (ERT) and induced polarization (IP), based on which a sampling survey was designed. We conducted laboratory measurements of ERT, IP, spectral induced polarization (SIP), and X-ray fluorescence analysis, based on which four groups of different chemical composition were identified. Then we carried out a 3D probabilistic classification of the field data, based on 2D kernel density estimators (one for each group) fitted to the inverted data collocated with the samples. The estimated volumes based on the classification model were 4.17 × 10³ m³ ± 12 %, 1.888 × 10⁵ m³ ± 12 %, 59.4 × 10³ m³ ± 19 %, and 2.30 × 10⁴ m³ ± 21 % for the groups ordered with increasing metallic content. The uncertainty ranges were derived by comparing the volumes with and without the probabilities associated with the classification. We found that a representative sampling and the definition of the KDE bandwidths are defining elements in the classification and, ultimately, in the estimation of volumes. This methodology is suitable for quantitatively interpreting geophysical data in terms of the geochemical composition of the materials, integrating uncertainties both in the classification and in the estimation of volumes. Furthermore, several crucial elements in the investigation of metallurgical residues could be applied in a real case study, e.g., geophysical field acquisition, sampling, and lab measurements.
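A minimal sketch of this kind of KDE-based probabilistic classification and volume estimation is given below, using scipy; the two-property feature space and all names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Sketch: fit one 2D kernel density estimator per material group on the inverted
# properties (e.g. log10-resistivity and chargeability) collocated with the samples,
# then classify every cell of the field model probabilistically and derive volumes.
def classify_and_estimate_volumes(samples_by_group, cells, cell_volume_m3):
    """samples_by_group: dict {group: (2, n_samples) array}; cells: (2, n_cells) array."""
    kdes = {g: gaussian_kde(x) for g, x in samples_by_group.items()}
    dens = np.vstack([kdes[g](cells) for g in kdes])           # (n_groups, n_cells)
    probs = dens / dens.sum(axis=0, keepdims=True)              # per-cell group probabilities
    labels = np.array(list(kdes))[np.argmax(probs, axis=0)]     # most probable group per cell
    # Two volume estimates per group: hard classification vs. probability-weighted.
    vol_hard = {g: cell_volume_m3 * float(np.sum(labels == g)) for g in kdes}
    vol_soft = {g: cell_volume_m3 * float(probs[i].sum()) for i, g in enumerate(kdes)}
    return labels, vol_hard, vol_soft
```

Note that scipy's default bandwidth (Scott's rule) is used here, whereas the abstract stresses that the choice of KDE bandwidths strongly affects the classification and, ultimately, the volumes.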


2023

A new Bayesian framework for the interpretation of geophysical data

Authors: Hadrien Michel, Frédéric Nguyen, Thomas Hermans

Type: Dissertation

Full text

Abstract: Providing images of the subsurface from ground-based datasets is at the heart of the geophysicist’s work. Multiple approaches have been applied to tackle this task. Most of the time, it is performed in a deterministic framework, meaning that for a given dataset, a single model is provided to explain the data. However, those deterministic approaches lack the ability to provide reasonable uncertainty estimates that take into account the non-uniqueness of the solution, noise in the data, and modelling error. To provide precise and accurate models of the subsurface along with uncertainty, geophysicists use probabilistic approaches. Those approaches sample the ensemble of a priori possible models (the prior) in order to extract models that can reasonably explain the datasets (the posterior). Such approaches, even though superior in terms of the reliability of their results, are rarely applied in practice due to their significant computational requirements. In this manuscript, the aim is to propose a new Bayesian framework to interpret those geophysical datasets. This new framework, called Bayesian Evidential Learning, promises to enable a fast, precise, and accurate estimation of the uncertainty. The framework is applied and adapted to 1D geophysical datasets (BEL1D). The adapted framework presents several advantages when compared to classical probabilistic approaches: from fast computations due to the limited number of forward runs needed, to providing insight into the experiment sensitivity and the validity of the prior. Moreover, it benefits from its construction as a machine learning algorithm, leading to quasi-instantaneous models of uncertainty.
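For readers unfamiliar with BEL, the core prediction step can be sketched with scikit-learn and scipy as below; this is a simplified illustration of the general framework (dimension reduction, canonical correlation, and conditional density estimation on prior Monte Carlo runs), not the BEL1D code itself.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import CCA
from sklearn.linear_model import LinearRegression
from scipy.stats import gaussian_kde

def bel_posterior(d_prior, m_prior, d_obs, n_comp=3, n_post=1000, seed=0):
    """Sketch of Bayesian Evidential Learning: learn a direct data-model relationship
    from prior Monte Carlo runs, then condition on the observed data.
    d_prior: (n, n_data) simulated data; m_prior: (n, n_param) prior models; d_obs: (n_data,)."""
    rng = np.random.default_rng(seed)
    pca_d, pca_m = PCA(n_comp).fit(d_prior), PCA(n_comp).fit(m_prior)
    D, M = pca_d.transform(d_prior), pca_m.transform(m_prior)
    cca = CCA(n_components=n_comp).fit(D, M)
    Dc, Mc = cca.transform(D, M)
    dc_obs = cca.transform(pca_d.transform(d_obs.reshape(1, -1)))[0]
    # Sample each canonical model component from its density conditioned on the
    # observed canonical data component (canonical pairs treated as independent).
    post_c = np.empty((n_post, n_comp))
    for i in range(n_comp):
        kde = gaussian_kde(np.vstack([Dc[:, i], Mc[:, i]]))
        grid = np.linspace(Mc[:, i].min(), Mc[:, i].max(), 400)
        pdf = kde(np.vstack([np.full_like(grid, dc_obs[i]), grid]))
        post_c[:, i] = rng.choice(grid, size=n_post, p=pdf / pdf.sum())
    # Map canonical scores back to the parameter space via a regression learned on the prior.
    back = LinearRegression().fit(Mc, M)
    return pca_m.inverse_transform(back.predict(post_c))
```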


Advancing measurements and representations of subsurface heterogeneity and dynamic processes: towards 4D hydrogeology

Authors: Thomas Hermans (UGent), Pascal Goderniaux, Damien Jougnot, Jan H. Fleckenstein, Philip Brunner, Frédéric Nguyen, Niklas Linde, Johan Alexander Huisman, Olivier Bour, Jorge Lopez Alvis, et al.

Type: A1 Journal Article in Hydrology and Earth System Sciences

Full text

Abstract: Essentially all hydrogeological processes are strongly influenced by the subsurface spatial heterogeneity and the temporal variation of environmental conditions, hydraulic properties, and solute concentrations. This spatial and temporal variability generally leads to effective behaviors and emerging phenomena that cannot be predicted from conventional approaches based on homogeneous assumptions and models. However, it is not always clear when, why, how, and at what scale the 4D (3D + time) nature of the subsurface needs to be considered in hydrogeological monitoring, modeling, and applications. In this paper, we discuss the interest and potential for the monitoring and characterization of spatial and temporal variability, including 4D imaging, in a series of hydrogeological processes: (1) groundwater fluxes, (2) solute transport and reaction, (3) vadose zone dynamics, and (4) surface-subsurface water interactions. We first identify the main challenges related to the coupling of spatial and temporal fluctuations for these processes. We then highlight recent innovations that have led to significant breakthroughs in the high-resolution space-time imaging, characterization, monitoring, and modeling of these spatial and temporal fluctuations. We finally propose a classification of processes and applications at different scales according to their need and potential for high-resolution space-time imaging. We thus advocate a more systematic characterization of the dynamic and 3D nature of the subsurface for a series of critical processes and emerging applications. This calls for the validation of 4D imaging techniques at highly instrumented observatories and the harmonization of open databases to share hydrogeological data sets in their 4D components.


An integrated geoarchaeological approach to the investigation of multi-period prehistoric settlements – the case of Neolithic Drenovac

Authors: Philippe De Smedt (UGent), Charles French, Timothy Kinnaird, Tonko Rajkovača, Aleksandar Milekić, Petros Chatzimpaloglou, Jeroen Verhegge (UGent), Thomas Hermans (UGent), Gaston Mendoza Veirana (UGent), Djurdja Obradović, et al.

Type: U Conference

Full text

Abstract: A multi-method geoarchaeological investigation was performed to reconstruct a multi-phase Neolithic settlement. Invasive and non-invasive surveys showed potential for providing archaeological and environmental landscape data in this complex setting. Large-area geophysical surveys showed potential for deriving stratigraphic information.


Assessing the potential of low‑transmissivity aquifers for aquifer thermal energy storage systems: a case study in Flanders (Belgium)

Authors: Luka Tas (UGent), David Simpson and Thomas Hermans (UGent)

Type: A1 Journal Article in Hydrogeology Journal

Full text

Abstract: The Member States of the European Union pledged to reduce greenhouse gas emissions by 80–95% by 2050. Shallow geothermal systems might substantially contribute by providing heating and cooling in a sustainable way through seasonally storing heat and cold in the shallow ground (<200 m). When the minimum yield associated with the installation of a cost-effective aquifer thermal energy storage (ATES) system cannot be met, borehole thermal energy storage, relying mostly on the thermal conductivity of the ground, is proposed. However, for large-scale applications, this requires the installation of hundreds of boreholes, which entails a large cost and high disturbance of the underground. In such cases, ATES systems can nevertheless become interesting. This paper presents a case study performed on a Ghent University campus (Belgium), where the feasibility of ATES in an area with a low transmissivity was determined. The maximum yield of the aquifer was estimated at 5 m3/h through pumping tests. Although this low yield was attributed to the fine grain size of the aquifer, membrane filtering index tests and long-term injection tests revealed that the clogging risk was limited. A groundwater model was used to optimize the well placement. It was shown that a well arrangement in a checkerboard pattern was most effective to optimize the hydraulic efficiency while maintaining the thermal recovery efficiency of the ATES system. Hence, for large-scale projects, efficient thermal energy storage can also be achieved using a (more cost-effective) ATES system even in low-permeability sediments.
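To make the notion of thermal recovery efficiency concrete, a minimal sketch of the standard definition follows; the heat capacity constant is a textbook value and the example numbers are placeholders, not results from this study.

```python
# Sketch: thermal recovery efficiency of an ATES system over one storage cycle,
# defined as recovered thermal energy divided by injected thermal energy
# (a standard metric; the values below are placeholders, not results from the study).
RHO_C_WATER = 4.18e6  # volumetric heat capacity of water, J/(m3*K)

def recovery_efficiency(v_injected_m3, dT_injected_K, v_extracted_m3, dT_extracted_K):
    e_in = RHO_C_WATER * v_injected_m3 * dT_injected_K     # J injected relative to ambient
    e_out = RHO_C_WATER * v_extracted_m3 * dT_extracted_K  # J recovered relative to ambient
    return e_out / e_in

# Example: 5 m3/h pumped for 120 days of injection, the same volume extracted
# later at a reduced temperature difference:
v = 5 * 24 * 120
print(f"recovery efficiency = {recovery_efficiency(v, 8.0, v, 6.0):.2f}")  # -> 0.75
```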


Comparison of soft indicator and Poisson kriging for the noise-filtering and downscaling of areal data: application to daily COVID-19 incidence rates

Authors: Pierre Goovaerts, Thomas Hermans (UGent), Peter Goossens and Ellen Van De Vijver (UGent)

Type: A1 Journal Article in ISPRS International Journal of Geo-Information

Full text

Abstract: This paper addresses two common challenges in analyzing spatial epidemiological data, specifically disease incidence rates recorded over small areas: filtering noise caused by small local population sizes and deriving estimates at different spatial scales. Geostatistical techniques, including Poisson kriging (PK), have been used to address these issues by accounting for spatial correlation patterns and neighboring observations in smoothing and changing spatial support. However, PK has a limitation in that it can generate unrealistic rates that are either negative or greater than 100%. To overcome this limitation, an alternative method that relies on soft indicator kriging (IK) is presented. The performance of this method is compared to PK using daily COVID-19 incidence rates recorded in 2020–2021 for each of the 581 municipalities in Belgium. Both approaches are used to derive noise-filtered incidence rates for four different dates of the pandemic at the municipality level and at the nodes of a 1 km spacing grid covering the country. The IK approach has several attractive features: (1) the lack of negative kriging estimates, (2) the smaller smoothing effect, and (3) the better agreement with observed municipality-level rates after aggregation, in particular when the original rate was zero.
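To make the indicator-kriging idea tangible, the sketch below shows the indicator coding step and a simple back-transform to an expected rate; the thresholds, units, and back-transform choice are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Sketch: code observed incidence rates into indicators at a set of thresholds.
# Kriging these indicators yields estimates of Prob[rate <= threshold], which are
# bounded in [0, 1] by construction (the paper uses *soft* indicators that also
# account for small population sizes; hard indicators are shown for simplicity).
THRESHOLDS = np.array([5.0, 10.0, 25.0, 50.0, 100.0])  # illustrative, per 100,000

def indicator_code(rates):
    """(n_thresholds, n_areas) matrix of indicators I(rate <= threshold)."""
    rates = np.asarray(rates, dtype=float)
    return (rates[None, :] <= THRESHOLDS[:, None]).astype(float)

def etype_estimate(ccdf_probs):
    """E-type rate from kriged probabilities of not exceeding each threshold,
    using class mid-points (a simple back-transform choice)."""
    cdf = np.clip(np.sort(ccdf_probs), 0, 1)
    class_prob = np.diff(np.concatenate(([0.0], cdf, [1.0])))
    mids = np.concatenate(([THRESHOLDS[0] / 2],
                           (THRESHOLDS[:-1] + THRESHOLDS[1:]) / 2,
                           [THRESHOLDS[-1]]))
    return float(np.dot(class_prob, mids))

print(indicator_code([3.0, 42.0])[:, 1])            # indicators for a rate of 42
print(etype_estimate([0.1, 0.3, 0.6, 0.8, 0.95]))   # expected rate from a kriged ccdf
```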


Editorial: geomechanics and induced seismicity for underground energy and resources exploitation

Authors: Longjun Dong, Wenzhuo Cao and Thomas Hermans (UGent)

Type: Editorial in Frontiers in Earth Science

Full text

Keywords: General Earth and Planetary Sciences, geomechanics, microseismic monitoring, underground engineering, induced seismicity, geo-energy, mineral resources


Flexible quasi-2D inversion of time-domain AEM data, using a wavelet-based complexity measure

Authors: Wouter Deleersnyder (UGent), Benjamin Maveau, Thomas Hermans (UGent) and David Dudal (UGent)

Type: A1 Journal Article in Geophysical Journal International

Full text

Abstract: Regularization methods improve the stability of ill-posed inverse problems by introducing some a priori characteristics of the solution, such as smoothness or sharpness. In this contribution, we propose a multidimensional, scale-dependent, wavelet-based ℓ1-regularization term to cure the ill-posedness of the airborne (time-domain) electromagnetic induction inverse problem. The regularization term is flexible, as it can recover blocky, smooth, and tunable in-between inversion models, based on a suitable wavelet basis function. For each orientation, a different wavelet basis function can be used, introducing an additional relative regularization parameter. We propose a calibration method to determine (an educated initial guess for) this relative regularization parameter, which reduces the need to optimize for this parameter and, consequently, keeps the overall computation time under control. We apply our novel scheme to a time-domain airborne electromagnetic data set in a Belgian saltwater intrusion context, but the scheme could equally apply to any other 2D or 3D geophysical inverse problem.
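The flavour of a scale-dependent wavelet ℓ1 measure can be sketched in a few lines with PyWavelets; this is a simplified 1D illustration (the paper's scheme is multidimensional and orientation-dependent), and the scale weights are an assumption.

```python
import numpy as np
import pywt

# Sketch: evaluate a wavelet-domain l1 regularization term for a 1D conductivity
# model. The choice of wavelet controls whether blocky (e.g. 'haar') or smooth
# (e.g. 'db4') models are favoured; the scale-dependent weights are illustrative.
def wavelet_l1(model, wavelet="db4", level=3, scale_weights=None):
    coeffs = pywt.wavedec(model, wavelet, level=level)  # [approx, detail_L, ..., detail_1]
    details = coeffs[1:]
    if scale_weights is None:
        scale_weights = np.ones(len(details))
    return sum(w * np.abs(c).sum() for w, c in zip(scale_weights, details))

# Example: a blocky model is "cheaper" under 'haar' than under a smoother wavelet.
blocky = np.concatenate([np.full(32, 0.02), np.full(32, 0.2)])  # conductivities in S/m
print(wavelet_l1(blocky, "haar"), wavelet_l1(blocky, "db4"))
```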


Footprint of fresh submarine groundwater discharge in the Belgian coastal zone: an overview study

Authors: Marieke Paepen (UGent), Kristine Walraevens (UGent) and Thomas Hermans (UGent)

Type: A2 Journal Article in The Leading Edge

Full text

Abstract: The presence of fresh groundwater is not limited to land; it also extends offshore and discharges as submarine groundwater discharge (SGD). The freshwater component of SGD (fresh submarine groundwater discharge [FSGD]) can be detected using geophysical techniques that are sensitive to salinity, such as resistivity measurements. However, these measurements are often limited to either the land or marine realm, neglecting the land-marine interface. In this study, we focus on this gap by combining onshore and offshore techniques to assess variability in the FSGD footprint near the Belgian coastline through electrical resistivity tomography and continuous resistivity profiling. The difficult working conditions of the highly dynamic North Sea make this offshore survey one of the first of its kind. The footprint varies from limited outflow on the upper beach (e.g., Wenduine) to discharge around and below the low water line (e.g., De Panne, Oostduinkerke, and Knokke-Heist) in the studied areas. The occurrence, footprint, and quantity of SGD seem to be controlled by the presence and size of dune formations that constitute freshwater resources along the shore. Heterogeneity can also be a determining factor in FSGD location.


Ground validation of satellite-based precipitation estimates over poorly gauged catchment: the case of the Drâa basin in Central-East Morocco

Authors: Athmane Khettouch, Mohammed Hssaisoune, Thomas Hermans (UGent), Aziz Aouijil and Lhoussaine Bouchaou

Type: U Journal Article in Mediterranean Geoscience Reviews

Full text

Abstract: In most ungauged areas, the lack of precipitation information limits the accuracy of water balance approaches. The ungauged Drâa river basin (DRB) in central-eastern Morocco is one of the ten driest basins worldwide and lacks an adequate rainfall dataset for water resources management. This study assesses five satellite precipitation datasets (P-datasets) with high spatial resolution (0.0375°–0.1°) and a long period of record (> 40 years), namely CHIRPS V2.0, MSWEP V2.8, PERSIANN-CCS-CDR, TAMSAT V3.1, and ERA5-Land, with reference to ground rain observations, based on continuous, categorical, and volumetric indices, and at various elevations, rainfall intensities, and temporal scales (i.e., monthly, seasonal, and sub-seasonal). Moreover, the ability to detect extreme precipitation events and the extent to which the conventional rain gauges increase the magnitude of error were also quantified. ERA5-Land, followed by MSWEP V2.8, showed the best statistical scores across the different temporal, intensity, and elevation scales, while CHIRPS V2.0, PERSIANN-CCS-CDR, and TAMSAT V3.1 provide poor estimations of rainfall with high sensitivity to the complexity of the terrain. CHIRPS V2.0 was more efficient in detecting wet months, but some large detected events were not confirmed by ground observations. The magnitude of error tends to decrease during summer and for precipitation within the range of 1–12.5 mm for all P-datasets. However, all five products underestimate the frequency of dry months and overestimate high precipitation intensities. Our findings not only recommend the ERA5-Land and MSWEP V2.8 datasets as alternatives to rain gauges but also describe the limitations of CHIRPS V2.0, TAMSAT V3.1, and PERSIANN-CCS-CDR for such hydrological studies.
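As an illustration of the categorical indices used in this type of validation, a minimal sketch with the standard definitions of POD, FAR, and CSI follows; the 1 mm/day rain/no-rain threshold is an assumption, not necessarily the paper's choice.

```python
import numpy as np

# Sketch: standard categorical skill scores comparing a satellite precipitation
# product against rain-gauge observations at a rain/no-rain threshold.
def categorical_scores(gauge_mm, satellite_mm, threshold_mm=1.0):
    obs = np.asarray(gauge_mm) >= threshold_mm
    est = np.asarray(satellite_mm) >= threshold_mm
    hits = np.sum(obs & est)
    misses = np.sum(obs & ~est)
    false_alarms = np.sum(~obs & est)
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    return pod, far, csi

# Example with made-up daily series (mm/day):
print(categorical_scores([0, 5, 12, 0, 2, 0], [0, 3, 0, 1, 4, 0]))
```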


Machine learning for Bayesian experimental design in the subsurface

Authors: Robin Thibaut (UGent)

Type: Dissertation

Full text

Abstract: Accurate modeling of the subsurface, a complex and heterogeneous environment that plays a crucial role in the Earth’s water cycle, is challenging due to sparse and incomplete data. We can reduce the uncertainty associated with subsurface predictions, such as groundwater flow and contaminant transport, by conducting additional observations and measurements in the subsurface. However, practical and economic considerations frequently limit the number of measurements and their locations, such as land occupation, which may limit the number of wells that can be drilled. In this dissertation, we propose simulation-driven methods to reduce uncertainty in subsurface predictions by identifying the most informative data sets to gather. Our method, which is based on Bayesian optimal experimental design and machine learning, determines the nature and location of these data sets, which can include measurements of groundwater levels, temperature, and other parameters collected through active or passive sensing methods such as pumping tests, tracer tests, and geophysical surveys. This dissertation is the first to use Bayesian Evidential Learning (BEL) for optimal experimental design, allowing for the optimization of data source locations and the comparison of the utility of different data sources. BEL is a framework for prediction that combines Monte Carlo sampling and machine learning in order to learn a direct relationship between predictor and target variables generated by a simulation model. We demonstrate the efficacy of our methods in three groundwater modeling case studies: (i) wellhead protection area delineation, (ii) an aquifer thermal energy storage monitoring system, and (iii) groundwater-surface water interaction. The case studies show that our approach can significantly reduce the uncertainty in subsurface predictions and guide further subsurface exploration. The first case study, in particular, uses the Traveling Salesman Problem to introduce a novel approach to wellhead protection area delineation. The second case study, which compares well and geophysical data for temperature monitoring, introduces a new method for combining observations from multiple data sources in a latent space of the original data. The third case study introduces the Probabilistic Bayesian neural network (PBNN) method to BEL and transitions from a static experimental design framework to a sequential experimental design framework, which estimates groundwater-surface water interaction fluxes from temperature data. We have also developed a Python package, SKBEL, that implements our methods and can be used for a variety of Earth Science applications. Overall, this dissertation demonstrates the utility of BEL for optimal experimental design in groundwater modeling, highlights the potential of BEL for predictive modeling in Earth Sciences, and opens up new avenues for data and simulation-driven subsurface modeling.
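One simple way to read the "most informative data set" criterion is as an expected reduction of prediction variance; the toy sketch below ranks candidate measurements with a linear-Gaussian proxy and is only meant to convey the idea, not to reproduce the SKBEL implementation.

```python
import numpy as np

# Toy sketch of data-worth ranking (not the SKBEL implementation): for each candidate
# measurement, estimate how much of the prediction variance it would explain within
# the prior Monte Carlo ensemble, here via a squared correlation (a linear-Gaussian
# proxy for the expected uncertainty reduction).
def rank_candidate_measurements(d_prior, h_prior):
    """d_prior: (n_runs, n_candidates) simulated candidate data; h_prior: (n_runs,) prediction."""
    d = d_prior - d_prior.mean(axis=0)
    h = h_prior - h_prior.mean()
    r2 = (d.T @ h) ** 2 / (np.sum(d ** 2, axis=0) * np.sum(h ** 2))
    return np.argsort(r2)[::-1], r2  # candidates sorted from most to least informative

# Example: candidate 1 is strongly related to the prediction, candidate 0 is noise.
rng = np.random.default_rng(1)
h = rng.normal(size=500)
d = np.column_stack([rng.normal(size=500), h + 0.3 * rng.normal(size=500)])
order, r2 = rank_candidate_measurements(d, h)
print(order, np.round(r2, 2))  # candidate 1 ranked first
```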


Quantitative interpretation of geoelectric inverted data with a robust probabilistic approach

Authors: Itzel Isunza Manrique, David Caterina, Frédéric Nguyen and Thomas Hermans (UGent)

Type: A1 Journal Article in Geophysics

Full text

Abstract: The nonuniqueness of the solution to the geophysical inverse problem can lead to misinterpretation while characterizing the subsurface. To tackle this situation, ground-truth information from excavations and wells can be used to improve, calibrate, and interpret inverted models. We refer to quantitative interpretation as the decision analysis based on probability theory, which is focused on solving a classification problem. First, we present a probabilistic approach to classify the different types of materials or “categories” observed in borehole logs using multiple data sources: inverted 2D electrical resistivity tomography and induced polarization data and the positions (x, z) of these boreholes. Then, using Bayes’ rule and the permanence of ratios, we compute the joint conditional probabilities of each category, given all data sources in the whole inverted model domain. We validate this approach with synthetic data modeling for a complex anthropogenic-geologic scenario and using real data from an old landfill. Afterward, we assess the performance of the probabilistic approach for classification and compare it with the machine learning algorithm of multilayer perceptron (MLP). In addition, we analyze the effect that the different data sources and the number of boreholes (and their distribution) have on both approaches with the synthetic case. Our results indicate that the MLP performance is better for delineating the different categories where the lateral contrasts in the synthetic resistivity model are small. Nevertheless, the classification obtained with the probabilistic approach using real data seems to provide a more geologically realistic distribution. We conclude that the probabilistic approach is robust for classifying categories when high spatial heterogeneity is expected and when ground-truth data are limited or not sparsely distributed. Finally, this approach can be easily extended to integrate multiple geophysical methods and does not require the optimization of hyperparameters as for MLP.
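For readers unfamiliar with the permanence-of-ratios approximation mentioned above, a minimal two-source sketch is given below (the standard form commonly attributed to Journel, 2002); the example numbers are arbitrary, and extending to more data sources multiplies in additional odds ratios.

```python
# Sketch: combining P(A|B) and P(A|C) into an approximation of P(A|B,C) with the
# permanence-of-ratios assumption (standard two-source form; the paper combines
# several data sources in this spirit).
def permanence_of_ratios(p_a, p_a_given_b, p_a_given_c):
    a = (1 - p_a) / p_a                  # prior odds against category A
    b = (1 - p_a_given_b) / p_a_given_b  # odds against A given data source B
    c = (1 - p_a_given_c) / p_a_given_c  # odds against A given data source C
    x = b * c / a                        # assumed odds against A given both B and C
    return 1.0 / (1.0 + x)

# Example: two sources that each moderately favour category A reinforce each other.
print(round(permanence_of_ratios(0.5, 0.7, 0.8), 3))  # -> 0.903
```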


The quantitative meaning of resistivity data in a coastal setting: a Belgian case study

Authors: Marieke Paepen (UGent), Wouter Deleersnyder (UGent), Kristine Walraevens (UGent) and Thomas Hermans (UGent)

Type: C3 Conference

Full text

Abstract: In coastal areas, the natural groundwater flow is affected by human activities, such as managed aquifer recharge (MAR) and groundwater extraction. These can induce saltwater intrusion and impact the fresh submarine groundwater discharge (FSGD). Resistivity methods, such as electrical resistivity tomography (ERT) and continuous resistivity profiling (CRP), are easy to use and very effective to assess the distribution of salt and fresh water in coastal environments. The western Belgian coast, at De Panne and Koksijde, was already investigated with ERT and CRP by Paepen et al. (2022; 2020). In this area, the source of FSGD is a sandy dune ridge around 2.5 km wide. Now, we compare the FSGD footprint in front of De Panne and Koksijde to other Belgian coastal sites (Raversijde, Wenduine, Knokke-Heist, and Zwin), which have a different structure of the phreatic aquifer and a much smaller dune belt.

The quantitative interpretation of ERT and CRP is not straightforward, but image appraisal tools – such as the model resolution matrix (R), the cumulative sensitivity matrix (S), and the depth of investigation index (DOI) – can aid (Caterina et al., 2013). To quantitatively assess the resistivity inversion models, five synthetic models were created (Paepen et al., 2022). These models reflect the present situation of the western Belgian coast, where we find freshwater outflow on the lower beach or below the low water line. Based on the inversion models of the synthetic cases, the model resolution matrix, the cumulative sensitivity matrix, and the DOI (Oldenburg & Li, 1999) were calculated. The image appraisal tools were then compared to the error on the salinity for each cell in the inversion model (we consider an error below 0.05 acceptable). This allows us to define a threshold for the different image appraisal tools for which the model can be quantitatively assessed and to apply these thresholds to the field data. The thresholds reveal that no quantitative interpretation is possible for the zones of FSGD and that the FSGD resistivity is underestimated by the inversion process, so the salinity of the outflow is overestimated. Nevertheless, lateral qualitative changes can be deduced from the inversion models.
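Of the appraisal tools cited, the DOI index is the easiest to reproduce; below is a minimal sketch of the Oldenburg & Li (1999) definition, computed cell-wise from two inversions run with different reference models (the example values are made up).

```python
import numpy as np

# Sketch: depth of investigation (DOI) index of Oldenburg & Li (1999), computed
# cell-wise from two inversions of the same data with different (homogeneous)
# reference models. Values near 0 mean the cell is constrained by the data;
# values approaching 1 mean it is controlled by the reference model.
def doi_index(model_1, model_2, ref_1, ref_2):
    """model_1, model_2: inverted log10-resistivity arrays; ref_1, ref_2: the scalar
    log10-resistivity reference values used in each inversion."""
    return (np.asarray(model_1) - np.asarray(model_2)) / (ref_1 - ref_2)

# Example: a shallow, well-resolved cell vs. a deep, poorly resolved one.
print(doi_index([1.02, 1.85], [0.98, 1.15], ref_1=2.0, ref_2=1.0))  # ~[0.04, 0.70]
```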

All publications