Computers & Geosciences

Volume 50, January 2013, Pages 4-15
Hierarchical benchmark case study for history matching, uncertainty quantification and reservoir characterisation

https://doi.org/10.1016/j.cageo.2012.09.011

Abstract

Benchmark problems have been generated to test a number of issues related to predicting reservoir behaviour (e.g. Floris et al., 2001, Christie and Blunt, 2001, Peters et al., 2010). However, such cases are usually focused on a particular aspect of the reservoir model (e.g. upscaling, property distribution, history matching, uncertainty prediction, etc.), while the other decisions in constructing the model are fixed: the log values that relate to the distribution of cell properties away from the wells, the grids and structural features, and the fluid properties. This is because all these features require an element of interpretation from indirect measurements of the reservoir, noisy and incomplete data, and judgements based on domain knowledge.

Therefore, there is a need for a case study that would consider interpretational uncertainty integrated throughout the reservoir modelling workflow. In this benchmark study we require the modeller to make interpretational choices as well as to select the techniques applied to the case study, namely the geomodelling approach, history matching algorithm and/or uncertainty quantification technique. The interpretational choices will be around the following areas:

  • (1) Top structure interpretation from seismic and well picks.

  • (2) Fault location, dimensions and uncertainty in the connectivity of the fault network.

  • (3) Facies modelling approach.

  • (4) Facies interpretation from well-log cutoffs.

  • (5) Petrophysical property prediction from the available well data.

  • (6) Grid resolution: the trade-off between the number of iterations and the model resolution needed to capture the reservoir features adequately.

The semi-synthetic study is based on real field data: production data, seismic sections from which to interpret the faults and top structure, wireline logs from which to identify facies correlations and the saturation profile, porosity and permeability data, and a host of other data. To make this problem usable in a manageable time period, multiple hierarchically related gridded models were produced for a range of different interpretational choices.

Highlights

  • We present a case study that requires the inclusion of interpretational uncertainties.

  • The solution space is described by 81 different available realisations.

  • The true model is not one of the provided scenarios.

  • We attempted to match the reservoir using two simple approaches, with some success.

  • The field is a challenge to match and will require approaches that explore the full solution space.

Introduction

Uncertainty is an issue we need to deal with in every reservoir due to the sparse and/or low resolution data available and our limited knowledge about the reservoir. Creating a simulation model to forecast petroleum reservoir production requires the modeller to make a number of choices about how to construct a representative model. These choices range from which numerical value to assign to a reservoir property based on measured data, to interpretational choices such as the depositional model.

Typically a geomodeller would produce many hundreds of realisations of the reservoir to assess the uncertainty in a pre-production field. Critically, these models attempt to cover the uncertainties in the reservoir through a range of possible numerical input parameters that produce variations in key reservoir features. Where we can miss out uncertainty is in the interpretational elements of modelling that are difficult to describe using numerical parameters, such as the choice of which faults are present, the picking of the top structure, the depositional model, cut-off selection and permeability prediction models. Different possible interpretations of the reservoir, for instance, would require different conceptual models for the facies distributions and different modelling approaches.

The modelling choices made are related to how complex a model of the real system we choose to create. The complexity issue is well known as the bias-variance trade-off, where over-complex models result in inflation of the uncertainty (variance) while over-simplistic ones result in bias. A meta-model that subsumes specific candidate models would lead to considerable inflation of variance; therefore there is a need to find ways of mixing models that are consistent with what we believe is plausible (Christie et al., 2011). On the other hand, if the models in the ensemble come from the same code (are related), the variance may well be understated (Challenor and Tokmakian, in Christie et al., 2011). Therefore, it is important to include bias terms in the model corresponding to the effects not accounted for by the model.
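As a reminder, the trade-off rests on the standard statistical decomposition of expected squared prediction error (a textbook identity, not specific to reservoir modelling):

$$\mathbb{E}\big[(\hat{y}-y)^2\big] \;=\; \underbrace{\big(\mathbb{E}[\hat{y}]-y\big)^2}_{\text{bias}^2} \;+\; \underbrace{\mathbb{E}\big[(\hat{y}-\mathbb{E}[\hat{y}])^2\big]}_{\text{variance}} \;+\; \sigma^2_{\text{noise}}$$

Over-complex models drive the variance term up; over-simplistic ones drive up the squared bias; the noise term is irreducible.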

Once it comes to choosing between different modelling algorithms and multiple interpretations of data and knowledge, uncertainty quantification inevitably becomes subject to that choice, which is consistent with the subjective nature of uncertainty (Caers, 2011). From a probabilistic point of view, uncertainty can be seen as a "correct" posterior PDF, which is unknown and needs to be sampled (Oliver et al., 2008).

Furthermore, uncertainty can be seen as a relative term, subject to the measurement scale and the precision with which we define the things we want to measure or estimate. Uncertainty analysis based on evaluating the covariance matrix of the posterior probability is difficult for complex problems where the posterior PDF is itself complex. A Bayesian approach implies there is one "true" posterior to characterise the uncertainty, defined as the product of the prior probability and the likelihood. The determination of the unique posterior PDF of a multi-scenario model can become quite difficult. A Bayesian approach joins the probabilities by means of a "strong" conjunction. Relaxing the way this conjunction is performed would possibly provide a wider evaluation of uncertainty. There are non-Bayesian approaches to combining prior and evidence, e.g. by means of conjunction (Tarantola, 2005). There have been attempts to use the Dempster–Shafer generalisation of the Bayesian theory of subjective probability to measure uncertainty in reservoir geological properties (Kim, 2002). Another approach was proposed in the perception-based theory of probabilistic reasoning, which uses fuzzy logic to describe perceptions and subjective probability (Zadeh, 2002).

The issue of multiple scenarios is compounded when we move to post-production fields and want to add production data to the uncertainty quantification process. Here the model is history matched to the measured well production data, the quality of fit defines the likelihood of the model, and this is now typically done using a range of automated approaches (Floris et al., 2001). Uncertainty in automated history matching approaches is usually described by inference from an ensemble of multiple models that come from a prior range of the model parameters or state vectors, updated at each iteration. There are several difficulties associated with this. The uncertainty in the model definitions, components and complexity is not easy to take into account. For instance, if there are uncertainties about the structure, it can become difficult to re-grid the model to new structural configurations automatically.
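The structure of such an automated loop can be sketched minimally as follows; `simulator` and `prior_sampler` are hypothetical placeholders for a flow-simulation run and a prior parameter sampler, not part of any specific tool:

```python
import numpy as np

def misfit(sim, obs, sigma):
    """Least-squares misfit between simulated and observed production data."""
    return 0.5 * np.sum(((sim - obs) / sigma) ** 2)

def history_match(simulator, prior_sampler, obs, sigma, n_iter=100):
    """Generic stochastic history-matching loop (illustrative sketch).

    simulator(m) -> simulated production for parameter vector m;
    prior_sampler() -> draw a parameter vector from the prior range.
    """
    ensemble = []
    for _ in range(n_iter):
        m = prior_sampler()
        ensemble.append((m, misfit(simulator(m), obs, sigma)))
    # under Gaussian errors, likelihood is proportional to exp(-misfit),
    # so lower misfit means a better history match
    return sorted(ensemble, key=lambda t: t[1])
```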

Distinguishing between different geological interpretations also becomes an issue. Data assimilation approaches, such as ensemble Kalman filters (Evensen et al., 2007), can potentially handle interpretational uncertainty within the range of the initial ensemble of realisations; however, the consistency of the converged ensemble with geologically realistic distributions may be an issue.
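For concreteness, the analysis step of a basic stochastic EnKF (the textbook formulation with perturbed observations, not the specific scheme of Evensen et al., 2007) can be sketched as:

```python
import numpy as np

def enkf_update(X, H, d, R, rng=np.random.default_rng()):
    """One EnKF analysis step (illustrative sketch).

    X: (n_state, n_ens) forecast ensemble; H: (n_obs, n_state) observation
    operator; d: (n_obs,) observed data; R: (n_obs, n_obs) obs error covariance.
    """
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
    C = A @ A.T / (n_ens - 1)                      # sample forecast covariance
    K = C @ H.T @ np.linalg.inv(H @ C @ H.T + R)   # Kalman gain
    # perturb the observations so the updated ensemble keeps the right spread
    D = d[:, None] + rng.multivariate_normal(np.zeros(len(d)), R, n_ens).T
    return X + K @ (D - H @ X)                     # analysis ensemble
```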

Particularly tricky aspects of the uncertainty to account for in history matching include the top structure, layering, faults (numbers, dimensions, displacements and locations) and other structural features. As a result we tend to keep these features fixed, preferring instead to change elements that are easy to parameterise in the simulation model. This means that significant uncertainties in the gross rock volume (top structure), net-to-gross (layering) and partitioning (faults) of the reservoir are typically not accounted for by the uncertainty quantification process. These features are developed principally from seismic data and well correlations, so they are based on the interpretation of geologists/petrophysicists and are (a) subject to uncertainties in depth conversion and (b) subject to uncertainties due to the interpretation process, where the error cannot easily be quantified as we do not typically produce many independent seismic interpretations.

A number of studies have considered the uncertainty in seismic interpretation (e.g. Bond et al., 2007, Rankey and Mitchell, 2003) and show the potential for significant variation in the interpretation of the same data due to the background of the interpreter (Bond et al., 2007) and overconfidence in the quality of the interpretation (Rankey and Mitchell, 2003). In Rankey and Mitchell (2003), six interpreters were given the same seismic data from a carbonate reservoir. They all believed that because the seismic data was apparently easy to interpret, their subsequent interpretations were very accurate, with only a small error in the location of the reservoir top. In fact, while most of the reservoir top was accurately identified by all the geologists studied, the edges of the reservoir were less well defined. A comparison of the six interpretations showed that the portions of the reservoir that were less well defined added considerable variation to the volumetric estimates, even though the other parts of the field were the same for all interpretations. There is therefore a need to develop techniques that account for both the uncertainties in measured reservoir inputs that are easy to handle, such as relative permeability uncertainties, and the interpretational uncertainties in the reservoir.

A further complication in accounting for the uncertainty is the grid resolution of the model. Most previous history matching case studies provide a single grid resolution model to match to real field production data, production data from a truth case model of the same resolution, or a higher resolution model. The resolution of the grid and the accuracy of the gridding approach are important in history matching because the total run time available for the process is a limiting factor. In most cases we can either run many low resolution models that have a higher solution error in their predictions, or fewer high resolution models with less error that potentially fail to identify the best history matched models due to insufficient iterations.
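To make the trade-off concrete, a back-of-envelope calculation with purely illustrative run times (the numbers below are assumptions, not taken from the case study) shows how the affordable number of simulations shrinks with resolution:

```python
# Illustrative compute-budget arithmetic: hours per run are assumed values.
budget_hours = 500.0
run_time_hours = {"coarse": 0.25, "medium": 1.0, "fine": 6.0}
for grid, t in run_time_hours.items():
    print(f"{grid:>6}: ~{int(budget_hours / t)} simulation runs in budget")
```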

Solution errors are the difference between the exact mathematical solution and the numerical solution used to represent it in the simulation model. Any assumptions and simplifications in the mathematical model, errors in rounding numbers by the simulator, or numerical errors due to the grid resolution all contribute to the solution errors. A solution error model was implemented to account for upscaling errors in O'Sullivan and Christie (2005) and Christie et al. (2006).
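One generic way to fold a solution error model into the likelihood (a common formulation consistent in spirit with the cited work, not reproduced from it) is to treat the coarse-model error as a random term with estimated mean and covariance:

$$L(m) \propto \exp\!\Big(-\tfrac{1}{2}\, r(m)^{\mathsf T}\,(C_D + C_e)^{-1}\, r(m)\Big), \qquad r(m) = d_{\mathrm{obs}} - g_{\mathrm{coarse}}(m) - \mu_e$$

where $C_D$ is the measurement error covariance and $\mu_e$, $C_e$ are the mean and covariance of the solution error, estimated for example from paired coarse and fine simulations.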

There are also uncertainties associated with missing physics in the model, which may therefore have an impact on the match to the observations. This so-called model inadequacy, once accounted for with a random statistical process, has been shown to improve quantification of the overall uncertainty through more consistent inference from the generated model responses.

In this paper we describe a new case study, called the Watt Field (after James Watt, for whom Heriot-Watt University is partly named), which we have developed to consider both the impact of how you choose to estimate the uncertainty and what model you choose to do this with. The benchmark study addresses mostly the issues associated with an early stage of a brown field development. Previous case studies have concentrated on the choice of uncertainty quantification method (PUNQ-S3; Floris et al., 2001) or facies/petrophysics modelling (such as the Brugge case study; Peters et al., 2010). For models like PUNQ-S3, many parameterisations were developed by different authors (Floris et al., 2001, Demyanov et al., 2004, Hajizadeh et al., 2010) to account for the model uncertainty within a given geological scenario. The Stanford VI case study provided a good example of a realistic synthetic full-scale model with different depositional environments and associated seismic (Castro et al., 2005). The Brugge case study gave the modeller a choice from 104 prior realisations of the geology along with high resolution log data and a choice of petrophysical correlations. SPE10 (Christie and Blunt, 2001) is a widely used model for upscaling studies that deals with the problems of grid resolution and solution errors. As yet no study has combined these choices with other uncertainties that need to be accounted for, such as the shape of the top structure and the fault network uncertainty.

This paper proposes a new case study that includes the kinds of interpretational uncertainties present in a real reservoir as well as other uncertainties. Specifically, we have produced a number of realisations of the model based on different top structure interpretations, fault models, facies modelling/depositional environment choices and relative permeability/capillary pressure uncertainties, for a number of grid resolution choices. Our intention is to encourage researchers to look into accounting for uncertainties in their interpretations of the reservoir and to develop techniques that account for these uncertainties.

Section snippets

Uncertainty quantification of producing fields

Extensive work has been carried out on developing techniques for uncertainty quantification in petroleum reservoirs. Most common approaches develop a Bayesian methodology, as it allows the engineer to update an initial estimate of uncertainty to a new posterior estimate using production data. Bayes' theorem describes the relationship between the posterior and the prior:

$$p(m|O) = \frac{p(O|m)\,p(m)}{\int p(O|m)\,p(m)\,dm}$$

where p(m|O) is the posterior probability of the model, an updated estimate of the reservoir
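In practice the integral in the denominator is approximated over a finite ensemble of models. A minimal sketch of that discrete normalisation (a generic importance-weighting step under a uniform prior over the ensemble, not the specific NA-Bayes algorithm used later in the paper) is:

```python
import numpy as np

def posterior_weights(neg_log_likelihoods):
    """Normalised posterior weights for an ensemble of history-matched models.

    neg_log_likelihoods[i] is the misfit (negative log-likelihood) of member i;
    the prior is assumed uniform over the ensemble members.
    """
    logw = -np.asarray(neg_log_likelihoods, dtype=float)
    logw -= logw.max()          # subtract the max for numerical stability
    w = np.exp(logw)
    return w / w.sum()          # discrete analogue of dividing by the evidence
```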

Overview

The Watt field is a synthetic study based on a mixture of real field and synthetic data to describe a realistic field example seen through appraisal into the early development life stage. The top structure and wireline data are based on real field data; however, the fluid properties, relative permeability and capillary pressure data are synthetic. The field development plan is also synthetic, resulting in an artificial production response. The model spans a 12.5 km by 2.5 km surface area, elongated in the

Test of case study

From the 81 pre-defined scenarios (see Appendix/download for all possible scenarios) created for this case study we chose two at random to attempt history matching. We history matched a set of models with different parameterisations using the RAVEN assisted history matching software (www.Epistemy.com). A single set of history match results was produced for each case study using the in-built Particle Swarm Optimisation (PSO). NA-Bayes was applied to the ensemble of models to create estimates
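For readers unfamiliar with PSO, the following is a minimal generic sketch of the algorithm; it is not RAVEN's implementation, and `objective` stands in for a hypothetical wrapper that runs a simulation and returns the history-match misfit:

```python
import numpy as np

def pso(objective, lo, hi, n_particles=20, n_iter=50,
        w=0.7, c1=1.4, c2=1.4, rng=np.random.default_rng(0)):
    """Minimal particle swarm optimisation for misfit minimisation.

    lo, hi: arrays bounding the prior range of each parameter.
    """
    dim = len(lo)
    x = rng.uniform(lo, hi, (n_particles, dim))       # particle positions
    v = np.zeros_like(x)                              # particle velocities
    pbest = x.copy()                                  # personal bests
    pbest_f = np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()            # global best
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                    # stay inside prior range
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()
```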

Conclusions

The Watt Field case study was designed to test the influence of different parameterisations, structural models, interpretations and model building methods on history matching and uncertainty quantification. The scenarios developed were designed to cover key uncertainties in the reservoir model and require the modeller to make decisions about appropriate model selection. There are many choices to make in this benchmark problem and many ways in which the reservoir properties can be

Acknowledgements

We would like to thank the JIP sponsors of the Uncertainty Project for their support in funding the ongoing work on uncertainty at Heriot-Watt, Schlumberger and Epistemy for providing the Petrel, Eclipse and Raven software, and the Heriot-Watt IPE MSc courses and their data sponsors for providing some of the data used in this field. We greatly appreciate the review comments from Jef Caers and the other reviewer, which helped to improve the paper and tie in ideas.

References

  • Zadeh, L.A., 2002. Toward a perception-based theory of probabilistic reasoning with imprecise probabilities. Journal of Statistical Planning and Inference.
  • Arnold, D., 2009. Geological Parameterisation of Petroleum Reservoir Models for Improved Uncertainty Quantification....
  • Bond, C.E., et al., 2007. What do you think this is? Conceptual uncertainty in geoscience interpretation. GSA Today.
  • Caers, J., 2011. Modeling Uncertainty in the Earth Sciences.
  • Castro, S.A., Caers, J., Mukerji, T., 2005. The Stanford VI Reservoir. 18th Annual Report. Stanford Center for...
  • Challenor, P., Tokmakian, R., 2011. Modelling future climates. In: Christie, M., Cliffe, A., Dawid, P., Senn, S. (Eds.),...
  • Christie, M.A., et al., 2001. Tenth SPE comparative solution project: a comparison of upscaling techniques. SPE Reservoir Engineering and Evaluation.
  • Christie, M.A., et al., 2005. Error analysis and simulations of complex phenomena. Los Alamos Science.
  • Christie, M., Demyanov, V., Erbas, D., 2006. Uncertainty quantification for porous media flows. Journal of...
  • Demyanov, V., Christie, M., Subbey, S., 2004. Neighbourhood Algorithm with Geostatistical Simulations for Uncertainty...
  • Erbas, D., Christie, M., 2006. How does Sampling Strategy Affect Uncertainty Estimations? Oil & Gas Science and...
  • Erbas, D., et al., 2007. Effect of sampling strategies on prediction uncertainty estimation. SPE.
  • Evensen, G., Hove, J., Meisingset, H.C., Reiso, E., Seim, K.S., Espelid, O., 2007. Using the EnKF for Assisted History...
  • Floris, F.J.T., et al., 2001. Methods for quantifying the uncertainty of production forecasts: a comparative study. Petroleum Geoscience.
  • Hajizadeh, Y., Christie, M., Demyanov, V., 2009. Application of Differential Evolution as a New Method for Automatic...
  • Hajizadeh, Y., Christie, M., Demyanov, V., 2010. Comparative Study of Novel Population-Based Optimization Algorithms for...
  • Kim, Ch.-S., 2002. New Uncertainty Measures for Predicted Geological Properties from Seismic Attribute Calibration. In...
  • Liu, N., et al., 2001. Assessment of uncertainty assessment methods. SPE.
  • Mariethoz, G., Renard, P., Caers, J., 2010. Bayesian inverse problem and optimization with iterative spatial...