As budgets and margins remain tight, making optimal field development decisions throughout the entire reservoir life cycle has never been more important. Development choices will affect final production, and it is therefore crucial to ensure that any decision is properly supported by the right information.

Unlocking and propagating uncertainties

It’s crucial that any reservoir characterization and modeling process realistically represents the underlying seismic data. Furthermore, any model that oversimplifies geological complexities is not going to deliver the vital information operators require today. The incorporation of uncertainty as early as possible in the reservoir characterization process is therefore a must-have.

However, uncertainties are too often neglected. Overlooking the uncertainty inherent in reservoirs’ static and dynamic behaviors may lead to incomplete or misleading interpretations of the available data.

Moreover, substantial time and capital are spent on history matching to improve predictions and the field development strategy. Because of reservoir uncertainties, however, large numbers of parameters are modified during this process, and these repeated modifications of model properties can leave prediction models inconsistent with the actual reservoir geology.

It’s with these critical aspects in mind that Emerson has developed the Big Loop workflow. A cornerstone of Emerson’s reservoir characterization and modeling workflow, Big Loop tightly integrates static and dynamic domains and offers the propagation of uncertainties from seismic characterization through to geological modeling and simulation.

This means that reservoir uncertainties are captured and varied as input parameters, creating an ensemble of realistic reservoir models that all feed into the reservoir simulator. This leads to a better understanding of the reservoir geometry, more robust reserves estimations and better informed decisions for future field development scenarios.
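The idea of varying uncertain inputs to build an ensemble can be illustrated with a simple Monte Carlo volumetric calculation. The parameters, ranges and ensemble size below are purely illustrative and not tied to any particular product or field:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200  # ensemble size (illustrative)

# Hypothetical uncertain inputs, each sampled once per ensemble member
grv = rng.normal(5.0e8, 0.5e8, n)   # gross rock volume, m^3
ntg = rng.uniform(0.6, 0.9, n)      # net-to-gross ratio
phi = rng.normal(0.22, 0.03, n)     # porosity
sw = rng.uniform(0.2, 0.4, n)       # water saturation
bo = 1.2                            # formation volume factor (fixed here)

# One volumetric estimate per ensemble member
stoiip = grv * ntg * phi * (1.0 - sw) / bo

# Industry convention: P10 is the high case, P90 the low case
p10, p50, p90 = np.percentile(stoiip, [90, 50, 10])
```

Each of the `n` members would then feed the reservoir simulator, so the spread in `stoiip` (and later in production response) directly reflects the spread in the inputs.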

Capturing uncertainties while interpreting

Part of this workflow encompasses Emerson’s model-driven interpretation (MDI) capabilities, where uncertainty is captured during the seismic interpretation process. MDI handles multiple instances of the same model, captures the limitations of the data and quantifies the geological risk associated with the seismic.

Offshore the Middle East, an operator used the MDI approach to quantify gross rock volume uncertainty. The reservoir in question was in the appraisal/early development stage and had nine wells unevenly distributed across the field, not all of which had penetrated the bottom of the reservoir. The result was improved gross rock volume uncertainty, valuable input into field appraisal and development plans, and reduced risk.

The latest version of Emerson’s reservoir modeling software further bridges the gap between seismic interpretation and geological modeling with a “snap-to-seismic” feature for better interpretation reliability and quality control. With a click of a button, the algorithm conditions the model to the seismic data via a waveform similarity metric, giving users the ability to track the characteristics of a seismic event across the domain of interest.
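The principle behind such a feature can be sketched with normalized cross-correlation: a horizon pick is shifted to the sample on a neighboring trace whose local waveform best matches the reference pick. This is a generic stand-in for a waveform similarity metric, not Emerson’s actual algorithm:

```python
import numpy as np

def snap_pick(ref_trace, ref_t, trace, search=5, win=7):
    """Shift a horizon pick on `trace` to the sample whose local
    waveform best matches the reference pick, using normalized
    cross-correlation as an illustrative similarity metric."""
    half = win // 2
    ref = ref_trace[ref_t - half: ref_t + half + 1]
    ref = (ref - ref.mean()) / (ref.std() + 1e-12)
    best_t, best_s = ref_t, -np.inf
    for t in range(ref_t - search, ref_t + search + 1):
        seg = trace[t - half: t + half + 1]
        seg = (seg - seg.mean()) / (seg.std() + 1e-12)
        s = float(np.dot(ref, seg)) / win  # similarity in [-1, 1]
        if s > best_s:
            best_s, best_t = s, t
    return best_t, best_s

# Synthetic check: the same event shifted by two samples
t = np.arange(120)
base = np.sin(t / 3.0)
picked, score = snap_pick(base, 50, np.roll(base, 2))
# picked == 52: the pick snaps to the shifted event
```

In a production tool the same matching would run across every trace in the domain of interest, which is what makes tracking a seismic event’s character tractable at scale.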

The capturing of uncertainties as early as possible in the workflow is crucial, but these uncertainties also need to be propagated along the workflow and transferred to dynamic modeling. Although various approaches are available, they very often require an early ranking of models, either static or dynamic. However, anchoring to one or a few selected models can lead to disappointing results.

Reliable forecasts that honor geology

It’s for this reason that Emerson has developed ensemble-based history-matching and forecasting capabilities that provide operators with comprehensive uncertainty quantification and management.

In this way, uncertainties can be added in the domain where they belong (e.g., the static model) and propagated to where they matter (e.g., production). Forecasts are based on an ensemble of reservoir models, enabling decisions to be made under uncertainty and risks to be managed and mitigated comprehensively.
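The core of an ensemble-based history match can be sketched with a textbook ensemble-smoother update on a toy linear model. This is a generic illustration of conditioning an ensemble to observed data, not Emerson’s implementation, and the forward model and noise levels are invented for the example:

```python
import numpy as np

def ensemble_smoother_update(M, D, d_obs, Cd, rng):
    """One ensemble-smoother update: condition a prior parameter
    ensemble M (n_param x n_ens) on observations d_obs via the
    cross-covariance with the simulated responses D (n_obs x n_ens)."""
    n_ens = M.shape[1]
    Ma = M - M.mean(axis=1, keepdims=True)   # parameter anomalies
    Da = D - D.mean(axis=1, keepdims=True)   # response anomalies
    Cmd = Ma @ Da.T / (n_ens - 1)            # parameter-data covariance
    Cdd = Da @ Da.T / (n_ens - 1)            # data-data covariance
    K = Cmd @ np.linalg.inv(Cdd + Cd)        # Kalman-like gain
    # Perturb observations so posterior spread is not understated
    pert = rng.multivariate_normal(np.zeros(len(d_obs)), Cd, n_ens).T
    return M + K @ (d_obs[:, None] + pert - D)

rng = np.random.default_rng(1)
G = np.array([[1.0], [0.5]])               # toy linear forward model
m_true = np.array([[2.0]])                 # "true" reservoir parameter
M_prior = rng.normal(0.0, 1.0, (1, 100))   # prior ensemble
D = G @ M_prior                            # simulated responses
Cd = 0.01 * np.eye(2)                      # observation-error covariance
d_obs = (G @ m_true).ravel()
M_post = ensemble_smoother_update(M_prior, D, d_obs, Cd, rng)
```

Because the whole ensemble is updated at once, every posterior member remains a perturbation of a geologically generated prior member, which is the sense in which the history match stays consistent with the geology.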

Moreover, once the workflow has been set up, Big Loop can be run automatically to produce as many realizations and simulation runs as needed. The process outcomes can then be used to understand the sensitivity and interplay of the many parameters involved.
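One simple way to read sensitivities out of such a batch of runs is to rank-correlate each uncertain input with the response across all realizations. The inputs, ranges and toy response below are hypothetical stand-ins for real workflow outputs:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # number of realizations (illustrative)

# Hypothetical uncertain inputs sampled per realization
inputs = {
    "ntg": rng.uniform(0.6, 0.9, n),          # net-to-gross
    "phi": rng.normal(0.22, 0.03, n),         # porosity
    "owc_depth": rng.uniform(2490, 2510, n),  # oil-water contact, m
}
# Toy response standing in for a simulated volume or production total
response = (inputs["ntg"] * inputs["phi"] * 1e9
            + (inputs["owc_depth"] - 2490) * 1e5)

def rank_corr(x, y):
    """Spearman-style rank correlation via double argsort."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return float(np.corrcoef(rx, ry)[0, 1])

sens = {k: rank_corr(v, response) for k, v in inputs.items()}
```

Sorting `sens` by absolute value gives a tornado-style ranking, showing which uncertainties actually drive the outcome and which can safely be frozen.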

On a geologically complex field in the North Sea, the Big Loop workflow delivered geological and dynamic reservoir models that remained fully synchronized throughout the lifetime of the field. Propagating uncertainties through the workflow also produced geologically consistent history-matched ensembles that identified the main influences on the initial oil in place in the model, leading to optimized production and improved decision-making.

Ensemble-based forecasts enable users to evaluate multiple prediction scenarios based on models comprehensively encompassing uncertainties and thus remaining fully consistent with the underlying geology. (Source: Emerson Automation Solutions)

An integrated seismic-to-simulation workflow

One of the main criticisms of reservoir characterization and modeling workflows in the past has been their fragmented nature and the limited collaboration between asset teams. The process tended to be siloed across domains, with vital data often overlooked, model updates difficult to achieve and domain-specific working practices taking precedence over reservoir characterization goals.

Such an approach is no longer sustainable if a company is to get the best out of both its reservoir model and its skilled staff.

Today’s best practices promote strong links between seismic interpretation, geological modeling and reservoir simulation via data integration, shared tools and asset team collaboration.

Closely linked to integration is software interoperability: it is vital that critical information is preserved when data are transferred between software packages. The latest version of Emerson’s reservoir modeling software, RMS, can transfer data from the Petrel software platform in a one-step procedure, adding ease of use on top of the interoperability already available.

Another tool aiding in integration and flexibility is the Roxar application programming interface (API). Using Python, a powerful but simple programming language, the API enables operators to integrate their own intellectual property into reservoir characterization and modeling workflows.

Applications also can be written or extended to access RMS project data, and the API can be used to build customized standalone programs. This opens a range of new possibilities for subsurface workflows, enabling users to include company-specific goals in their proprietary workflows and develop proprietary, commercial or open solutions.
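The extensibility pattern such an API enables can be sketched with a minimal plain-Python stand-in. The classes and method names below are hypothetical mocks, not the actual Roxar API; they only illustrate how a company-specific helper might be layered on top of project data:

```python
# Hypothetical stand-in for scripting against modeling-project data.
# These names mimic the pattern only; they are NOT the Roxar API.
class Horizon:
    def __init__(self, name, depths):
        self.name = name
        self.depths = depths  # interpreted depth values, m TVD

class Project:
    def __init__(self, horizons):
        self.horizons = {h.name: h for h in horizons}

def mean_depth(project, horizon_name):
    """A company-specific metric a user might plug into a workflow."""
    d = project.horizons[horizon_name].depths
    return sum(d) / len(d)

proj = Project([Horizon("TopReservoir", [2501.0, 2499.5, 2502.5])])
print(mean_depth(proj, "TopReservoir"))  # → 2501.0
```

The value of the real API is that helpers like `mean_depth` run against live project data inside the workflow, so proprietary calculations update automatically whenever the ensemble is regenerated.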

Making informed decisions

When it comes to a successful reservoir characterization workflow, encompassing all the uncertainties linked to data and modeling processes in a comprehensive and reliable way is vital, especially when dealing with data from multiple sources, scales and possible interpretations.

Managing uncertainty from seismic to production and ensuring integration and flexibility at the heart of decision-making will go a long way toward unlocking the value of reservoir assets now and into the future.