By bringing together the traditionally separate workflows of seismic interpretation and reservoir modeling, operators are creating a powerful process that spans the reservoir lifecycle from exploration through to production.
Too often in the past, reservoir modeling has been seen as something “nice to have”: a process of refining the structural and stratigraphic models and conducting “what if” scenarios before reservoir simulation.
|Figure 1. A six-step workflow spans the reservoir lifecycle from exploration through to production. (Image courtesy of Roxar)
And reservoir engineers have sometimes only had themselves to blame, relying on manipulated and interpreted data, often from indirect sources, and building a bias into their models toward the production end. Too often, no meat (high-quality data) was on the bones (modeling analysis tools) of reservoir modeling.
The times are changing, however.
Reservoir modeling is taking control of the data within its modeling framework and is becoming the central, critical defining workflow within the exploration and production (E&P) industry. Reservoir modeling today is a mainstream discipline that gives equal priority and focus to geological, geophysical and production data.
The bottom line for operators is a better understanding of all aspects of the data and how those data shape their view of the reservoir. The results are improved decision-making and increased returns on the human and asset capital invested in the E&P process.
A six-step process
Modeling has a simple definition, one that all too often people fail to understand. It is “the process of describing a system, process or phenomenon that accounts for its known or inferred properties, which in turn can be used for simulating and predicting results.”
Roxar and Geomodeling believe that six steps define today’s mainstream reservoir modeling workflow: Find, Define, Simulate, Predict, Evaluate and Decide.
From an exploration starting point right through to production, these steps have turned reservoir modeling from a stand-alone discipline into an expanded, integrated and seamless workflow that covers seismic interpretation, geological interpretation, reservoir characterization and reservoir modeling.
The results are more effective simulation, prediction and evaluation of results, from which the key decisions are made.
Find and define — going seismic
Reservoir modeling is only as good as the data and assumptions that go into it, which is why Find and Define are so important in reservoir modeling.
The Find step represents exploration, where new techniques are used to discover reservoirs with the basic structural and stratigraphic framework present at the outset. The Define step focuses on defining the reservoir through methods such as seismic reservoir characterization, including spectral decomposition, facies classification, cross-plotting and strata-grid analysis.
||Figure 2. Composite display demonstrating how different a seismic dataset can appear under different presentations: multiattribute display (relative acoustic impedance, seismic, semblance); spectral decomposition cubes (70 Hz, 20 Hz, 50 Hz, 30 Hz); logs and tops along the well paths. (Data from Stratton Seismic Cube; Figures 2 and 3 courtesy of Geomodeling)
Both steps are concerned with extrapolating geological and geophysical data from observed values, which are then used in the later modeling phases.
Although the introduction of 3-D seismic into reservoir models started some years ago, only the last few years have seen the development of a series of rigorous seismic interpretation and analysis techniques, which now form an integral part of modeling and, in particular, the Define step.
Seismic Interpretation – Spectral Decomposition. Spectral decomposition is a highly effective seismic interpretation and analysis process based on the concept that a thin bed reflection has a unique spectral response in the frequency domain.
It enables the detection of thin beds and other geological features such as subtle faults that are beyond the capabilities of traditional seismic analysis. This is achieved by breaking down the seismic signal into a range of frequencies. There is no single method for spectral decomposition; rather, it is a series of techniques that have different frequency resolutions.
By generating amplitude and phase maps tuned to specific frequencies, the resulting spectral amplitude maps help estimate bed thicknesses, and the phase maps define lateral stratigraphic discontinuities.
By mapping areas where anomalous frequency signals are identified, operators can detect the presence of hydrocarbon fluids. Comparisons of spectral decomposition results to well data can also greatly improve the prediction of bed thicknesses and hydrocarbon-bearing zones away from wells to help determine the best well placement areas.
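The tuning effect that makes spectral decomposition work can be sketched numerically. The example below is a hypothetical illustration, not any vendor's implementation: a thin bed is modeled as two opposite-polarity reflections, convolved with a Ricker wavelet, and its amplitude spectrum shows that thinner beds shift spectral energy toward higher frequencies.

```python
import numpy as np

def ricker(f, dt, n):
    """Zero-phase Ricker wavelet with peak frequency f (Hz)."""
    t = (np.arange(n) - n // 2) * dt
    a = (np.pi * f * t) ** 2
    return (1 - 2 * a) * np.exp(-a)

def thin_bed_trace(thickness_s, dt=0.001, n=512, f_wavelet=30.0):
    """Synthetic trace for a bed of given two-way time thickness (s):
    two opposite-polarity reflection spikes spanning the bed."""
    r = np.zeros(n)
    top = n // 2
    r[top] = 1.0
    r[top + int(round(thickness_s / dt))] = -1.0
    return np.convolve(r, ricker(f_wavelet, dt, 128), mode="same")

def peak_frequency(trace, dt=0.001):
    """Frequency (Hz) at which the amplitude spectrum peaks."""
    spec = np.abs(np.fft.rfft(trace))
    return np.fft.rfftfreq(len(trace), d=dt)[np.argmax(spec)]

# A thinner bed tunes to a higher peak frequency than a thicker one,
# which is why frequency-tuned amplitude maps track bed thickness.
f_thin = peak_frequency(thin_bed_trace(0.008))   # 8 ms bed
f_thick = peak_frequency(thin_bed_trace(0.020))  # 20 ms bed
```

Comparing such peak-frequency (or single-frequency amplitude) maps against well control is how bed thickness is then calibrated away from the wells.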
Seismic Interpretation – Facies Classification. Seismic facies analysis and classification are essential tools for multi-attribute analysis and interpretation, transforming the initial seismic data into understandable geological information.
They can be applied to horizon, interval, volume, strata-grid and trace displays to help improve fluid prediction and accuracy, and they are particularly applicable to property modeling.
|Figure 3. A 40 Hz spectral decomposition display for inline 87 (left) and the amplitude spectrum (10–80 Hz) for trace 180 (right). Circled areas indicate higher amplitudes along the inline or within the frequency range. The colormap below indicates the amplitude range. (Data from Stratton Seismic Cube)
Horizon attributes such as phase, semblance and curvature, and volume attributes such as waveform difference and spectral decomposition, can all be entered into the facies classification maps to enhance stratigraphic detail. Through cross-plotting, the classification maps can then be used to correlate from logs to seismic, from seismic to attributes and with other combinations.
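As a toy illustration of multi-attribute facies classification (not the specific algorithm of any commercial package), an unsupervised clustering of per-cell attribute vectors can separate facies with distinct seismic responses. All attribute values below are synthetic assumptions.

```python
import numpy as np

def kmeans(X, k, n_iter=25):
    """Minimal k-means; returns a facies class label per sample."""
    # Deterministic seeding with evenly spaced samples (adequate for a toy demo).
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)]
    for _ in range(n_iter):
        # Assign each sample to its nearest center, then recenter each cluster.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Synthetic attribute vector per map cell:
# [amplitude, semblance, 40 Hz spectral amplitude]
rng = np.random.default_rng(1)
sand = rng.normal([0.8, 0.9, 0.7], 0.05, size=(50, 3))   # hypothetical channel-sand response
shale = rng.normal([0.2, 0.4, 0.1], 0.05, size=(50, 3))  # hypothetical background shale
labels = kmeans(np.vstack([sand, shale]), k=2)
```

The resulting class map plays the role of the facies classification map described above: cells with similar attribute responses fall into the same class, enhancing stratigraphic detail.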
Seismic Interpretation – Strata-Grid Analysis and Cross-Plotting. Finally, there are strata-grid analysis and cross-plotting — both key interpretation and analysis tools.
Strata-grid analysis is a highly effective tool for building a stratigraphic model of the reservoir from seismic attribute volumes and horizons. It brings together the different attribute analysis workflows. Principal component analysis, facies classification and waveform correlation analysis can all be applied to the stratigraphic grids to detect subtle stratigraphic prospects and to fine-tune the reservoir model.
Cross-plotting enables professionals to correlate seismic attribute, well and facies data with the geology of a reservoir. Analysis can be conducted in all dimensions and at all levels — from horizon to grid to volume. Cross-plotting multiple seismic attributes greatly improves well correlation and can reveal hydrocarbon indicators that are not visible in conventional log or seismic analysis.
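The correlation step behind a cross-plot can be sketched as a least-squares trend through attribute-versus-log pairs. The impedance and porosity values below are invented for illustration and are not from the Stratton data.

```python
import numpy as np

# Hypothetical well-tie data: acoustic impedance extracted from the seismic
# at six well locations, against porosity measured in the logs.
impedance = np.array([9.8, 8.4, 7.1, 6.5, 5.9, 5.2])   # illustrative units
porosity  = np.array([0.08, 0.12, 0.17, 0.20, 0.24, 0.28])

# Cross-plot statistics: correlation coefficient and a least-squares
# trend line, which can then predict porosity away from the wells.
r = np.corrcoef(impedance, porosity)[0, 1]
slope, intercept = np.polyfit(impedance, porosity, 1)

def predict_porosity(imp):
    """Porosity predicted from an impedance value via the fitted trend."""
    return slope * imp + intercept
```

A strong negative correlation here would justify using the impedance volume as a porosity predictor between wells; a weak one would flag the attribute as unreliable for that reservoir.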
Ignore the Geoscience at Your Peril. Many examples of past reservoir models have ignored the Find and Define steps, with negative impacts on production.
In the North Sea, for example, reserves have increased or decreased by more than 50% (up to 200%) in more than 40% of the fields in the 10 years following field development plan approval.
Many of these fields were based on predominantly deterministic reservoir models, often with tenuous links to the geology. The negative outcome for the operator was flawed assessments of financial rewards or losses and a costly adjustment of well numbers and facilities.
Simulate, predict, evaluate and decide
Whereas Find and Define are classic steps in seismic interpretation and reservoir characterization, the Simulate, Predict and Evaluate steps are where the geoscience merges with the production data. This is a crucial interface in today’s reservoir modeling.
Historically, production modeling tended to be well-log based and focused predominantly on faults and horizons. When seismic is introduced earlier in the process, however, users are dealing with a richer and more powerful model.
The benefits of this can be seen in each of the final four steps: in the Simulate step, for example, multiple geological scenarios can be simulated to understand how the sedimentary structures impact fluid flow, and in the Predict step, property mapping allows users to map and predict geological horizons and faults in depth or time.
In the Evaluate step, the different hypotheses and individual predictions can be evaluated. Facies models, and an understanding of the influence of facies distribution on fluid flow, come to the fore.
Finally comes the Decide step. Operators use the information provided in the first five steps to make those crucial reservoir management decisions, whether they are bid valuations, new field development and operational plans, or production estimates and divestments.
Linking to production
Once the seismic stage is complete, the Simulate, Predict and Evaluate steps align closely with production. From structural modeling and property mapping to log management and well correlation through to well planning and fault seal analysis, 3-D modeling brings uncertainty management to the entire reservoir workflow.
Uncertainties in depth conversion, structural modeling, geological property modeling and dynamic reservoir simulation all need to be simultaneously evaluated, ensuring that the full impact of these often-independent uncertainties is captured through realistic 3-D static and dynamic reservoir models.
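One standard way to evaluate such independent uncertainties simultaneously is Monte Carlo sampling: draw many joint realizations of the uncertain inputs and summarize the resulting distribution. The sketch below applies this to a simple volumetric calculation; every distribution and parameter value is an illustrative assumption, not field data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of joint realizations

# Hypothetical, independently sampled input uncertainties:
grv = rng.normal(500e6, 50e6, n)   # gross rock volume, m^3
ntg = rng.uniform(0.5, 0.8, n)     # net-to-gross ratio
phi = rng.normal(0.22, 0.03, n)    # porosity
sw  = rng.uniform(0.2, 0.4, n)     # water saturation
bo  = rng.normal(1.2, 0.05, n)     # formation volume factor

# In-place oil volume for every realization.
stoiip = grv * ntg * phi * (1 - sw) / bo

# Industry convention: P90 is the low case (exceeded 90% of the time),
# P10 the high case.
p90, p50, p10 = np.percentile(stoiip, [10, 50, 90])
```

Because every input is sampled jointly, the spread between P90 and P10 reflects the combined effect of the individual uncertainties rather than any one of them in isolation.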
Because the models are rooted in the geoscience and the data built up at the initial seismic stage, it becomes far more accurate and effective to integrate all available data, including seismic, well log and other geological data, and to quantify structural and reservoir property uncertainties.
Closing the loop
Today, Geomodeling’s seismic interpretation software, VisualVoxAt, and Roxar’s reservoir modeling workflow, IRAP RMS, are combining to take the reservoir modeling process from seismic visualization, mapping and correlation right through to geological modeling and ending with reservoir simulation and well planning.
And, whereas current modeling technologies only offer “what if” types of analyses (allowing for multiple realizations of what would happen if a well were choked back, for example), companies such as Geomodeling and Roxar are looking to close the loop through integrated reservoir and production management (iRPm).
iRPm is essentially the back end of the loop, tying production operations into the modeling process through the complete integration of modeling and simulation with real-time data from the field.
Sourced from real-time reservoir pressure, temperature and flow-rate monitoring instruments, these data are used to rapidly update the reservoir model, to change production profiles quickly and cost-effectively without the need for intervention, and to ensure the most up-to-date information from the field is available when forecasting future reservoir performance and making operational decisions.
Hence, both the geoscience and production data are incorporated into the reservoir model.
Reservoir modeling goes mainstream
Reservoir modeling today is on the cusp of an exciting new era, having the potential to become the single most important workflow within the upstream E&P industry.
By aligning the geological and geophysical data with the production data and integrating the seismic and modeling workflows, reservoir modeling is going mainstream — a solution that can take the operator from exploration right through to production.
Finally, meat is on the bones of reservoir modeling.
Editor’s Note: In June 2007, Roxar announced a multiyear agreement with Geomodeling for the resale of its seismic interpretation software, VisualVoxAt.