The oil industry is on the brink of the “perfect storm.” Easy oil is gone, and in its place we’re left with harder-to-find reservoirs that are structurally and geologically more complex and therefore more difficult to understand. Even existing fields must be accurately characterized to enable the extraction of bypassed hydrocarbons. Data is being collected at the fastest rate in history, while we lack the resources (software and people) to properly analyze it or add it to the models we completed years ago. At the same time, the exploration and production (E&P) industry is awash in interpretation and modeling applications that are difficult to use, poorly integrated, and fall short of providing satisfactory solutions for ever-deeper and more complex reservoirs. Add to this equation the shortage of qualified talent and the ever-decreasing time allocated to support reservoir management decisions, and the “perfect storm” emerges.

Figure 1. A new approach enables a highly faulted reservoir model such as this one to be constructed in less than an hour. (Image courtesy of Paradigm)

How do we survive?
We must start with the premise that the ultimate integration between the geosciences and reservoir engineering is based on the concept of a shared earth model. The central piece (at the reservoir scale) is the reservoir model resulting from the integration of geophysical (interpretation and attributes), petrophysical (logs), geological (conceptual, sedimentological), and engineering (production) data. It will be used to understand and predict reservoir behavior. A current drawback is that constructing such a model can be very time-consuming. It often must be done by the few “modelers” — modeling software experts with multidisciplinary knowledge — for whom there is unfortunately no obvious training or career track in either industry or university. The cost in time of training geoscientists and engineers during the course of a project is often too large to support the effort. As a result, we need tools that enable anyone to construct accurate reservoir models for fast-turnaround decisions.

On the technical side, current state-of-the-art reservoir modeling technology revolves around the construction of a 3-D “corner-point” geometry stratigraphic grid. This grid is used both by geologists for geostatistical property modeling and volumetric analysis and by reservoir engineers for flow simulation and reserve assessment. Its construction is dictated by the so-called pillar or extrusion approach, which works well in simple, vertically faulted, layer-cake topologies. However, when even a small amount of structural complexity (e.g., multi-z, reverse or y-faults) is added, currently available modeling applications create severe distortion of cells near faults. Very often the reservoir grid simply cannot be constructed. This problem forces users to “fudge” their geological model. Common work-arounds include simplifying or removing the offending interpreted fault data so that the modeling application can accommodate what remains. The application can then proceed, but the resulting model is simplified, often beyond recognition.
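To make the pillar idea concrete, here is a minimal, hypothetical 2-D sketch (an x-z cross-section, not any vendor’s implementation): every cell corner lies on a straight “pillar” coordinate line, and layers are stacked along it. When the pillar is tilted to follow a fault of a given dip, the whole column of cells is sheared laterally; the function name and geometry below are illustrative assumptions.

```python
import numpy as np

def pillar_corners(x0, dip_deg, z_tops):
    """Corner points along one pillar tilted to follow a fault of given dip.

    A vertical pillar has a dip of 90 degrees; shallower dips shear the
    entire column of cells sideways, distorting every cell attached to it.
    """
    dip = np.radians(dip_deg)
    dx_per_dz = 1.0 / np.tan(dip)  # lateral shift per unit of depth
    return [(x0 + dx_per_dz * z, z) for z in z_tops]

z_tops = [0.0, 10.0, 20.0]  # layer boundaries intersected by the pillar
for dip in (90, 60, 30):
    corners = pillar_corners(0.0, dip, z_tops)
    lateral_shift = corners[-1][0] - corners[0][0]
    print(f"fault dip {dip:3d} deg: lateral shift over column = {lateral_shift:.1f}")
```

The sketch also shows why some geometries are simply unreachable: a reverse fault requires a horizon to be crossed twice at the same map location, which a single-valued pillar like this one cannot represent at all.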

Furthermore, deciding which faults to remove or “verticalize” can be extremely time-consuming. As a result, the reservoir model no longer accurately reflects the interpreted underlying geology or the data that supports it. Fault blocks are not adequately represented, which impacts the compartmentalization and connectivity of the reservoir model. The geological-distance and constant-volume assumptions of geostatistical algorithms are violated by the deformation of the cells as soon as faults are not sub-vertical. Therefore, the correlations imposed on the facies, porosity and permeability models will be wrong. With a bad model going into the reservoir flow simulator, history-matching results can only be erroneous and definitely not predictive.
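The distance violation can be illustrated with a small, hypothetical sketch. Geostatistical algorithms typically measure separation in grid index space, implicitly assuming cells of constant size and shape; when a column of cells is sheared against a non-vertical fault, the true distance between cell centers grows while the index distance stays the same, biasing any correlation model built from it. The function and numbers below are illustrative only.

```python
import math

def true_distance(shear, dz, dx=0.0):
    """Euclidean distance between two vertically adjacent cell centers
    when the column is sheared laterally by `shear` per unit of depth."""
    return math.hypot(dx + shear * dz, dz)

dz = 10.0  # vertical spacing between cell centers
for shear in (0.0, 0.5, 1.0):
    d = true_distance(shear, dz)
    # Index space still reports a separation of 10.0 in every case.
    print(f"shear {shear}: index distance 10.0, true distance {d:.1f}")
```

At a shear of 1.0 the true separation is roughly 41% larger than the index distance, so a variogram evaluated in index space would treat genuinely distant samples as near neighbors.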

The large amount of money that was spent collecting and interpreting the data has now been marginalized by the “fudging” of the model. Data that was difficult to incorporate was ignored because of the time needed to include it in the model. Updating the model with new information or changes in interpretation was a long process: we had to call the consulting modeling expert back and start the model building over again. So we didn’t do that very often either. Thinking through alternative interpretation, geological, or production scenarios to understand the associated uncertainty and manage our risks was out of the question (who has the time?).

And we wonder why the production forecasts were wrong?

Riding the perfect storm

There is a new approach. Instead of calling on the expert, we feel any geoscientist or engineer should be able to construct models without being overwhelmed by the idiosyncrasies of the modeling software. They need to focus on the science, not the buttons. Users should be guided through mostly automated processes, focusing at each step only on decisions that require geological and engineering judgments.

Instead of removing or simplifying data, we encourage the use of all available interpretation data (even when there are hundreds of faults). We also reject the idea of “fudging” the answer to make model building faster or easier. Using all the information available ensures confidence in the accuracy of the model and the answers extracted from it and provides better support for decision-making.

Instead of using the same grid format for both geostatistical property modeling and reservoir flow simulation, we use one adapted to each case, honoring all available data as well as the constraints of each discipline. The “geological grid” is no longer a victim of the pillar nightmare, and geological distances are respected. The “simulation grid” satisfies that discipline’s requirements yet accounts for the total complexity of the structural geology. These two grids are intrinsically linked, ensuring an accurate upscaling of reservoir properties.

Instead of spending days to months constructing a single reservoir model, new technology developed by Paradigm makes it possible for projects to be completed in hours. Recent benchmarks found reductions in modeling time by factors of 50 to 100, using every piece of data available and no “fudging.”

With modeling so readily available to all geoscience domains, the earth model can now become a true “live document” that engineers and geoscientists can together share, edit, refine, or develop into several alternatives.

Conclusion
Three-dimensional reservoir modeling must become a commodity. It is a foundational piece of the shared earth model and should underpin any reservoir management decision. It cannot remain the reserved playground of a select few, and its construction must not be cumbersome or constrained by software limitations.
The modeling revolution has arrived.