Whether a company is trying to get more oil out of the ground by exploring for new reservoirs or improving production from existing ones, the challenge remains the same. Operators must build the most accurate models possible by integrating disparate data from a few widely spaced sources. Through a combination of skill, experience, technology and more than a little luck, the oil industry has fueled the engines of progress for almost 150 years. And every day the challenge becomes more difficult.

Read between the lines

Knowing what’s in the wells is not the problem. The challenge is knowing what’s between the wells. Early explorationists relied on “close-ology,” the notion that the best place to look for oil is next to where it has already been found. Many giant fields of the past were developed that way, and seismic helped by delineating the structures beneath the surface so the most logical appraisal and development plans could be designed. Seismic was an exploration tool of the first order.

Today seismic has evolved from an imaging tool into a comprehensive reservoir characterization instrument. It allows operators to find oil in parts of the reservoir they didn’t know existed, or in zones that were not flushed properly by previous production. Technology has been the enabling force behind this renaissance. Better seismic acquisition techniques, higher resolution data, and faster and more accurate processing are taking their place beside better rock physics models to derive estimates of porosity and saturation in unprecedented detail between the wells. We can do things today we could never do before, and it’s paying off.

Figure 1. Dynamic seismic reservoir characterization workflow. (Figures courtesy of Schlumberger)

The evolution

Because seismic is the only truly three-dimensional subsurface dataset, it was the logical choice as the baseline on which to build the whole earth model, enhancing interdisciplinary communication. We’re seeing reservoir engineers use the models to predict drainage patterns and design injection strategies, and drilling and production engineers use them to help plan well trajectories or populate reservoir models.

The new reservoir modeling tools, like Petrel seismic-to-simulation software, enable different disciplines to derive benefit from seismic data and obtain a synergistic result. When people of different disciplines start building common models that respect both the geomechanics and the seismic data, the final model is more accurate and more useful than either individual input.

For example, a well might be producing 100 b/d of oil, and following a stimulation treatment, it flows 200 b/d. In the past, the engineer might go home happy, brimming with self-congratulation for doubling production. Now, with quality modeling, the engineer may realize that the actual potential of that well is 1,000 b/d, and can more confidently pursue further production gains. We are capturing the experience of the experts in our workflows and using it to our advantage.

Seeking ground truth

Scientists are continually seeking “ground truth,” and what that is depends on their backgrounds. For geophysicists it’s seismic, for petrophysicists it’s logs and for reservoir engineers it’s the information they derive from history-matching. But in the inter-well space, seismic is the only ground truth that exists today. Modern, practical workflows allow a wide segment of the technical community to use the information that seismic provides without getting bogged down in terminology.

With more accurate inter-well data, projections from micro-scale measurements like well logs can be made with greater confidence. Observations from macro-scale measurements such as pressure-transient analyses can be interpreted with a higher degree of accuracy, so the dynamic behavior of the reservoir can be understood and predicted. When the analyst understands what the structure looks like, it becomes much clearer what the pressure transient is indicating.

In challenging environments like the Barnett Shale in North and West Texas, for example, it helps us locate the natural fracture patterns where the gas will be most productive. The technology, called “ant tracking,” is an automated fracture-extraction technique, inspired by swarm intelligence, that maps the fracture network from seismic discontinuity attributes. Later, when the wells are hydraulically fractured, microseismic monitoring is used to track the propagation of the fractures and ensure maximum reservoir contact is achieved. A simple stand-in for this kind of attribute-driven mapping is sketched below.
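
Ant tracking itself is a proprietary Petrel algorithm, so the sketch below substitutes a much simpler idea that captures the spirit: compute a local-variance discontinuity attribute on a synthetic amplitude slice and threshold it to flag fault or fracture lineaments. Everything here, from the window size to the synthetic data, is an illustrative assumption rather than the actual method.

import numpy as np

def local_variance(slice2d, half_window=1):
    """Variance of amplitudes in a (2w+1) x (2w+1) neighbourhood;
    sharp lateral amplitude changes light up as high variance."""
    padded = np.pad(slice2d, half_window, mode="edge")
    out = np.zeros_like(slice2d, dtype=float)
    n = 2 * half_window + 1
    for i in range(slice2d.shape[0]):
        for j in range(slice2d.shape[1]):
            out[i, j] = padded[i:i + n, j:j + n].var()
    return out

# Synthetic amplitude slice with an abrupt lateral step between
# columns 9 and 10, mimicking a fault/fracture lineament.
rng = np.random.default_rng(0)
amplitude = rng.normal(0.0, 0.1, size=(20, 20))
amplitude[:, 10:] += 1.0

disc = local_variance(amplitude)
lineaments = disc > np.percentile(disc, 90)   # keep the sharpest 10%
print("Flagged columns:", sorted({int(j) for j in np.where(lineaments)[1]}))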

In practice

New workflows are opening up opportunities for seismic to be a bigger contributor to understanding the dynamic reservoir. The workflow comprises two main phases (Figure 1). The first involves seismic data conditioning to prepare the seismic for inversion; the second uses well conditioning to prepare the well data to be integrated with the seismic data. Both come together in a four-stage iterative process: wavelet extraction (the fundamental tie between well and seismic), inversion (the inversion engine itself), rock physics (the physical relationship between seismic and reservoir properties) and, finally, reservoir modeling, which builds the required reservoir properties (Figure 2). A toy version of that loop is sketched below.

Figure 2. Seismic reservoir modeling is a sophisticated, seismically consistent, geostatistically rigorous means of generating high-frequency reservoir models.
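
To make the four stages concrete, the toy below runs a single synthetic trace through an assumed Ricker wavelet, a textbook recursive impedance inversion and a hypothetical linear rock-physics transform. Every number and function here is a simplification chosen for brevity, not the commercial workflow the article describes.

import numpy as np

def ricker(freq_hz, dt_s, length_s=0.128):
    """Stage 1 stand-in: an assumed wavelet tying wells to seismic."""
    t = np.arange(-length_s / 2, length_s / 2, dt_s)
    a = (np.pi * freq_hz * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def refl_from_ai(ai):
    """Normal-incidence reflection coefficients from acoustic impedance."""
    return (ai[1:] - ai[:-1]) / (ai[1:] + ai[:-1])

def invert_recursive(refl, ai0):
    """Stage 2 stand-in: recursive inversion, AI[i+1] = AI[i](1+r)/(1-r)."""
    ai = [ai0]
    for r in refl:
        ai.append(ai[-1] * (1.0 + r) / (1.0 - r))
    return np.array(ai)

def porosity_from_ai(ai, a=0.55, b=-6.7e-8):
    """Stage 3 stand-in: a hypothetical linear rock-physics calibration."""
    return np.clip(a + b * ai, 0.0, 0.35)

# Synthetic earth: shale over a porous sand over a tight carbonate.
ai_true = np.concatenate([np.full(80, 6.0e6),   # impedance in kg/m^2/s
                          np.full(40, 4.5e6),
                          np.full(80, 7.0e6)])
wavelet = ricker(30.0, 0.002)
trace = np.convolve(refl_from_ai(ai_true), wavelet, mode="same")

ai_est = invert_recursive(refl_from_ai(ai_true), ai_true[0])    # stage 2
phi = porosity_from_ai(ai_est)                                  # stage 3

# Stage 4 QC: a synthetic from the estimated model should tie the trace.
syn = np.convolve(refl_from_ai(ai_est), wavelet, mode="same")
print(f"Well-tie correlation: {np.corrcoef(trace, syn)[0, 1]:.2f}")
print(f"Mean sand porosity:   {phi[80:120].mean():.2f}")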

A recent example where seismic properties were successfully translated into reservoir properties is found in the proximity of Pemex’s Arenque and Lobina fields. A marine carbonate play, Lobina field is a relatively new discovery, first identified in 2003 near the prolific Arenque field, which has been producing since the late 1960s.

Many wells had been drilled, and excellent log data existed, but the inter-well gaps in the data remained a mystery. An acoustic impedance-to-porosity relationship was used to calibrate the seismic data across a broad area, creating a porosity map from the newly acquired seismic. A porosity-height attribute was then derived and used to prioritize future drilling locations. Initially the process was used to choose between two candidate locations; the highest-ranked location was drilled and tested at 2,000 b/d of oil, opening up a significant step-out location between the two fields. The implications proved far more valuable still, as the resulting porosity map revealed new potential reservoir bodies in an undrilled area of the field (Figure 3). A sketch of this calibration step appears below.

Figure 3. A porosity map of the Arenque/Lobina field allowed Pemex engineers to choose the high-potential Merluza location over the Caviar prospect. More importantly, it revealed a large area of high-potential prospects (red ellipse) that had never been drilled.
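
As a hedged illustration of that step: fit a linear impedance-to-porosity relationship at the wells, apply it between the wells, then vertically integrate a porosity-height (phi-h) map for ranking locations. The well values, coefficients and grid below are invented for the sketch; the actual Arenque/Lobina calibration is not reproduced here.

import numpy as np

# Hypothetical well control: inverted acoustic impedance (kg/m^2/s)
# against log-derived porosity at five wells.
ai_wells = np.array([7.2e6, 6.8e6, 6.1e6, 5.4e6, 4.9e6])
phi_wells = np.array([0.04, 0.07, 0.12, 0.17, 0.21])

# Calibrate: least-squares fit of phi = a + b * AI at the wells.
b, a = np.polyfit(ai_wells, phi_wells, 1)

# Apply between the wells: a tiny 2 x 3 areal grid with four vertical
# cells per column, each cell 5 m thick.
ai_volume = np.full((2, 3, 4), 6.5e6)
ai_volume[0, 0, :2] = 4.8e6                  # a porous streak in one column
phi_volume = np.clip(a + b * ai_volume, 0.0, None)

# Porosity-height (phi-h, in metres): vertical sum of porosity x thickness,
# a common attribute for ranking drilling locations.
cell_thickness_m = 5.0
phi_h = (phi_volume * cell_thickness_m).sum(axis=2)
print(np.round(phi_h, 2))                    # highest phi-h ranks first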

The future

Seismic, one of the first successful exploration tools, has come full circle and is demonstrating its intrinsic value in helping us keep pace with growing energy demand. Recent advances in seismic reservoir characterization technology have translated into real value, not just in the exploration phase but throughout the life of the field. As other members of the asset team become more comfortable with this technology, we will start to see new applications of seismic within the workflows of the geologist, the drilling engineer, the production engineer and the reservoir engineer. As this happens, the value derived from seismic data will “snowball” and lead to exciting new developments in the science of seismic reservoir characterization.