Amplitude correction derived from two acquisition geometries. The 4-D section is contaminated by false-positive signals if this is not considered. (Figures courtesy of Geotrace)

Time-lapse (4-D) seismic technologies are proving their worth as reservoir management tools through all stages of an oil field's life cycle. Combining 4-D data with other traditional data reveals more about the reservoir; using both types of technology provides a comprehensive picture and reduces risk.

While 4-D seismic technology has its advantages, it also presents a unique set of challenges.

Non-repeatability

To improve repeatability, the causes of non-repeatability must be evaluated and ways to eliminate them assessed. The most common causes include acquisition footprint problems such as the use of different equipment, varied source-receiver geometry, and inconsistent or incorrect positioning, as well as environmental conditions such as tides and water temperatures for marine surveys, weathered zones for land surveys, and noise, whether coherent or random.

There are several solutions to the challenges of achieving repeatability, including tidal statics using oceanographic tide charts, proper true amplitude migrations with acquisition compensation weights (Figure 1) and 4-D-friendly signal-to-noise ratio (SNR) enhancement techniques including morphing (Figure 2).
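To illustrate the tidal statics idea, the chart-derived tide height at each shot time can be converted to a two-way traveltime correction. The function name and the constant 1,500 m/sec water velocity below are assumptions for this sketch, not details from the article:

```python
def tidal_static_ms(tide_height_m, water_velocity=1500.0):
    """Two-way traveltime shift (ms) caused by a change in water
    column height, assuming a constant water velocity in m/s."""
    return 2.0 * tide_height_m / water_velocity * 1000.0

# A 0.75-m tide difference between base and monitor surveys
# maps to a 1.0-ms two-way static shift.
print(tidal_static_ms(0.75))
```

Removing these statics before differencing the surveys prevents tide-induced time shifts from masquerading as 4-D signal.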

High-frequency imaging

Another challenge of 4-D technology is obtaining high-frequency images to reflect and quantify time-lapse effects. This is particularly significant with continuous evaluation where repeat surveys are obtained annually. Several things must be considered, including acquisition standardization, global cross-equalization techniques and equalization quality control (QC).

Under acquisition standardization, it’s important to consider the following:

Amplitudes. Surface-consistent scaling within each survey is achieved using a general-purpose scaling technique that smoothes inter-ensemble amplitude variations. Typically, the following compensations are made:

  • Source variability in multisource acquisition;
  • channel and shot variations within a line; and
  • crossline variations.

In this technique, ensembles are scaled according to the average level of the data in the ensemble's vicinity. The root mean square (RMS) level of each ensemble is computed over a user-specifiable time gate and compared to a running average to derive a normalizing scalar. Traces exhibiting anomalously high or low amplitudes may be excluded from the ensemble RMS calculations by taking trimmed means.
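The procedure above can be sketched as follows. The trim fraction, window length and function names are illustrative choices, not parameters from the article:

```python
import numpy as np

def ensemble_scalars(ensembles, trim=0.1, window=5):
    """One multiplicative scalar per ensemble that moves its trimmed-mean
    RMS level toward a running average over neighboring ensembles."""
    levels = []
    for ens in ensembles:
        trace_rms = np.sqrt(np.mean(ens ** 2, axis=1))       # RMS per trace over the gate
        lo, hi = np.quantile(trace_rms, [trim, 1.0 - trim])  # trimmed-mean bounds
        kept = trace_rms[(trace_rms >= lo) & (trace_rms <= hi)]
        levels.append(kept.mean())
    levels = np.asarray(levels)
    kernel = np.ones(window) / window
    running = np.convolve(levels, kernel, mode="same")        # local average level
    norm = np.convolve(np.ones_like(levels), kernel, mode="same")
    return (running / norm) / levels                          # scalar per ensemble
```

Multiplying each ensemble by its scalar smooths inter-ensemble amplitude variations while leaving trace-to-trace relative amplitudes within the ensemble untouched.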

Source wavelet. Convolutional filters are designed to shape the far-field signatures of both surveys to a zero phase spectrum. If a suitable far-field is not available, normalization of field filter responses is performed.

Cable depth. Differences in cable depth between the two surveys (base and monitor) shift the locations of the ghost notches in the frequency spectrum. The discrepancy is corrected with a convolutional filter designed from smoothed, modeled notches.
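For vertical incidence, the receiver-ghost notch frequencies depend only on cable depth and water velocity, which is what makes the base/monitor discrepancy predictable. A minimal sketch (the function name and 1,500 m/sec velocity are assumptions):

```python
def ghost_notch_frequencies(cable_depth_m, n_notches=3, water_velocity=1500.0):
    """Non-zero notch frequencies (Hz) of the receiver ghost for a towed
    cable at the given depth; vertical incidence assumed."""
    return [n * water_velocity / (2.0 * cable_depth_m)
            for n in range(1, n_notches + 1)]

# Base cable at 8 m vs. monitor at 10 m:
print(ghost_notch_frequencies(8.0))   # [93.75, 187.5, 281.25]
print(ghost_notch_frequencies(10.0))  # [75.0, 150.0, 225.0]
```

The correction filter would then be designed from smoothed, modeled notch spectra at these two sets of frequencies.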

Global cross-equalization techniques include:

Frequency equalization. A match filter is designed to normalize the monitor survey's effective estimated wavelet to that of the base survey, using amplitude spectra from a selected time gate averaged over identical common midpoint (CMP) locations near to, but excluding, the reservoir location. Because the two surveys may contain local residual timing or positional differences, the averaging should be sufficient to remove any spectral components due to geology.
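One common way to realize such a match filter is stabilized spectral division of the gate-averaged amplitude spectra. In this sketch the function names and the stabilization constant are assumptions; the per-frequency gain shapes the monitor spectrum toward the base spectrum:

```python
import numpy as np

def frequency_match_gain(base_gate, monitor_gate, eps=1e-3):
    """Amplitude-only match filter: per-frequency gain shaping the monitor
    survey's average spectrum toward the base survey's. Inputs are 2-D
    arrays of traces from the selected time gate."""
    base_spec = np.abs(np.fft.rfft(base_gate, axis=-1)).mean(axis=0)
    mon_spec = np.abs(np.fft.rfft(monitor_gate, axis=-1)).mean(axis=0)
    return base_spec / (mon_spec + eps * mon_spec.max())  # stabilized division

def apply_gain(trace, gain):
    """Apply the per-frequency gain to a single monitor trace."""
    return np.fft.irfft(np.fft.rfft(trace) * gain, n=trace.size)
```

Averaging spectra over many CMPs before dividing is what suppresses geology-driven spectral detail, leaving primarily the wavelet difference in the gain.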

Phase equalization. A phase match filter is designed using the spatial average of cross-correlations between the base and monitor surveys over a selected time gate from traces at identical CMP locations near to, but excluding, the reservoir location. The cross-correlation analysis can yield either explicit phase corrections for each frequency or an effective time shift and average phase rotation.
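The effective time-shift part of this analysis can be sketched with a stacked cross-correlation. The trace arrays, the 4-ms sample interval and the function name are illustrative assumptions:

```python
import numpy as np

def bulk_time_shift(base, monitor, dt=0.004):
    """Average time shift (s) of the monitor survey relative to the base,
    from the peak of cross-correlations stacked over matching traces."""
    stacked = sum(np.correlate(m, b, mode="full")
                  for b, m in zip(base, monitor))
    lag = int(np.argmax(stacked)) - (base.shape[1] - 1)  # zero lag at center
    return lag * dt
```

A positive result means monitor events arrive later than base events; an average phase rotation could be estimated along similar lines from the cross-correlation and its Hilbert transform.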

Amplitude equalization. A global scalar is designed to normalize the monitor survey amplitudes to that of the base survey using average RMS amplitudes over selected time gates from traces at identical CMP locations near to, but excluding, the reservoir location. Alternatively, amplitude equalization may be achieved during frequency equalization if absolute frequency values are honored.
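In its simplest form the global scalar is a single RMS ratio over the selected gates (the function name here is an assumption):

```python
import numpy as np

def global_amplitude_scalar(base_gate, monitor_gate):
    """One scalar normalizing monitor RMS amplitude to base RMS amplitude
    over the selected time gates."""
    def rms(d):
        return float(np.sqrt(np.mean(np.asarray(d, dtype=float) ** 2)))
    return rms(base_gate) / rms(monitor_gate)
```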

Residual mis-positioning (registration) analysis. A 3-D cross-correlation function between base and monitor surveys is calculated, then averaged over a selected time window at multiple locations throughout the survey. Peak values are interpolated and shifts determined to sub-sample accuracy together with measures of statistical validity. Analysis results are displayed as onscreen maps. Multiple regression analyses are performed on the shifts to generate final shifts and the optional rotation angle.
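Sub-sample refinement of a correlation peak is commonly done with a three-point parabolic fit around the integer maximum; a minimal sketch (the function name is assumed):

```python
def subsample_peak(c, i):
    """Refine integer peak index i of correlation values c to sub-sample
    precision by fitting a parabola through c[i-1], c[i], c[i+1]."""
    y0, y1, y2 = c[i - 1], c[i], c[i + 1]
    denom = y0 - 2.0 * y1 + y2
    if denom == 0.0:          # flat triple: no refinement possible
        return float(i)
    return i + 0.5 * (y0 - y2) / denom
```

Applied to the inline and crossline lags of the 3-D cross-correlation peak, this yields the fractional-bin shifts that feed the regression for final shifts and rotation.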

4-D morphing

The 4-D morphing shown in Figure 2 currently provides the most dependable way to compute 4-D signals. Once repeatability has been achieved, 4-D morphing separates amplitude changes from travel-time changes far more precisely than older cross-correlation-based techniques. These 4-D amplitude and time-shift attributes are the basis for quantitative analysis of reservoir properties and pressure as well as compaction, geomechanics and well stability.

What do petroleum engineers want?

There are a number of relevant technologies to assist petroleum engineers in their quest for optimizing reservoir assets. The opportunities are there for:

  • 4-D HFI processing to detect potential flow barriers and pathways (micro-faulting, baffle zones, fracture, anisotropic orientation, etc.);
  • High-resolution (HR) prestack elastic inversion to describe reservoir static and dynamic rock and fluid properties;
  • High-density, high-resolution (HDHR) pore pressure prediction to understand the pressure changes of reservoirs and their depletion process (Figure 5);
  • Geomechanical study to evaluate the production effect in terms of compaction and dilation of reservoirs and surrounding rock formations (above, below and on both sides) since the influence can be far-reaching in distance from the well bore; and
  • Production and cash flow management through the reservoir model, reservoir simulation and history matching to better understand the interaction between rock and fluid properties, injection location and volume, and production volumes of different fluids.

The challenge for geophysicists is to produce consistent results in a timely manner that eliminate changes not caused by production effects.

First, engineers need to know from the seismic analysis what and where the flow units are at the start of production and where possible permeability barriers might exist.

Second, they need to know what has been swept (depleted reservoir) and what remains unswept (virgin reservoir), and they must be aware of any fluid phase changes.

The engineer's simulation tools must be predictive for reservoir management and cash flow forecasting. Geophysics, when successful, generally describes the state of the reservoir at the time of each survey acquisition.

In most cases, the value of geophysically derived dynamic reservoir information decays with time from the moment of acquisition. Being able to evaluate those data quickly is vital for proper field management. The reservoir is a highly dynamic environment, and engineers must manage it rapidly to keep profitability optimized.

Conclusion

All indications are that 4-D techniques are becoming established and will eventually be routine for many companies with large volumes of offshore or onshore assets. Many of these companies are already short of oil reserves or soon will be. Enhancing recovery with time-lapse technology has helped to increase reservoir recovery factors significantly, and many major oil company decisions aided by 4-D input have already paid handsome dividends.