Time-lapse (4-D) seismic data has become an integral component of the production strategy for many hydrocarbon reservoirs around the world, and reservoir engineers have come to depend on the information it provides when making critical reservoir decisions. Results from new monitor surveys are frequently required soon after acquisition, but the processing of time-lapse seismic data can often be time-consuming, causing delays that reduce its value. Permanent time-lapse monitoring systems can deliver results in short timeframes, but not all fields can justify the financial outlay required for such systems. Advancements in acquisition technology, combined with careful attention to workflows, are enabling the routine delivery of time-lapse seismic results from marine towed-streamer surveys in timeframes similar to those achieved by permanent systems.

Turnaround challenges

Seismic datasets contain inherent variations between surveys that can obscure the desired time-lapse signal. Compensating for this variability while preserving genuine changes between datasets can be time-consuming, and data processing turnarounds of five or six months after acquisition completion are commonplace. For time-lapse results to be routinely available when needed, acquisition and processing workflows need to address some key issues:

  • Acquisition variability must be minimized;
  • Variability that the acquisition system cannot control should be measured;
  • Acquisition efficiency must be maximized; and
  • Data processing efficiency must be maximized.

Minimizing acquisition variability

Compensating for variations between seismic datasets is typically the most time-consuming and potentially inaccurate component of a time-lapse seismic processing flow, so it is highly beneficial to minimize variations during acquisition. Permanently emplaced recording systems have low variability and can be the best solution when short monitor survey repeat intervals are required; however, for most applications, towed-streamer systems are more cost-effective for time-lapse monitoring. Several components of the WesternGeco Q-Marine point-receiver marine seismic system, for example, have been designed specifically to minimize variations in the data recorded from one survey to the next.

The system’s calibrated marine source provides estimates of the airgun source signature for every shot, enabling compensation for variations in the signature, including its bubble pulse, both within and between surveys. Figure 1 shows how accurate removal of the bubble pulse reduces background noise and enhances the time-lapse signal.
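The idea behind per-shot signature compensation can be illustrated as a stabilized spectral division: the measured signature (including its bubble pulse) is divided out of the recorded trace in the frequency domain. The sketch below is a simplified, self-contained illustration, not the actual Q-Marine implementation; the wavelet, trace lengths, and stabilization term are all assumptions for the example.

```python
import cmath
import math

def dft(x):
    # naive discrete Fourier transform (O(N^2)); fine for short illustrative traces
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    # inverse DFT, returning the real part (the signals here are real)
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def designature(trace, signature, eps=1e-6):
    # stabilized spectral division: remove the measured source signature
    D, W = dft(trace), dft(signature)
    R = [d * w.conjugate() / (abs(w) ** 2 + eps) for d, w in zip(D, W)]
    return idft(R)

# synthetic example: spiky reflectivity convolved (circularly) with a
# source wavelet carrying a delayed bubble pulse
n = 16
reflectivity = [0.0] * n
reflectivity[2], reflectivity[7] = 1.0, -0.7
wavelet = [0.0] * n
wavelet[0], wavelet[3] = 1.0, -0.5        # primary spike plus bubble 3 samples later
trace = [sum(reflectivity[m] * wavelet[(t - m) % n] for m in range(n))
         for t in range(n)]

recovered = designature(trace, wavelet)
```

With the signature known per shot, the division is deterministic: the bubble energy is removed rather than statistically estimated, which is what makes the compensation both fast and repeatable between surveys.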

The streamer contains calibrated point-receiver hydrophones at 3.125-m (10.2-ft) intervals. Initial processing algorithms compensate for any out-of-specification hydrophones and can accurately emulate the receiver array response of conventional surveys, which typically record data from groups of hydrophones at 12.5-m (41-ft) intervals.
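Emulating a conventional receiver-array response amounts to forming group outputs from adjacent point receivers: four 3.125-m point receivers span one 12.5-m group. A minimal sketch, assuming equal-weight averaging (a real emulation would also apply per-hydrophone calibration and group weighting):

```python
def form_groups(point_traces, group_size=4):
    """Emulate conventional receiver groups by averaging adjacent point receivers.

    point_traces: one trace per point receiver, in cable order; each trace is
    a list of samples. With group_size=4, four 3.125-m point receivers
    emulate one 12.5-m group interval.
    """
    nsamp = len(point_traces[0])
    return [
        [sum(tr[i] for tr in point_traces[g:g + group_size]) / group_size
         for i in range(nsamp)]
        for g in range(0, len(point_traces), group_size)
    ]

# eight point receivers -> two emulated group traces
point_traces = [[float(r)] * 3 for r in range(8)]
groups = form_groups(point_traces)
```

Because the point-receiver data are retained, any out-of-specification hydrophone can be corrected or excluded before group forming, which is not possible once groups are summed in hardware.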

Automated source and streamer steering systems enable the positioning of in-sea equipment to be accurately repeated between surveys, reducing variations in the data that can result from differences in position.

Measuring acquisition variability

Waves and variations in tide height and water column velocity impact time-lapse seismic repeatability. The system provides functionality to measure tides, wave heights, and water velocities during acquisition, enabling deterministic compensation during processing. Application of these deterministic corrections is substantially less time-consuming and error-prone than the statistical compensation processes that must otherwise be employed.
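One such deterministic correction is a water-column static: the change in vertical two-way traveltime through the water layer implied by the measured tide height and water velocity. The sketch below is a vertical-incidence simplification (the function names and the linear-interpolation shift are illustrative, not the production algorithm):

```python
import math

def water_column_static_ms(depth_base, v_base, depth_monitor, v_monitor):
    # change in vertical two-way time (milliseconds) through the water layer,
    # given water depth (m, including tide) and water velocity (m/s) per survey
    return 2000.0 * (depth_monitor / v_monitor - depth_base / v_base)

def shift_trace(trace, shift_samples):
    # apply a bulk time shift by linear interpolation;
    # a positive shift delays the trace
    out = []
    for i in range(len(trace)):
        x = i - shift_samples
        j = math.floor(x)
        f = x - j
        a = trace[j] if 0 <= j < len(trace) else 0.0
        b = trace[j + 1] if 0 <= j + 1 < len(trace) else 0.0
        out.append(a * (1.0 - f) + b * f)
    return out
```

For example, 1.5 m of additional tide over 100 m of water at 1,500 m/s adds exactly 2 ms of two-way time. Because the tide and velocity are measured, this shift is computed directly rather than estimated statistically from the data.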

Maximizing acquisition efficiency

Time-lapse seismic acquisition tends to be less efficient than conventional acquisition. It often is desirable to repeat lines at the same point in the tide cycle as previously acquired, which requires careful planning and execution. Producing fields tend to be crowded, noisy environments; field operations and other activities such as fishing can limit the time window in which a monitor survey can be acquired. The ability to evaluate the effect of noise and noise attenuation processes can improve acquisition efficiency by enabling the acceptance of noise-contaminated data and prioritizing data that need to be reacquired.

Most modern seismic acquisition vessels have substantial onboard computing capability that can apply the time-lapse seismic processing flow and build a 3-D data volume as acquisition proceeds. These volumes provide valuable fast-track time-lapse products that can be delivered for interpretation and analysis immediately after acquisition is complete, and they also contribute to effective quality control (QC). Time-lapse difference volumes and plots of 4-D QC attributes can be created at intervals throughout a survey, allowing the effect of factors such as noise or suboptimal acquisition repeatability to be objectively evaluated using high-quality time-lapse difference analysis (Figure 2) and enabling timely decisions to accept or reshoot data. Appropriate QC plots also allow errors or inconsistencies in acquisition and processing to be identified and rectified at an early stage.
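One 4-D QC attribute in wide industry use is the normalized RMS (NRMS) difference, which expresses the residual energy between corresponding base and monitor traces as a percentage. A minimal sketch of the computation (windowing and mapping over survey bins are omitted; the source does not specify which attributes were used, so NRMS here is a representative example):

```python
import math

def rms(x):
    # root-mean-square amplitude of a trace
    return math.sqrt(sum(v * v for v in x) / len(x))

def nrms_percent(base, monitor):
    # NRMS = 200 * rms(monitor - base) / (rms(base) + rms(monitor));
    # 0% means identical traces, 200% means perfectly anti-correlated traces
    diff = [m - b for m, b in zip(monitor, base)]
    return 200.0 * rms(diff) / (rms(base) + rms(monitor))
```

Mapping such an attribute bin by bin as lines come in is what allows noise or repeatability problems to be judged objectively during the survey rather than months later in the processing center.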

Figure 3 is a QC map created during acquisition of a monitor survey that shows a consistent phase rotation of about 6° between the newly acquired survey and a previous survey. If computed on a single line, this small phase rotation would probably not be noticed. The rotation was related to the use of a slightly different field filter impulse response during onboard processing of the monitor survey. Early identification of the problem enabled rapid correction and avoided delays to the project.

Maximizing processing efficiency

It is common practice for all the vintages of a time-lapse seismic project to be reprocessed simultaneously so that the variability of the individual datasets is consistently compensated by statistical means. Duplicating the processing of previous datasets is a considerable overhead that can be avoided by minimizing acquisition variability and using carefully designed workflows for the new monitor survey. To ensure compatibility with previous datasets without the need for reprocessing, the design of workflows for the new survey must address two key issues:

  • How to consistently apply any remaining statistical corrections; and
  • How to ensure that the current hardware/software configuration generates equivalent results to those previously obtained.

The first issue is addressed by using a “reference dataset” approach to correct for incompatibilities such as time-shifts, amplitude differences, and variations in acquisition coverage (“4-D binning”). Application of this approach to the process used to compensate for time-shifts between sail lines is shown schematically in Figure 4. At some point in the past the previous surveys will have been coprocessed and line-by-line time-shifts computed. These shifts are applied to the previous surveys, which are then combined to create a reference dataset. Time-shifts are then recomputed by comparing each individual survey with the reference dataset. A subsequent monitor survey can be independently compared with the reference dataset to derive sail-line-by-sail-line time-shifts that are consistent with the previous surveys.
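The reference-dataset comparison can be sketched as stacking the aligned prior surveys and then picking, for each new sail line, the cross-correlation lag against that stack. The simplified integer-lag version below is illustrative only; production workflows estimate subsample shifts within analysis windows:

```python
def stack(surveys):
    # samplewise mean of already-aligned surveys -> the reference dataset
    n = len(surveys[0])
    return [sum(s[i] for s in surveys) / len(surveys) for i in range(n)]

def time_shift_to_reference(trace, reference, max_lag):
    # integer lag maximizing the cross-correlation; a positive lag means the
    # reference arrives `lag` samples later than the input trace
    best_lag, best_c = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        c = sum(trace[i] * reference[i + lag]
                for i in range(len(trace)) if 0 <= i + lag < len(reference))
        if c > best_c:
            best_c, best_lag = c, lag
    return best_lag

# two aligned prior surveys build the reference; a new monitor line is then
# compared with the reference alone, never with the individual prior surveys
base1 = [0.0] * 10; base1[7] = 1.0
base2 = [0.0] * 10; base2[7] = 1.0
reference = stack([base1, base2])
monitor = [0.0] * 10; monitor[5] = 1.0
```

Because each new survey is measured against the fixed reference rather than against every prior vintage, the shifts it receives are consistent with the earlier surveys without reprocessing them.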

Regression testing is used to ensure that the current hardware/software configuration gives results equivalent to those of the configuration used previously. During the previous processing, example input and output datasets for each processing step are archived. Shortly before the new acquisition is to start, the archived input datasets are reprocessed using the current hardware/software configuration and the results are compared with those previously archived. If the results differ significantly, the processing algorithms responsible are identified and adjusted to operate in the same way as before.
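The archived-dataset comparison can be automated as a simple numerical regression test. A sketch, assuming sample-by-sample comparison within a relative tolerance (the function name and tolerances are illustrative):

```python
import math

def regression_failures(archived, reprocessed, rel_tol=1e-6, abs_tol=1e-12):
    # indices of samples where the current configuration's output deviates
    # from the archived result beyond tolerance
    return [i for i, (a, b) in enumerate(zip(archived, reprocessed))
            if not math.isclose(a, b, rel_tol=rel_tol, abs_tol=abs_tol)]
```

An empty result means the configurations are equivalent for that processing step; any flagged samples point to the algorithm that must be investigated and matched to its previous behavior before the monitor survey is processed.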

On-time results

Time-lapse seismic data are a valuable tool for optimizing field production. However, their value depends on the on-time delivery of results from new monitor surveys and on their compatibility with previous surveys.

Since 2008, implementation of these workflows has enabled final results to be routinely delivered four to 12 weeks after acquisition completion, depending on project size and complexity.

Acknowledgment

WesternGeco thanks Statoil for permission to show the data examples presented here.