Reprocessing brings new ideas to light in older data.

The reworking of previously analyzed seismic datasets, or reprocessing, has always been a significant part of a seismic contractor's workload.
The type of datasets being reworked runs the gamut from old onshore 2D analog data from central Asia to recently acquired marine 3D surveys from the Gulf of Mexico and the North Sea. This activity has always been an important revenue source for geophysical contractors, particularly when new acquisition activity is depressed. It is difficult to generalize what percentage of total processing revenues comes from reprocessing, as the temporal and geographic variance is substantial. On average, it probably exceeds 50% of the industry's total processing budgets.

Why reprocess?
Why is seismic data reprocessed? Sometimes it must seem that the first pass of processing has barely been completed before the reprocessing starts. Was there something wrong with the processing the first time? Will the new analyses provide the right answer this time? As with many things in this world, the answer is multifaceted and time-variant. Technology, the price of oil, new exploration plays, data ownership changes, merger and acquisition activity, politics and market forces all play a role.
Any processing project is a compromise among time, money, desires and objectives, often at the expense of the optimal result. As demands on data increase, objectives change and economics become more attractive, data is reprocessed to meet new standards.
Technology is the overriding factor. Data processing capabilities advance in leaps and bounds, making what is almost impossible today commonplace tomorrow. During the past 10 years, the metrics used to describe computing facilities have moved from mega and giga to giga and tera, a tremendous increase in computing power. Powerful new graphics hardware and visualization technology have revolutionized our ability to view, analyze and interpret seismic data. The demands have changed, and older realizations of seismic volumes may not measure up. Basic acquisition technology, by comparison, is slow to change (efficiency excepted), so the reprocessing of existing datasets is common.
What are the trends, and how are they affecting our industry? The application of new technology to existing data is the prime motivation for seismic reprocessing projects. In many cases the technology is not a revolutionary idea or concept, but rather the economics have changed. 3D prestack depth migration (PSDM) has been around for a long time, but it is only in recent years that large-scale projects have become feasible.
Computer technology, in terms of hardware and software, is the driving force. Old algorithms become cost-effective, and the promise of even better price performance allows the consideration of better algorithms. While new processing technologies sometimes require new data, most often they do not. New information is extracted from old data.

Depth migration
The subsalt plays in the deepwater Gulf of Mexico have made 3D PSDM vital to the exploration process; most deepwater projects require depth migration before drilling. A comparison of a 3D PSDM result with the initial, time-processed data reveals dramatic benefits. The chaotic reflectors beneath the salt body become significantly more interpretable. The structural picture is simplified and more accurately represents the subsurface. The net effect of this intensive depth migration activity is greater efficiency, which lowers costs and in turn opens larger areas to reprocessing.
In many geologic environments, the inclusion of elastic parameters in data processing flows is allowing geoscientists to address anisotropy. The ability to flatten common midpoint gathers can be significantly improved. These corrections, while improving stack response, are of even greater significance to amplitude variation with offset (AVO) and amplitude variation with azimuth (AVAz) studies, and they add accuracy to lithology and fracture prediction. Where these attributes are used to drive anisotropic depth migration, significant differences in structural positioning may result.
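To make the idea of gather flattening in the presence of anisotropy concrete, here is a minimal sketch (not any contractor's production code) of a nonhyperbolic moveout correction using the Alkhalifah-Tsvankin anellipticity parameter for a VTI medium. The function names, array layout and parameter values are illustrative assumptions.

```python
import numpy as np

def nonhyperbolic_moveout(t0, x, v_nmo, eta):
    """Alkhalifah-Tsvankin nonhyperbolic moveout time t(x) for a VTI medium.

    t0    : zero-offset two-way time, scalar or array (s)
    x     : source-receiver offset (m)
    v_nmo : NMO velocity (m/s)
    eta   : anellipticity parameter (dimensionless)
    """
    x2 = x ** 2
    t2 = (t0 ** 2 + x2 / v_nmo ** 2
          - (2.0 * eta * x2 ** 2)
          / (v_nmo ** 2 * (t0 ** 2 * v_nmo ** 2 + (1.0 + 2.0 * eta) * x2)))
    return np.sqrt(t2)

def flatten_gather(gather, times, offsets, v_nmo, eta):
    """Map each trace of a CMP gather to zero-offset time (moveout correction).

    gather  : 2D array (n_samples, n_offsets) of amplitudes
    times   : 1D array of recorded sample times (s); also used as the t0 axis
    offsets : 1D array of offsets (m)
    """
    flattened = np.zeros_like(gather)
    for j, x in enumerate(offsets):
        # travel time at this offset for every output zero-offset time
        t_x = nonhyperbolic_moveout(times, x, v_nmo, eta)
        # pull the recorded amplitude at t_x back to t0 by linear interpolation
        flattened[:, j] = np.interp(t_x, times, gather[:, j], left=0.0, right=0.0)
    return flattened
```

With eta set to zero the correction reduces to conventional hyperbolic normal moveout, which is why the far-offset flattening improvement is attributed to the anisotropic term.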
Steep dip migration algorithms have significantly augmented time imaging capabilities. Ray-traced 3D prestack time migration (PSTM) has proven successful in addressing imaging limitations unresolved on older datasets.
The removal of unwanted noise is a common reprocessing goal, be it ground roll, multiples, marine interference, wind noise or cultural noise. Whatever the noise type, some dataset exhibits the problem. Incremental advances in noise removal techniques can provide significant improvements in interpretability.
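As an illustration of one routine noise-attenuation step, the sketch below applies a zero-phase Butterworth bandpass to each trace to suppress low-frequency ground roll. The corner frequencies, sample interval and function name are assumptions for the example, not a prescription for any particular dataset.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def bandpass_traces(traces, dt, f_low=10.0, f_high=60.0, order=4):
    """Apply a zero-phase Butterworth bandpass to each trace.

    traces        : 2D array (n_samples, n_traces)
    dt            : sample interval in seconds
    f_low, f_high : corner frequencies in Hz (illustrative values)
    """
    nyquist = 0.5 / dt
    sos = butter(order, [f_low / nyquist, f_high / nyquist],
                 btype="band", output="sos")
    # sosfiltfilt runs the filter forward and backward, so no phase distortion
    return sosfiltfilt(sos, traces, axis=0)
```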
Poor statics solutions, the bane of many existing datasets, may be significantly improved by new approaches. The application of high-fidelity ray tracing and tomographic inversion yields a much more detailed model of the near surface, which is reflected in the quality of the stack response. In such cases statics and topography can be addressed in a dynamic fashion by incorporating the near-surface model into depth migration.
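For readers unfamiliar with the arithmetic involved, the sketch below shows the simplest form of a statics correction, a constant-velocity elevation shift applied trace by trace. It is far simpler than the tomographic near-surface solutions described above and is offered only to illustrate the concept; the datum, replacement velocity and function names are assumed.

```python
import numpy as np

def elevation_static_shift(elevation_m, datum_m=0.0, v_replacement=2000.0):
    """Static time shift (s) needed to move a trace recorded at a given
    surface elevation down to a flat datum, assuming a replacement velocity."""
    return (elevation_m - datum_m) / v_replacement

def apply_statics(traces, shifts_s, dt):
    """Shift each trace by the nearest whole number of samples.

    traces   : 2D array (n_samples, n_traces)
    shifts_s : 1D array of per-trace static shifts in seconds
    dt       : sample interval in seconds
    """
    shifted = np.zeros_like(traces)
    for j, shift in enumerate(shifts_s):
        n = int(round(shift / dt))
        shifted[:, j] = np.roll(traces[:, j], -n)
        # zero the samples that np.roll wrapped around the ends
        if n > 0:
            shifted[-n:, j] = 0.0
        elif n < 0:
            shifted[:-n, j] = 0.0
    return shifted
```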
Regional context
Many reprocessing projects involve the integration of multiple vintages and surveys of seismic data. The objective may simply be a set of data that will tie at line intersections or volume overlaps. Placing a discovery in regional context will often require such a project. A final set of consistent data may allow for further play development and more comprehensive understanding of the geologic history. Of course the intrinsic quality of the reprocessed result should be higher, particularly when new technology is deployed alongside detailed input from the geoscientists working the data.
Reservoir characterization
Reservoir characterization projects are of growing importance. Their reprocessing objectives are well-defined and focused on a producing field. Well data is available for calibration of the seismic dataset, something that rarely exists on a first-pass processing project.
Producing a well-resolved and accurate image is always an objective; however, a significant investment in controlling amplitude and phase is essential for rock properties estimation. Porosity distribution, fluid factor, fracture patterns, pore pressure, lithology and fluid prediction are all examples of what may be achieved with the right data and processing. Exciting new developments in prestack attribute analysis promise to open new vistas, bringing interpretation and processing closer together. More of this technology is moving into the arena of the geophysical contractor, facilitating the synergies required of this type of reprocessing project.
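As one example of the kind of prestack attribute calculation referred to above, the sketch below fits the two-term Shuey approximation R(θ) ≈ A + B·sin²θ to angle-gather amplitudes by least squares, producing AVO intercept and gradient traces. The input layout and names are illustrative assumptions, not a description of any specific contractor workflow.

```python
import numpy as np

def avo_intercept_gradient(amplitudes, angles_deg):
    """Least-squares fit of the two-term Shuey approximation
    R(theta) ~ A + B * sin^2(theta) at every time sample.

    amplitudes : 2D array (n_samples, n_angles) of angle-gather amplitudes
    angles_deg : 1D array of incidence angles in degrees
    Returns the intercept A and gradient B as 1D arrays (n_samples,).
    """
    s2 = np.sin(np.radians(angles_deg)) ** 2
    # design matrix with columns [1, sin^2(theta)]
    G = np.column_stack([np.ones_like(s2), s2])
    # solve G @ [A, B] = amplitudes for all samples at once
    coeffs, *_ = np.linalg.lstsq(G, amplitudes.T, rcond=None)
    intercept, gradient = coeffs
    return intercept, gradient
```

Crossplotting the resulting intercept and gradient is one common way such attributes feed lithology and fluid prediction, which is why amplitude and phase control during reprocessing matters so much.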
Remediation of infield processing
Infield processing, on both marine vessels and land crews, is often designed to produce reconnaissance-type products. While the results may meet contractual requirements and initial objectives, they often are limited by constraints on computing and personnel resources. Many of these products are reprocessed with more aggressive geophysical goals within a short time of initial delivery.
Nonexclusive reprocessing
A great percentage of newly acquired seismic data resides in the data libraries of geophysical contractors. As a result, a range of advanced products, once the exclusive domain of in-house technology groups, is becoming available on a nonexclusive basis at competitive rates. In the deepwater Gulf of Mexico, 3D PSDM products are being generated on large volumes, 100-plus Outer Continental Shelf blocks at a time, on a nonproprietary basis. Typically these depth migrations supplement time-processed datasets delivered as the basic product.
Deepwater hazard surveys
There is a growing trend toward reprocessing existing datasets to generate hazard assessment volumes. Using high-quality deepwater 3D marine surveys, this practice eliminates the time and reduces the expense associated with acquiring a traditional hazard survey. These products, combined with state-of-the-art visualization facilities, can dramatically reduce the effort required to complete hazard studies.
Time-lapse seismic, 4D
As time-lapse surveys accumulate over a field, evolving data processing technology requires the reprocessing of all acquisition snapshots, ensuring that all difference or monitor datasets are processed to the same paradigm. It is important that each time-lapse snapshot be as representative of conditions in the reservoir as possible. Hence the most effective processing sequence available (subject to sensitivity analysis) should be applied to all datasets. This could require frequent reprocessing of many vintages of data to include the latest technology as each new survey is shot.
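A common way to quantify whether baseline and monitor datasets have been processed consistently is the normalized RMS (NRMS) difference between co-located traces; the sketch below shows the standard calculation. It is offered as an illustration of how repeatability is measured, not as part of any particular processing sequence.

```python
import numpy as np

def nrms(baseline, monitor):
    """Normalized RMS difference, in percent, between two co-located traces
    or windows. Lower values indicate more repeatable time-lapse data."""
    b = np.asarray(baseline, dtype=float)
    m = np.asarray(monitor, dtype=float)
    rms = lambda x: np.sqrt(np.mean(x ** 2))
    return 200.0 * rms(b - m) / (rms(b) + rms(m))
```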
Try it and see
This almost obsolete practice involves reprocessing data with no real objective other than making it look better.
Advances in data processing technology combined with new visualization and interpretation techniques are facilitating the extraction of more useful information from seismic data. These enhancements are essential in solving the more demanding questions being asked of seismic data today by a changing industry.
Once-exotic tools are now common and indeed necessary in the demanding environments in which geoscientists operate. While new technology is always exciting, it has its limitations. For a successful reprocessing project, it is important that the objectives be well-defined and understood by all parties.
As the easier and more obvious exploration opportunities are exploited, a higher standard of geophysical integrity is required of seismic data to find the remaining prospects. The reprocessing of existing datasets using new technologies, combined with focused input and information from oil finders, can revitalize a dataset, bringing hitherto unrecognized exploration potential to light.