Of all the tasks that go into finding, drilling for, and producing oil and gas, seismic interpretation is probably the most qualitative. It relies as much on instinct and experience as it does on technology and number crunching.

But multiple factors are converging to make interpretation more representative of the actual subsurface information hidden in the data. Advances in seismic acquisition and processing, better data integration, and improved compute power all are playing a role in giving the interpreter the best possible tools for the task.

Acquisition

A seismic program is only as good as the raw data provided by the acquisition team. “Junk in, junk out” is a common term for data that are poorly acquired – the best processing algorithms in the world won’t make sense of data that are not representative of the subsurface.

Recent advances in seismic acquisition technology have revolutionized the world of interpretation. Among these, broadband acquisition, where a broader spectrum of frequencies is recorded, has had a huge impact on the ability to better interpret the resulting data.

Iain Brown, vice president of reservoir services for Petroleum Geo-Services (PGS), said that broadband acquisition such as that enabled by PGS’s GeoStreamer and GeoSource technology provides “ghost-free” data. In marine surveys the sea surface acts as an almost perfect mirror, producing “ghost” reflections that interfere with the signals from the subsurface; with this technology, a dual-sensor recording system removes them. Removing these unwanted reflections has made it possible to tow streamers at greater water depths, which in turn allows much lower frequencies to be recorded.
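The dual-sensor idea can be illustrated with a toy example. This is a hedged one-dimensional sketch, not PGS’s actual GeoStreamer processing: the sea-surface ghost arrives as a delayed copy of the up-going wavefield with opposite polarity on the pressure (hydrophone) record but, under a common sign convention, the same polarity on the vertical-velocity (geophone) record, so summing the two records cancels it.

```python
import numpy as np

# Toy 1-D dual-sensor deghosting (illustrative sketch only).
# The sign convention is assumed; real deghosting also involves
# sensor calibration, impedance scaling, and noise handling.

n = 64
upgoing = np.zeros(n)
upgoing[10] = 1.0                 # up-going reflection arriving at sample 10
ghost = np.roll(upgoing, 8)       # ghost: delayed echo off the sea surface

hydrophone = upgoing - ghost      # pressure: ghost polarity reversed
geophone = upgoing + ghost        # vertical velocity: ghost polarity preserved

deghosted = 0.5 * (hydrophone + geophone)

print(np.allclose(deghosted, upgoing))  # → True (ghost cancelled)
```

Summing cancels the ghost because it appears with opposite signs on the two sensors, while the up-going signal appears with the same sign on both.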

The fundamental benefit of recording more usable high and low frequencies is that the seismic signal reflected from the subsurface is much sharper and has significantly reduced side lobes, he added. It is the richer low-frequency content of broadband streamer data, in particular, that benefits quantitative interpretation and reservoir property estimation, making these processes more precise and reliable.
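Why extra octaves of bandwidth shrink side lobes can be shown numerically. The sketch below (the frequency bands are illustrative assumptions, not actual survey specifications) builds zero-phase wavelets with idealized flat spectra and compares the largest side lobe relative to the main peak; the band with richer low frequencies yields the cleaner wavelet.

```python
import numpy as np

def boxcar_wavelet(f_lo, f_hi, n=512, dt=0.004):
    # Zero-phase wavelet from an idealized flat (boxcar) amplitude spectrum
    freqs = np.fft.rfftfreq(n, dt)
    spectrum = ((freqs >= f_lo) & (freqs <= f_hi)).astype(float)
    return np.fft.fftshift(np.fft.irfft(spectrum, n))

def relative_sidelobe(w):
    # Largest amplitude outside the main lobe, relative to the peak
    p = int(np.argmax(np.abs(w)))
    i = p
    while w[i] * w[i + 1] > 0:        # walk right to the first zero crossing
        i += 1
    return np.max(np.abs(w[i + 1:])) / abs(w[p])

narrow = boxcar_wavelet(8.0, 50.0)    # conventional band (assumed figures)
broad = boxcar_wavelet(2.5, 80.0)     # broadband (assumed figures)

print(relative_sidelobe(broad) < relative_sidelobe(narrow))  # → True
```

The side-lobe level depends on the band’s width in octaves, which is why extending the low end of the spectrum, as deeper-towed broadband streamers do, matters more than pushing the high end.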

Processing

Processing algorithms start to sound like alphabet soup after a while, but two recent advances – reverse time migration (RTM) and full waveform inversion (FWI) – are revolutionizing the way seismic interpreters view the subsurface.

Brown said that RTM is performed routinely in the PGS processing department, particularly when imaging complex geological structures. FWI is a more recent arrival on the scene, but because it uses the two-way wave equation to invert for high-resolution velocity models in depth, it can produce very accurate models.
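The core inversion idea can be caricatured in a few lines. This is a drastically simplified toy, not PGS’s FWI: real FWI solves the two-way wave equation in 3-D and updates the velocity model with gradient (adjoint-state) methods, whereas here a single reflection is modeled as a delayed wavelet and the layer velocity is recovered by minimizing the waveform misfit.

```python
import numpy as np

# Heavily simplified 1-D "waveform inversion" toy (illustration only).
dt, n = 0.002, 600
t = np.arange(n) * dt
depth = 1000.0                       # reflector depth in meters (assumed)

def ricker(t, t0, f=10.0):
    # Zero-phase Ricker wavelet centered at time t0
    a = (np.pi * f * (t - t0)) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def model_trace(v):
    # Reflection arrives at the two-way travel time for velocity v
    return ricker(t, 2.0 * depth / v)

v_true = 2500.0                      # "unknown" velocity in m/s
d_obs = model_trace(v_true)          # synthetic observed trace

# Grid search over candidate velocities (a stand-in for gradient descent)
v_trials = np.arange(2000.0, 3001.0, 25.0)
misfits = [np.sum((model_trace(v) - d_obs) ** 2) for v in v_trials]
v_est = v_trials[int(np.argmin(misfits))]

print(v_est)  # → 2500.0
```

With only high frequencies, a misfit surface like this develops many local minima (cycle skipping); the low frequencies in broadband data smooth it, which is why they are described as critical to FWI.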

The benefit of broadband acquisition designs is that they provide the low frequencies that are critical to the FWI algorithm, Brown said. He added that a recent survey on the Johan Sverdrup field in Norway yielded a very precise velocity model for the shallow section of the field, resulting in a much improved image of the reservoir section. “FWI combined with separate wavefield imaging using the multiples as an extra source enabled the imaging of this shallow section very accurately, which then translates into improved imaging and better depth prediction for the lower reservoir levels.”

The rapid uptake of RTM bodes well for FWI, and the primary constraint currently is the run time. “Computing is advancing so quickly with continuous improvements in high-performance computing and storage, resulting in faster turnaround of these processes and making them more accessible year on year,” Brown said.

Data integration

The best seismic interpretation is one that is consistent with as many different and independent measurements as possible. This integration of different pieces of information has been one of the holy grails for seismic interpreters for years, and it’s proving to be a tough nut to crack. But progress is being made.

Brown recalled working for an operator in the 1990s where very little teamwork took place. Geologists, geophysicists, and seismic interpreters worked in their individual silos and didn’t integrate their efforts until near the end of the project.

“That practice is very rare these days,” he said. “But even today there is clearly a thirst for an affordable software database platform that allows subsurface data types to be seamlessly integrated.”

He noted that some solutions come closer than others, but none provides a truly seamless work environment that includes all of the data types and disciplines an interpreter might want to work with. Currently, most oil companies use a toolkit comprising different software products – one for seismic interpretation, another for petrophysics, another for basin modeling, another for potential fields, and so on. Larger companies might rely on a particular platform that meets most of their interpretation needs, but, said Brown, “There’s nothing out there that covers the whole range. I think the challenges are both software-related and workflow-related.”

He noted that the cost of moving legacy data into such a platform could be prohibitive. “Computer operating systems and hardware and languages all change over time.”

The size of the company also makes a difference. While larger companies have integrated asset teams, smaller companies might have one or two individuals forced to wear a variety of hats to meet their companies’ interpretation needs. When a specialty such as geochemistry or potential fields is required, they subcontract that work. “That expertise then doesn’t reside within the unit,” Brown said.

Seeing the big picture

One advantage that a seismic data company has over operators, Brown said, is the fact that they own large data libraries that their interpreters become quite familiar with. Instead of interpreting, for example, a targeted survey of 500 sq km (193 sq miles) of 3-D data, PGS acquires huge surveys and even knits multiple surveys into what are termed MegaSurveys that cover tens of thousands of square kilometers.

“You can get a true basin-wide regional perspective of these hydrocarbon basins,” he said. “When interpreting a smaller survey, you’re going to see things in those data such as part of a fan sand complex, for example. However, you will never get the whole picture, and that makes interpretation difficult.

“If you have access to larger datasets, you can see not only the end of the lobes of the fan system but the whole system. That brings a totally different insight into the morphology and sedimentology and how the facies developed over time. It makes play fairway analysis and hydrocarbon migration much easier to understand,” he added.