What on earth is going on with exploration technology development? It’s like R&D on steroids. Technologies seem to go from great idea to new launch to old hat in months, not years. It’s future shock all over again.

From geology to geophysics to petrophysics and well testing, the past few years have seen tremendous technological strides. Here are a few of the most notable.

Geophysics

At the recent Society of Exploration Geophysicists (SEG) annual meeting, Saudi Aramco held a special event outlining its strategic research initiatives in geophysics. It was not meant to be a roadmap for future R&D, said Panos Kelamis, chief technologist-geophysics technology for Aramco’s EXPEC Advanced Research Center. “This is not a vision,” Kelamis said. “The train has left the station.”

The event outlined several areas of current investigation. The first is automation, which seeks to improve efficiency and data quality while reducing acquisition, processing and interpretation cost. Saudi Aramco is involved in a major industry collaboration to build a commercial automated shallow marine seismic system using AUVs that can be positioned and retrieved anywhere on the seafloor. The hope is that increased automation will lead to increased use of seismic data in reservoir monitoring. This, in turn, will require processing and interpretation automation to handle the massive quantities of data.

The second area is to bring geophysics closer to the reservoir. This increases seismic data fidelity and provides better vertical resolution, something which will be of great use to engineers in their field development schemes. Already Aramco is drilling holes to bury geophones beneath the near surface, which in the Middle East is notorious for wreaking havoc with surface seismic signals.

Advances in seismic acquisition have already astonished the industry within the past few years. On land the push has been to increase sampling capability, and a variety of techniques have become routine, including slip-sweep acquisition and simultaneous source acquisition, in which multiple sources are fired at the same time and the cross-noise is later filtered out in processing. Wireless acquisition systems also are becoming more routine as some of the early bugs have been worked out. In fact, earlier this year Wireless Seismic set a record for real-time wireless recording of seismic data, using 11,000 active channels with real-time data transmission from a live patch of 6,400 channels on a survey in Kurdistan.
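The separation idea behind simultaneous source acquisition can be sketched in a few lines (a deliberately simplified illustration with invented numbers, not any vendor's algorithm): when a second source fires with random time delays, its energy is incoherent in a gather aligned to the first source's firing times, so a simple median across traces passes the coherent event and rejects the cross-noise.

```python
import numpy as np

rng = np.random.default_rng(0)
n_traces, n_samples = 50, 200

# Coherent arrival from "our" source: same sample in every aligned trace.
gather = np.zeros((n_traces, n_samples))
gather[:, 100] = 1.0

# Cross-noise from the second source: its random firing dither lands it
# at a different sample on each trace, so it is incoherent in this gather.
for i in range(n_traces):
    gather[i, rng.integers(0, n_samples)] += 1.0

# A median across traces keeps the coherent event and rejects the spikes.
stacked = np.median(gather, axis=0)
```

In this toy gather the median preserves the event at its fixed arrival time while the randomly timed interference drops out; production deblending uses far more sophisticated sparse-inversion and filtering schemes, but the exploit is the same: coherence in one domain, randomness in the other.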

Geophones also are getting a new look. Shell and HP partnered several years ago to create sensors based on microelectromechanical systems. The criteria for performance were a low noise floor and higher fidelity in the form of low-frequency data as well as ease of deployment and wireless functionality.

Another type of sensor was announced at the 2014 SEG meeting. Silicon Audio launched a new seismic sensor that is the same size and shape as a conventional geophone but uses a laser to read the motion of a vibrating proof mass, achieving a high signal-to-noise ratio and recording lower frequencies than traditional geophones.

Strides are being made in the borehole as well. Sercel recently launched a vertical seismic profiling (VSP) system specifically designed for HP/HT environments. And fiber optics continue to play an important role. Paulssen Inc. announced OpticSeis, a permanent or redeployable borehole seismic system for microseismic as well as 2-D, 3-D and 4-D VSPs. These are HP/HT tools that can record up to 1,000 three-component levels and can be used in both vertical and horizontal wells. And a new company called GeoOptics, a division of MagiQ Technologies, is developing a fiber-optic seismic sensor to aid in microseismic monitoring.

Though microseismic isn't an exploration tool per se, it has its roots in exploration technology. It has exploded onto the scene over the past few years because it addresses fracture monitoring in shale plays. Several companies offer a combination of surface and buried arrays, and Paulssen is working with the Research Partnership to Secure Energy for America to develop a downhole tool.

In the marine environment, the biggest acquisition news centers on two technologies: broadband seismic and nodal seismic. Broadband is typically a towed-streamer application and can involve hardware, processing or both; it delivers a broader-frequency image than conventional methods. Nodal systems are deployed on the seafloor and provide much higher quality data thanks to the quieter environment and the full-azimuth nature of the recording. Neither system is totally new, but acceptance has grown as the equipment has improved and deployment costs have come down.

The ability to do simultaneous source shooting will help reduce nodal costs even further, said Paul Brettwood, vice president of technology and strategic marketing for ION. “One of the constraints with seabed is that you compensate for the sparse receiver sampling by doing a huge number of shots,” he said. “But if you can have three or four vessels out there firing at once, your time to shoot that dataset will be reduced significantly.”

Acquisition schemes offshore have improved tremendously, particularly with the advent of wide-azimuth (WAZ) seismic, which often uses multiple source boats shooting from a variety of directions. More recently Schlumberger advanced the concept of coil shooting, where the vessels shoot in a circular rather than a linear pattern. That company also introduced IsoMetrix, which is basically a combination of broadband and WAZ since it provides finely sampled data in all directions.

The processing arena is undergoing some of the most notable advances, made possible in part by continually improving compute power. Reverse time migration, a new technique just a few years ago, is now routine, and more recent efforts have raised the maximum frequency at which the data can be migrated.

Another advancement has been in bandwidth preservation. "In the past you would do surface-related multiple elimination and various other things, and you would lose bandwidth in every stage of the process," said Jacques Leveille, senior vice president of technology and communications for ION. The goal, Leveille said, is to redesign algorithms so that bandwidth is preserved at every stage. An example is accurate velocity determination using nonparametric tomography. "It's not using the predetermined shape of a curve," he said. "You let the data tell you what shape this thing has." Processing the data this way lets interpreters use the broader-band data in rock property determination.
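Leveille's point about letting the data dictate the shape can be sketched with a toy moveout-fitting example (synthetic numbers throughout; a simple moving-average smoother stands in for real nonparametric tomography): a curve forced into a fixed parametric form cannot follow non-hyperbolic moveout, while a nonparametric estimate simply tracks the picks.

```python
import numpy as np

rng = np.random.default_rng(0)

x = np.linspace(0.0, 3000.0, 61)             # offsets, m
t0, v = 1.0, 2000.0                          # zero-offset time (s), velocity (m/s)

# "True" moveout: a hyperbola plus a non-hyperbolic wiggle that no
# predetermined curve shape will capture.
t_true = np.sqrt(t0**2 + (x / v) ** 2) + 0.05 * np.sin(x / 500.0)
picks = t_true + rng.normal(0.0, 0.005, x.size)    # noisy traveltime picks

# Parametric fit: force the picks onto a hyperbola (t^2 linear in x^2).
A = np.column_stack([np.ones_like(x), x**2])
coef, *_ = np.linalg.lstsq(A, picks**2, rcond=None)
t_hyp = np.sqrt(A @ coef)

# Nonparametric fit: just smooth the picks and let the data keep their shape.
t_smooth = np.convolve(picks, np.ones(5) / 5, mode="same")

interior = slice(2, -2)                      # avoid smoothing edge effects
rms_hyp = np.sqrt(np.mean((t_hyp[interior] - t_true[interior]) ** 2))
rms_smooth = np.sqrt(np.mean((t_smooth[interior] - t_true[interior]) ** 2))
```

On this synthetic the smoothed picks sit much closer to the true curve than the best-fitting hyperbola, which is the essence of the nonparametric argument.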

To that end ION launched PrecisION, an inversion algorithm that performs analytical processes in the Eigen domain, a domain in which usable data is more readily separated from noise.

There is another recent breakthrough in processing: full waveform inversion. A year ago it was an interesting theory that people were trying to work out; now it is offered by all of the major service providers. "We're actually deriving a model that's constrained by the data," Leveille said. "You're building the model as you process the data."
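The principle is that the model is whatever minimizes the mismatch between simulated and observed waveforms. A deliberately tiny sketch of that idea (single reflector, single unknown velocity, and a brute-force scan in place of the adjoint-state gradients real full waveform inversion uses):

```python
import numpy as np

def ricker(t, f=10.0):
    """Ricker wavelet of peak frequency f (Hz)."""
    a = (np.pi * f * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

t = np.linspace(0.0, 1.0, 1001)              # recording time, s
depth = 500.0                                # reflector depth, m

def forward(v):
    """Simulate the trace: a wavelet delayed by the two-way traveltime."""
    return ricker(t - 2.0 * depth / v)

d_obs = forward(2000.0)                      # "observed" trace; true v = 2000 m/s

# Inversion in miniature: pick the model whose simulated waveform best
# matches the observed one.
v_grid = np.linspace(1500.0, 2500.0, 1001)
misfits = [np.sum((forward(v) - d_obs) ** 2) for v in v_grid]
v_best = v_grid[int(np.argmin(misfits))]
```

The scan recovers the true velocity because only the correct model reproduces the observed waveform; industrial FWI applies the same data-misfit principle to millions of model parameters at once.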

Another fairly recent arrival on the scene is uncertainty modeling. Both Roxar and ION have tools that include error bars to indicate the level of uncertainty present in the model.
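The error bars in such tools typically come from propagating input uncertainties through the model. A minimal Monte Carlo sketch, with invented numbers, for the classic case of velocity uncertainty mapping into depth uncertainty:

```python
import numpy as np

rng = np.random.default_rng(1)

twt = 2.0                                  # two-way time to a horizon, s
v_mean, v_std = 2500.0, 100.0              # average velocity and its 1-sigma, m/s

# Monte Carlo: draw velocity realizations and convert each to depth.
v_samples = rng.normal(v_mean, v_std, 10_000)
depths = v_samples * twt / 2.0

depth_mean = depths.mean()
depth_err = depths.std()                   # one-sigma "error bar" on depth
```

Here a 4% velocity uncertainty maps directly into a 4% depth uncertainty; with nonlinear models the same sampling approach still works, which is why Monte Carlo propagation is a common way to put error bars on a structural model.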

In the interpretation arena, quantitative interpretation is becoming a more common tool. It enables interpreters to quantify reservoir conditions such as lithology, geomechanical behavior, and rock and fluid properties, knowledge that is especially valuable in shale plays.

And on top of all of this, seismic contractors are trying to figure out ways to reduce cycle time. “There’s always a drive to reduce cycle time to get the end product to the oil company more quickly,” said Brettwood. “There’s an even greater drive within the unconventionals because they are punching holes every few days, and they need the information in time to impact the drilling decisions. If we’re going to have any impact at all from geophysics, it has to be delivered quickly.”

Multiphysics

The third area of Saudi Aramco’s investigation is “multigeophysics,” more commonly referred to as multiphysics. This is not a totally new concept since seismic data have been integrated with well logs, gravity and magnetics, and electromagnetics for many years. But the next step is to develop technologies for joint inversion of these data types.
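Joint inversion means solving for one earth model that honors several data types at once, rather than inverting each dataset separately and reconciling the answers afterward. A minimal linear sketch (the model values and sensitivity matrices here are invented purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

m_true = np.array([0.15, 0.25, 0.10])        # e.g. porosity of three layers

# Two "physics" observing the same earth model through different
# linear sensitivity operators (made-up numbers).
G_seis = np.array([[1.0, 0.5, 0.2],
                   [0.3, 1.0, 0.4],
                   [0.2, 0.3, 1.0],
                   [0.8, 0.1, 0.5]])
G_grav = np.array([[0.5, 0.5, 0.5]])

d_seis = G_seis @ m_true + rng.normal(0.0, 0.002, 4)
d_grav = G_grav @ m_true + rng.normal(0.0, 0.002, 1)

# Joint inversion: stack both datasets and solve for ONE model that
# honors them simultaneously.
G = np.vstack([G_seis, G_grav])
d = np.concatenate([d_seis, d_grav])
m_joint, *_ = np.linalg.lstsq(G, d, rcond=None)
```

Real joint inversion of seismic, gravity and electromagnetic data is nonlinear and requires coupling terms (petrophysical or structural) between the models, but the stacked-system structure above is the core of the idea.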

This is obviously an area of key interest. CGG, for instance, recently formed a new multiphysics business line to combine its airborne business with its GravMag Solutions group. Repsol launched its Sherlock Project in 2009 to characterize the elements of petroleum systems and fluid behavior to improve recovery and production rates. It is made up of a variety of tools that integrate knowledge from geology, geochemistry and high-resolution analytical chemistry.

“We’re experimenting with multiphysics,” Leveille said. “It goes back to the compute power and the algorithms and efficiency and also the cleverness of the algorithms since we will be using more measurements.”

One new area of development that fits nicely with the multiphysics concept is digital rock physics (DRP). Pioneered by companies such as Ingrain and FEI as well as Stanford University, DRP combines 3-D pore-scale imaging and computation to derive rock properties from thin sections of core or from cuttings. This provides a quantitative understanding of the reservoir at the pore scale.
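At its simplest, the pore-scale computation starts from a segmented voxel image of the rock. A toy sketch (random voxels standing in for a real micro-CT scan):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy digital rock: a 3-D voxel image where 1 = mineral grain, 0 = pore.
# Real DRP segments this image from a micro-CT scan of a core plug or cutting.
rock = (rng.random((64, 64, 64)) > 0.2).astype(np.uint8)

# The most basic pore-scale property: porosity is the pore-voxel fraction.
porosity = 1.0 - rock.mean()
```

From the same segmented image, DRP workflows go on to simulate flow and elastic response through the pore network, yielding permeability and rock-physics properties without laboratory core tests.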

Petrophysics and sampling

One way to characterize a reservoir is to bring rocks and cuttings up from the wellbore. Another way is to stick a logging tool down the hole.

Well logging has been around since 1927, but the tools continue to see huge improvements. Some of the holy grails with logging tools are depth of investigation and the ability to withstand high temperatures and pressures. There also are constant efforts to improve the value of the information gleaned from the tools.

Schlumberger, for instance, introduced the dielectric scanner, a tool that directly measures water volume and rock textural information. This allows operators to determine the volume of hydrocarbons in carbonates, low-resistivity or low-contrast shaly and laminated sands, and heavy oil reservoirs as well as water salinity and clay volume. Its Litho Scanner high-definition spectroscopy tool provides accurate mineralogy and total organic carbon from quantitative elemental spectroscopy. The tool allows for the detailed description of complex reservoirs by measuring multiple elements.

Baker Hughes’ high-definition induction log service provides formation resistivities at six depths of investigation ranging from 10 in. to 120 in. This is useful in thinly bedded reservoirs and in the presence of deep drilling fluid invasion. And its Nautilus Ultra service provides a comprehensive logging suite that can withstand high pressures and temperatures.

Formation testers also have seen major enhancements. Halliburton recently introduced its CoreVault system, which keeps rock samples in a sealed container, preserving 100% of the fluid in the core for analysis. This enables more accurate estimates of the fluids in place when making decisions about the reservoir.

Schlumberger’s Quartet downhole reservoir testing system allows operators to isolate, control, measure and sample their reservoirs in a single trip. The system combines several tools, including the SCAR inline independent reservoir fluid sampling tool, which collects samples directly from the flow stream. Wireless telemetry allows the operator to interact with the downhole tools.

WellDog and Shell recently announced a collaboration to commercialize WellDog’s downhole testing technology, which uses laser-based Raman spectroscopy to measure chemical composition at specific depths. Industry response to date indicates that the tool holds promise to help operators optimize completions in shale plays.

Overall, the pace of technological change seems to reflect the pace of the industry in general. And the lines are beginning to blur. “Going from the grand scale to the pore scale is becoming more routine,” Leveille said. “It’s truly geoscience as opposed to the various discrete disciplines, and it’s already getting mixed in with engineering.”

As for the next few years? “It’s going to be a wild ride,” he said.