Necessity is the mother of invention, the saying goes, and that adage definitely applies to the upstream oil and gas industry. There was little necessity for exploration technology in the 1800s when some of the first oil wells were drilled. The oil obligingly came to the surface. But by the early 20th century inventive folks began to realize that there were vast reservoirs of oil and gas that weren’t nearly so obliging, stubbornly hiding underground.

As a result, new methods of remotely imaging the subsurface were conceived. The first known seismic surveys were conducted by John (Clarence) Karcher and his co-experimenters in Belle Isle, Okla., in 1921, according to Encyclopedia of Earth, and the first electric wireline log was run in 1927 by Conrad and Marcel Schlumberger, according to Schlumberger’s website. Fast-forward 90-some years, and the landscape of exploration technology has matured at a phenomenal pace. And it shows no sign of letting up. Whether in data acquisition, processing or interpretation, strides both within and outside of the industry promise to keep meeting the necessity of finding oil and gas.

Acquisition

Some of the most impressive strides in recent years have come during the data acquisition phase. And many of these are being driven by other industries.

A newcomer on the scene is drone technology. Drones are being examined by the industry today primarily as low-cost and safe inspection alternatives to airplanes and helicopters. But as the Federal Aviation Administration in the U.S. hustles to develop rules governing their use in oil and gas, Scottish researchers already are examining their utility in analyzing remote, inaccessible outcrops of North Sea reservoirs, according to the Houston Chronicle.

Drones are not likely to be acquiring “aerial seismic” any time soon, but other airborne measurements such as gravity and magnetics could in theory be acquired this way. Drones also can be useful in characterizing a site prior to beginning operations.

“Before we go out to drill a well, look for a site and shoot seismic in an area, we have to do a fair amount of site inspection,” said Ken Tubman, vice president of Geosciences and Reservoir Engineering for ConocoPhillips. “We can get topography from satellites, but we can also potentially get a much better look with drones.”

Beyond the possibility of providing aerial views and perhaps even subsurface images, the ability of drones to operate in remote environments has HSE implications. Ross Saunders, chief geophysicist for Energy XXI, said that some companies are considering using drones to place and retrieve seismic receivers. “I believe that within 20 years it could easily become something that is an everyday occurrence,” he said.

Already researchers at the Delphi Consortium at Delft University of Technology have introduced the dispersed source array (DSA) concept for simultaneous shooting on land. Rather than relying on broadband seismic vibrators, which are heavy and expensive, the consortium uses a set of simpler narrowband sources fired simultaneously, together producing a larger temporal and spatial bandwidth. The simplicity of these sources could lead to autonomous acquisition, the researchers argue.
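As a rough illustration of the spectral argument behind DSAs, the toy Python sketch below sums three narrowband sweeps, each standing in for a simple dedicated source, and shows that together they span a band that no single one of them covers. The band limits, sweep length and 10% amplitude threshold are arbitrary choices for the demonstration, not Delphi parameters, and real blended acquisition would also require separating the simultaneous shots in processing.

```python
# Toy demonstration: several narrowband sources jointly covering a broad band.
# All parameters are illustrative assumptions, not DSA field settings.
import numpy as np

fs = 500.0                          # sampling rate, Hz
t = np.arange(0, 4.0, 1.0 / fs)    # 4-second record

# Three narrowband "sources": linear sweeps over adjacent frequency bands
bands = [(2, 8), (8, 32), (32, 90)]
signals = []
for f0, f1 in bands:
    # linear chirp from f0 to f1 over the record length
    phase = 2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * t[-1]))
    signals.append(np.sin(phase))

combined = np.sum(signals, axis=0)   # the "array" firing together

# Compare spectral coverage of one source vs. the combined array
freqs = np.fft.rfftfreq(len(t), 1.0 / fs)

def cover(signal):
    """Frequency range where the amplitude spectrum exceeds 10% of its peak."""
    spec = np.abs(np.fft.rfft(signal))
    keep = freqs[spec > 0.1 * spec.max()]
    return keep.min(), keep.max()

print("one source covers %.1f-%.1f Hz" % cover(signals[0]))
print("the array covers  %.1f-%.1f Hz" % cover(combined))
```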

According to Dirk Smit, chief scientist, geophysics for Shell, microelectromechanical systems (MEMS) technology holds considerable promise for miniaturizing geophones without loss in fidelity. Shell has been working on the design of such a system, which has yielded interesting results.

“As we all know, it’s not easy to improve on the geophone design,” he said. “It’s a remarkably simple yet very accurate and robust instrument. But we do see that several MEMS solutions have now indeed become superior to what you probably can get with an upgraded analog geophone system, even digital geophone systems. It will start to turn seismic measurements into a commodity.”

While geophysical contractors may wince at these words, Smit argued that additional uses of seismic in the future such as more monitoring of production processes will require more acquisition, both per square kilometer and per unit of time. Hence, the pie will be much larger, but the systems will need to acquire surveys “at a significantly lower cost base,” he said. “And I don’t think any of the technology we have today is able to deliver that except for MEMS technology. It takes a very clean and accurate measurement, in particular at low frequencies, at an ultra-low cost.”

Ultimately, Smit noted, more sophisticated techniques may eventually revolutionize land geophysical acquisition. “I think that subsurface characterization or exploration will be more driven by more remote-type sensing technologies,” he said. “Perhaps this could be combining more refined measurements of surface expressions affected by climate, biology or geology with probing technologies that can be deployed airborne or on the ground.” These could include techniques like gravity and magnetics that probe the subsurface in addition to surface-based measurements, he added.

“I think that the required sensor technologies will become available because of the use of mobile phones,” he said. “We might be able to monitor subsurface effects by simply (appropriately) using the smartphones of people since they contain all kinds of measurement technologies.”

These developments will be particularly useful in the unconventional arena, he added. He sees the key to global exploitation of unconventional resources as the ability to characterize the subsurface to determine producibility with accurate yet low-cost geosciences technology.

Marine acquisition

Marine seismic also can benefit from the use of different acquisition philosophies. Saudi Aramco, along with CGG and Seabed Geosolutions, has introduced the concept of “RoboNodes.” These autonomous marine nodes can be programmed to move underwater and are controlled by an acoustic system. Their use could address one of the major issues with seabed nodes: cost.

The RoboNodes are still under development, but Saunders said that even more traditional forms of seabed acquisition are likely to come down in cost. “There have been some vendor partnerships and alliances recently announced that are going to work together to increase the coverage of nodal data on the [Gulf of Mexico (GoM)] shelf,” he said. “When they get more participants, the price will go down. Just like any new technology, they’re going to achieve greater efficiencies.”

Another relative newcomer on the scene is the Wave Glider developed by Liquid Robotics, which is now a Schlumberger company. These hybrid wave- and solar-powered ocean “robots” are designed to cover vast stretches of ocean without human intervention and can be equipped with a variety of sensors to collect weather, current or even seismic data. In fact, Wave Gliders already have been used to acquire seismic data in the GoM, a first for the technology.

Advantages are numerous. These systems can be controlled or pre-programmed to follow a prescribed path and are able to withstand Sea State 8 conditions, which involve waves of up to 14 m (46 ft). They are able to operate in fringe conditions that include low solar or wind environments, and they come with an auxiliary electrical thruster. They can house more than 12 installed payload sensors using up to 24 high-performance payload computers, and they offer plug-and-play payload capability as well as a “data center at sea” that processes large volumes of data and transmits them in real time. Information from Liquid Robotics’ website indicates that Wave Gliders could reduce the cost of a marine seismic survey by as much as 90%.

Tubman said that ConocoPhillips has recently deployed Wave Gliders in experiments to test their utility as data gatherers. “Do we know exactly what we’ll do with them?” he asked. “Maybe not, but for exploration my vision is that we put nodes on them for seismic, and they go off on their own. All of a sudden we have large, randomly distributed quantities of sensors that go in and around facilities in all kinds of ways that we can’t get with our large ships at the moment.”

Other recent advances in marine seismic acquisition are more commercial. For instance, BP tested the concept of wide-azimuth (WAZ) seismic eight years ago, and the concept has caught on quickly since then. WAZ involves shooting a survey from multiple directions to provide better, more complete illumination of complex subsurface geometry.

Tubman was working for Veritas at the time that BP proposed the survey. “I give BP credit,” he said. “Together we worked it out, and it made a big difference.”

Saunders and his company were involved in the first WAZ survey on the GoM shelf, partnering in the Main Pass area with operator Apache Corp. and Fieldwood. He said that Fairfield Nodal shot an ocean-bottom nodal survey over about 100 blocks.

“We’re in the processing stage now,” he said. “We’ll be able to get a first look at some of that data soon. Early indications are that it will be a step-change in data quality. It’s exciting to not just read about it but to be a participant.”

Original WAZ configurations have been tweaked over the years to include full azimuth, rich azimuth, multi-azimuth and coil-shooting full azimuth, the latter developed by WesternGeco. The coil-shooting technique allows the recording vessel to record continuously by traveling in circles rather than straight lines, according to Schlumberger’s website. The methodology offers advantages over traditional WAZ surveys because it requires only one vessel.

More recently the concept of broadband acquisition has come on the scene. Useful in both land and marine environments, broadband seismic records the full range of frequencies in a survey, according to CGG’s website. “High-fidelity, low-frequency data provide deeper penetration for the clear imaging of deep targets as well as providing greater stability in inversion,” the website notes. “Inversion of broadband data has been shown to provide better well ties, better correlation to geology and better resolution than inversion of conventional data.”

Smit said that Shell started looking into broadband (low-frequency) acquisition in 2004 or so, developing a research consortium with PGS to improve processing. It was not the first of its kind—several universities had already started evaluating the possibility of incorporating lower frequencies.

“But as soon as an oil major is involved in the early derisking or development of these technologies, a lot more credibility and robustness is derived from that,” Smit said.

Processing

All of the great acquisition technology in the world is of little use if computers can’t process these huge amounts of data in a timely and useful fashion. Luckily for explorationists, computers are finally up to the task.

“A lot of this theory was done in the ’50s,” Tubman said. “It’s only recently that the compute power has caught up.”

One by one these theories have turned into commercially available algorithms, from amplitude vs. offset and prestack depth migration in the ’90s to reverse time migration (RTM) and, very recently, full waveform inversion (FWI). According to Schlumberger’s website, FWI uses a two-way wave equation to produce high-resolution velocity models. “It performs forward modeling to compute the differences between the acquired seismic and the current model as well as a process similar to [RTM] of the residual dataset to compute a gradient volume and to update the velocity model,” the website notes.

Energy XXI is analyzing the benefits of RTM and FWI, Saunders commented, and he explained how the FWI process works. “All prestack depth migration algorithms are based on mathematical approximations of the solution of the wave equation, which describes seismic wave propagation from the surface down to target horizons and then back up to the surface where they are recorded,” he said. “In the past, due to limitations in computer hardware, we were mainly using approximations of the wave equation called ray tracing.” With recent advancements in both compute power and seismic data quality, Saunders said, the industry has been switching to algorithms that are directly using the wave equation rather than its approximations. “This started by using RTM that is based on subsurface propagation and correlation of full wavefronts,” he added. “We are now progressing to use RTM as the basis for velocity estimation using FWI. The process is based on minimizing the difference between the full recorded wavefield and the simulated wavefield generated for each trial model. By doing that, we construct much more accurate velocity models, necessary for reliable imaging of our exploration objectives.”
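The loop below is a deliberately tiny illustration of the principle Saunders describes: simulate a wavefield through a trial velocity model, measure its difference from the “recorded” data and nudge the model to shrink that difference. Everything here, from the 1-D finite-difference propagator to the two-parameter velocity model and the numerical gradient, is a simplifying assumption made for readability; production FWI works in three dimensions and computes the gradient with an adjoint (RTM-like) step rather than by finite differences.

```python
# Minimal 1-D sketch of the FWI idea: fit a velocity model by minimizing the
# misfit between recorded and simulated traces. Toy physics, not a production code.
import numpy as np

def simulate(vel, nt=1000, dt=0.001, dz=10.0, src_f=15.0):
    """Toy 1-D acoustic finite-difference propagator; returns the near-surface trace."""
    nz = len(vel)
    p_prev, p = np.zeros(nz), np.zeros(nz)
    trace = np.zeros(nt)
    t = np.arange(nt) * dt
    tau = np.pi * src_f * (t - 1.0 / src_f)        # Ricker wavelet source
    src = (1.0 - 2.0 * tau**2) * np.exp(-tau**2)
    c2 = (vel * dt / dz) ** 2                       # CFL-stable for these defaults
    for it in range(nt):
        lap = np.zeros(nz)
        lap[1:-1] = p[2:] - 2.0 * p[1:-1] + p[:-2]
        p_next = 2.0 * p - p_prev + c2 * lap
        p_next[1] += src[it]                        # source just below the surface
        p_next[0] = 0.0                             # pressure-free surface
        p_prev, p = p, p_next
        trace[it] = p[1]                            # receiver co-located with source
    return trace

def misfit(vel, observed):
    r = simulate(vel) - observed
    return 0.5 * float(r @ r)

def build(params, nz=150, step_at=50):
    """Two-parameter earth model: velocity above and below one interface."""
    v = np.full(nz, params[0])
    v[step_at:] = params[1]
    return v

# "Recorded" data from the true earth: a velocity step the inversion should recover
observed = simulate(build(np.array([2000.0, 2500.0])))

# Start from a wrong guess and descend along a numerical gradient of the misfit
params = np.array([2100.0, 2300.0])
for _ in range(40):
    grad = np.zeros(2)
    for i in range(2):
        d = np.zeros(2); d[i] = 1.0                # 1 m/s perturbation
        grad[i] = (misfit(build(params + d), observed)
                   - misfit(build(params - d), observed)) / 2.0
    params -= 25.0 * grad / (np.linalg.norm(grad) + 1e-12)

print(params)   # should head toward (2000, 2500)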

Smit added that processing algorithms like FWI have come along just in time for acquisition techniques like broadband and WAZ. “When you see the types of data you can access through wide azimuth or broadband or both, you realize that techniques like waveform inversion may be very beneficial,” he said. “Before that, a lot of the waveform inversion techniques may have been contemplated in academics but never really had an impact on the industry simply because the seismic spectrum wasn’t available to stabilize a lot of the inversion that is hidden in the full-waveform inversion technique.

“You could argue that the uptake of these imaging algorithms is always following advances made in seismic data quality from the acquisition systems.”

Tubman added that many acquisition breakthroughs have resulted in more data to sift through. “Because we have the opportunity to collect so much data, we can’t actually look at them,” he said. “We talk about going from thousands of channels to tens of thousands of channels. How are we going to deal with that? It may be that some of these algorithms let us cross things out that we’re just not sure about yet.”

Interpretation

Seismic processing has increasingly involved the use of human intuition; seismic interpretation has relied on it all along. But there are techniques afoot that attempt to replace the human brain with computer science to arrive at better results in a more efficient fashion.

Geophysical Insights has recently introduced Paradise, a geoscience analysis platform that uses pattern recognition methods such as self-organizing maps (SOMs) and principal component analysis (PCA). These techniques, according to the company, allow interpreters to scan large volumes to reveal anomalies, discriminate the presence of hydrocarbons and direct hydrocarbon indicators, highlight geologic and stratigraphic features, and identify changes in pore pressure.

“I think this kind of technology has the potential to really make some great gains,” said Saunders. “I like the concept because it’s a nonbiased application, which means it doesn’t have to depend on an interpreter giving it certain parameters to run the neural network routine. So the results aren’t biased by well information.

“But it’s still in its infancy, so I think maybe once the methods are shown and start to deliver reliable and repeatable results, we might see things being used like that more commonly.”

The current scalable, client-server architecture enables independent or collaborative workflows, according to Geophysical Insights. It guides interpreters in the application of SOMs and PCA.
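For readers who want to see the mechanics, the sketch below strings together the two methods named above: PCA to compress a set of seismic attributes, then a small self-organizing map to cluster the samples into natural groups. The synthetic “attributes,” the grid size and the decay schedules are illustrative assumptions and bear no relation to the Paradise implementation.

```python
# Minimal PCA-then-SOM sketch on synthetic seismic attributes.
import numpy as np

rng = np.random.default_rng(0)

# Fake attribute volume: 5,000 samples x 6 attributes (e.g. amplitude,
# envelope, frequency ...) drawn from two hidden "facies" clusters.
a = rng.normal(0.0, 1.0, (2500, 6)) + 3.0
b = rng.normal(0.0, 1.0, (2500, 6)) - 3.0
X = np.vstack([a, b])

# --- PCA: keep the components that explain most of the variance ---
Xc = X - X.mean(axis=0)
_, s, Vt = np.linalg.svd(Xc, full_matrices=False)
n_keep = 2
Z = Xc @ Vt[:n_keep].T               # samples projected onto top components

# --- SOM: a small grid of prototype vectors trained competitively ---
grid = 4                              # 4x4 map of neurons
W = rng.normal(0.0, 1.0, (grid * grid, n_keep))
rows, cols = np.divmod(np.arange(grid * grid), grid)

n_iter = 20000
for t in range(n_iter):
    x = Z[rng.integers(len(Z))]
    bmu = np.argmin(((W - x) ** 2).sum(axis=1))     # best-matching unit
    # neighborhood radius and learning rate shrink over training
    sigma = 2.0 * (0.05 / 2.0) ** (t / n_iter)
    lr = 0.5 * (0.01 / 0.5) ** (t / n_iter)
    d2 = (rows - rows[bmu]) ** 2 + (cols - cols[bmu]) ** 2
    h = np.exp(-d2 / (2 * sigma ** 2))
    W += lr * h[:, None] * (x - W)

# Each sample is labeled by its nearest neuron; nearby labels on the
# 4x4 grid correspond to similar attribute signatures ("facies").
labels = np.argmin(((Z[:, None, :] - W[None]) ** 2).sum(-1), axis=1)
print(np.bincount(labels, minlength=grid * grid))
```

Because the interpreter never supplies class labels, the map organizes itself around whatever structure the attributes contain, which is the sense in which Saunders calls the approach nonbiased.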

An older but still highly useful technology is visualization, which is intended to immerse interpreters in their data, enabling them to explore those data in a more intuitive and collaborative way.

More recent advances in visualization technology include holography, a technique that more accurately captures the three-dimensional character of an image. Saunders thinks the use of holography in exploration will continue to expand.

“Within 20 years I think holography and holographic imaging could very well be commonplace,” he said. “I think that’s certainly one that could go from science fiction to everyday use.”

Another newcomer on the scene is quantitative interpretation (QI). According to DownUnder Geosolutions’ website, QI uses amplitude analysis to predict lithology and fluid content between wellbores. “This process should make use of all available data, assist in risk assessment, account for uncertainty and ultimately foster confidence in the predictions,” the site notes. The process relies on seismic inversion and rock physics analysis to quantify reservoir conditions.
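One of the simplest building blocks of such an inversion is the classical recursive recovery of acoustic impedance from normal-incidence reflectivity, sketched below on a synthetic log. The three-layer model and the impedance cutoff standing in for the rock-physics step are illustrative assumptions, not DownUnder Geosolutions’ workflow.

```python
# Toy QI building block: recursive acoustic impedance inversion.
import numpy as np

# Synthetic impedance log: shale over a gas sand (impedance drop) over shale
true_z = np.concatenate([np.full(40, 6.0e6),
                         np.full(20, 4.5e6),
                         np.full(40, 7.0e6)])

# Forward model: normal-incidence reflection coefficient at each interface
refl = (true_z[1:] - true_z[:-1]) / (true_z[1:] + true_z[:-1])

# Recursive inversion: Z_{i+1} = Z_i * (1 + r_i) / (1 - r_i),
# starting from a known impedance at the top (e.g. from a well tie)
z = np.empty_like(true_z)
z[0] = true_z[0]
for i, r in enumerate(refl):
    z[i + 1] = z[i] * (1 + r) / (1 - r)

# Rock-physics step (crudest possible): threshold impedance to flag pay
low_cutoff = 5.0e6   # assumed sand/shale impedance cutoff
print("possible pay samples:", np.flatnonzero(z < low_cutoff))
```

Real QI workflows start from band-limited, noisy seismic rather than exact reflectivity and lean on wells and rock physics models to constrain the result, which is why the uncertainty accounting the website mentions matters.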

While quantification is good, there is still something to be said for good old-fashioned human interaction. But even that may eventually become more automated than is practicable today.

“I think the real breakthrough is continued pushes on integration,” Tubman said. “Integration is a holy grail. We’re getting better at it, but we’re not there yet.

“Part of the way we’ve made big advances in imaging in the past is by combining the geologic model and driving the imaging with that. But there are other opportunities for integration—for instance, to have a model that will cycle back as we monitor the reservoirs.”

Added Smit, “I think that the expert effort is not necessary at every step of the process. But it can be done with other people and only at a few points really needs expert knowledge and insight.

“On the other hand, this could be accelerated so that the decision-making can be sped up rather than slowed down and so that we can make decisions that we can only dream of today.”