In a highly competitive industry like oil and gas, secrets are often held close to the vest. This might give one company an advantage over another. But it doesn’t do much for technology advancement.

In response to the need for increased collaboration, the Society of Exploration Geophysicists launched the SEG Advanced Modeling (SEAM) project a few years ago to provide a cooperative research environment to help solve some of those thorny problems that no single company is likely to overcome. “It’s set up to provide challenges to the industry through actual seismic and other geophysical data,” said William Abriel, a geophysical adviser for Chevron and vice chair for SEAM. “The concept is to see where the industry wants to make progress and to provide a safe place for people who, if trying to do this on their own, would find it difficult, time-consuming or expensive.”

Already SEAM has completed several projects. The first, Phase I, was a subsalt model covering a 60-block area of the deepwater Gulf of Mexico. The model has been constructed in a form that enables extension to other complex environments.

Phase II involves three different models: unconventionals, near surface and foothills overthrust. Abriel said the first model has been built and distributed, while the other two are still under simulation. The full project is expected to be finished by year-end 2015.

Next up

Two new projects are now underway, both of which promise to help the industry better understand difficult problems. The first, called pressure prediction, is funded through a Research Partnership to Secure Energy for America (RPSEA) alliance. “It’s for the deepwater Gulf of Mexico,” Abriel said. “The concept is to understand the approaches to working with pressure prediction and the different mechanisms for pressure. The geophysical simulations will help us to appreciate how to predict pressure using different techniques.”

These mechanisms include compaction disequilibrium, the centroid mechanism and chemical changes that generate pressure, such as the smectite-to-illite transformation and hydrocarbon cracking, he said.
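The article does not spell out how the geophysical simulations will be used, but as a rough illustration of the velocity-based end of such a workflow, the sketch below combines Terzaghi's effective-stress relation with Eaton's velocity-ratio method, which mainly captures the compaction-disequilibrium response. The function name, gradients and exponent are illustrative assumptions, not part of the SEAM project.

```python
# Minimal sketch (not SEAM's method): estimating pore pressure from
# seismic interval velocity with Terzaghi's effective-stress relation
# and Eaton's velocity-ratio method. All names and numbers are
# illustrative assumptions.

def eaton_pore_pressure(depth_m, v_obs, v_normal,
                        overburden_grad=22.5,   # kPa/m, assumed
                        hydrostatic_grad=10.0,  # kPa/m, assumed
                        exponent=3.0):          # Eaton exponent, assumed
    """Return pore pressure (kPa) at one depth from observed vs. normal-trend velocity."""
    s_v = overburden_grad * depth_m          # total vertical stress
    p_hydro = hydrostatic_grad * depth_m     # normal (hydrostatic) pressure
    # Eaton: slower-than-normal velocity implies undercompaction and overpressure
    sigma_eff = (s_v - p_hydro) * (v_obs / v_normal) ** exponent
    return s_v - sigma_eff                   # Terzaghi: Pp = Sv - sigma_eff

# Example: at 3,000 m, velocity 10% below the normal-compaction trend
print(eaton_pore_pressure(3000.0, v_obs=2700.0, v_normal=3000.0))
```

Mechanisms such as the centroid effect or chemical transformations would need different handling, which is part of what the benchmark models are meant to expose.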

The plan is to reengineer the salt model from Phase I, although Abriel said this will be up to the participants. He added that it is difficult to resolve pressure effects with geophysical data alone.

“That’s one of the advantages of having known answers like the model information,” he said. “It helps so much to measure what the impact of geophysics is on those pressure measurements.”

The second project will look at life-of-field seismic, targeting not the exploratory practices but the production and development activities in the oil and gas industry. Abriel explained that no “case history” exists that would allow people to test their theories about reservoir dynamics. “In real life, you don’t actually know what’s going on between any two wells,” he said. “There is no benchmark.”

The goal is to provide that benchmark. Participants will provide input into which types of reservoirs should be studied, and these will be put into context in “a box of exciting geology” with a structural and stratigraphic framework.

Once the reservoirs are ensconced in their geology boxes, production scenarios can be tried—depletion, water injection, well trajectories, gas cap expansion. This will enable operators to study potential effects of different scenarios on different types of reservoirs.

The idea, Abriel said, is to combine forward geological modeling with reservoir simulation, two kinds of models that typically operate at very different scales. Seismic simulation will attempt to bridge that gap and also bring in geomechanics.
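The article does not describe how that bridge will be built, but the toy sketch below shows the general shape of such a link: coarse simulator properties are mapped to elastic properties with a simple rock-physics relation and then forward-modeled into a synthetic seismic trace. The relations, coefficients and layer model are assumptions for illustration only, and geomechanics is left out.

```python
# Minimal sketch of bridging reservoir-simulator properties and seismic:
# simulator cells -> toy rock physics -> impedance -> reflectivity -> trace.
# All relations and numbers are illustrative assumptions.
import numpy as np

def ricker(freq_hz, dt_s, length_s=0.128):
    """Ricker wavelet used as the seismic source signature."""
    t = np.arange(-length_s / 2, length_s / 2, dt_s)
    a = (np.pi * freq_hz * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def synthetic_trace(porosity, water_sat, dt_s=0.002, freq_hz=30.0):
    """Layered simulator properties to a normal-incidence synthetic trace."""
    # Toy rock physics: velocity drops with porosity and rises with water
    # saturation; density drops with porosity (assumed coefficients).
    vp = 3500.0 - 4000.0 * porosity + 300.0 * water_sat      # m/s
    rho = 2650.0 * (1.0 - porosity) + 1000.0 * porosity      # kg/m^3
    impedance = vp * rho
    # Normal-incidence reflectivity at each layer boundary
    refl = np.diff(impedance) / (impedance[1:] + impedance[:-1])
    return np.convolve(refl, ricker(freq_hz, dt_s), mode="same")

# Example: a thin porous, gas-bearing interval inside a tight background
porosity = np.array([0.05] * 40 + [0.25] * 10 + [0.05] * 40)
water_sat = np.array([1.0] * 40 + [0.3] * 10 + [1.0] * 40)
trace = synthetic_trace(porosity, water_sat)
print(trace.shape, trace.max())
```

In a life-of-field setting the same chain would be rerun after each production step, so that predicted changes in the synthetic seismic can be compared against the reservoir simulation.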

“The point is, we’ll control it because we’re building it from our imaginations using concepts that we already know,” he said. “It won’t look like it’s from Jupiter. It will look like Earth reservoirs that we understand.”

Another aspect of this project will involve integration of data from seismic, gravity, electromagnetics and wells. All of these data can be generated synthetically. This will enable users to enter this “nondynamic cube” to first do exploratory work and find the reservoirs. They can do the structural and stratigraphic mapping, place wells, build their reservoir characterization model and then build the forward model to compare the reservoir behavior to the reservoir simulation.

“If we could predict reservoirs perfectly, we would do this once and then walk away,” he said. “There would be no production management. Given that there’s a gap between what took place and what you guessed, you need to close that gap.”

Abriel added that anyone who wants to illustrate life-of-field activities would be interested in this model. “What a fabulous teaching and training tool!” he said. “It’s a dataset where you already know the right answer, so that’s of some value.”

It also will offer the opportunity to introduce uncertainties into the dataset in a controlled manner. These might include a wellbore that is misplaced by a few meters, seismic data with tidal effects that weren't properly accounted for, or an invasion problem with the well logs.

“These are all uncertainties that you can introduce,” he said. “And the degree to which you can understand, estimate, capture and describe uncertainties with these data becomes another one of the deliverables.”
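As a rough sketch of what introducing such controlled uncertainties could look like in practice, the snippet below perturbs a toy synthetic dataset with a known wellbore offset and an uncorrected tidal time shift. The data structures, error magnitudes and function names are assumptions, not SEAM deliverables.

```python
# Minimal sketch of injecting controlled, documented uncertainties into a
# synthetic dataset; magnitudes and structures are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(seed=42)   # fixed seed: the "right answer" stays known

def misplace_wellbore(xyz_m, sigma_m=3.0):
    """Shift a well trajectory laterally by a few meters and record the offset."""
    offset = rng.normal(0.0, sigma_m, size=2)
    perturbed = xyz_m.copy()
    perturbed[:, :2] += offset
    return perturbed, offset

def add_tidal_static(traces, dt_s=0.002, max_shift_s=0.004):
    """Apply a small, slowly varying time shift to seismic traces (uncorrected tide)."""
    n_traces = traces.shape[0]
    shifts = max_shift_s * np.sin(np.linspace(0.0, np.pi, n_traces))
    shifted = np.array([np.roll(tr, int(round(s / dt_s)))
                        for tr, s in zip(traces, shifts)])
    return shifted, shifts

# Example usage on toy data: a vertical well and a block of random traces
well = np.column_stack([np.full(50, 1000.0), np.full(50, 2000.0),
                        np.linspace(0.0, 2500.0, 50)])
well_err, offset = misplace_wellbore(well)
seis = rng.normal(size=(100, 500))
seis_err, applied_shifts = add_tidal_static(seis)
print(offset, applied_shifts.max())
```

Because the applied errors are recorded alongside the perturbed data, users can later check how well their uncertainty estimates recover what was actually introduced.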

Ultimately, Abriel sees oilfield management in the future consisting of building what is understood of a model and then altering it as companies learn more about their fields. “That level of software integration will enhance our ability to manage reservoirs significantly,” he said.

Contact the author, Rhonda Duey, at rduey@hartenergy.com.