Whenever I’m tasked with putting together a special project on the seismic industry, the outline seems fairly straightforward: acquisition, processing and interpretation.

If Christof Stork is right, I’ll have to add a fourth element to that outline: modeling.

Stork is founder and senior scientist at Tierra Geophysical, a start-up company in Denver, Colo., that specializes in software products that help geoscientists model their seismic surveys: before acquisition, to be sure the information gleaned will justify the cost; after acquisition, to be sure that information is accurate. While this may seem like putting the cart before the horse, the industry has long recognized that such modeling could significantly reduce risk. The problem is that it has been too expensive to be practical for most surveys.

That is changing. “Not only is the process getting much cheaper, but with more realistic modeling, it’s accurate enough that it can reliably address key risk issues about a lot of large decisions,” Stork said.

Here’s how it works. “Once you get to the point of considering a seismic survey, you know the geology you’re looking for,” he said. “This technique is almost the classical scientific process, which we’ve almost never performed before in geophysics. You make a hypothesis, and then you test it.”

The finite-difference model incorporates the hypothesized geology, and the computer simulates a seismic acquisition survey over it. The simulated data is then processed and migrated to produce a sample image, which can be compared with the original hypothesized geology. “The modeling is so realistic that it will include noise such as multiples, ground roll and surface scattering,” Stork said. “You can also include acquisition artifacts such as production platforms that create acquisition holes.

“Being able to see how the geology represents itself in the seismic can help you decide whether or not it’s worth acquiring data, whether to believe the image enough to drill a well, and how to best do reservoir characterization.”
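To make the workflow concrete, here is a minimal sketch of the simulation step Stork describes: a toy 2-D acoustic finite-difference propagation through a hypothesized two-layer geology, recording one synthetic shot at the surface. The grid, velocities, wavelet and receiver layout below are illustrative assumptions for this article, not Tierra Geophysical’s actual software.

```python
import numpy as np

# Toy 2-D acoustic finite-difference simulation of one shot of a survey.
# All parameters are illustrative assumptions.
nx, nz = 200, 200          # grid points in x and depth
dx = 10.0                  # grid spacing, m
dt = 0.001                 # time step, s (satisfies the CFL stability limit)
nt = 1000                  # number of time steps

# Hypothesized geology: a slow background over a deeper, faster layer.
vel = np.full((nz, nx), 2000.0)   # velocity, m/s
vel[100:, :] = 3000.0             # layer boundary at 1 km depth

# Ricker wavelet source near the surface (one "shot").
f0 = 25.0                          # dominant frequency, Hz
t = np.arange(nt) * dt
arg = (np.pi * f0 * (t - 0.04)) ** 2
src = (1.0 - 2.0 * arg) * np.exp(-arg)
sx, sz = nx // 2, 1                # source grid position

# Second-order-in-time, second-order-in-space pressure update.
p_prev = np.zeros((nz, nx))
p_curr = np.zeros((nz, nx))
shot_record = np.zeros((nt, nx))   # "geophones" along the surface

for it in range(nt):
    # 5-point Laplacian; np.roll gives periodic boundaries for brevity,
    # where a real simulator would use absorbing boundaries.
    lap = (np.roll(p_curr, 1, 0) + np.roll(p_curr, -1, 0) +
           np.roll(p_curr, 1, 1) + np.roll(p_curr, -1, 1) -
           4.0 * p_curr) / dx ** 2
    p_next = 2.0 * p_curr - p_prev + (vel * dt) ** 2 * lap
    p_next[sz, sx] += src[it]          # inject the source wavelet
    shot_record[it, :] = p_next[1, :]  # record pressure at the surface
    p_prev, p_curr = p_curr, p_next

# shot_record is one synthetic shot gather. A full simulated survey would
# repeat this for many shot positions, add the realistic noise Stork
# mentions, then process and migrate the synthetic records to produce an
# image for comparison with the hypothesized geology.
```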

This has been done before, most notably by BP, which modeled a wide-azimuth survey at its Mad Dog field before firing a single shot of the expensive, ground-breaking acquisition. The cost was not insignificant. Stork said the company built a multimillion-dollar modeling cluster and spent most of a year simulating the data. The success of the project, Stork said, means that “in a way, BP has done my marketing for me.”

He added that the push to image more subtle targets, combined with new acquisition hardware, makes the time ripe for a modeling revolution. In addition, Stork said he could model the Mad Dog survey at a fraction of BP’s cost.

“My cost for that large survey might be (US) $200,000 or less,” he said. “The bottom line is that people can now do this realistic modeling at reasonable cost. The technology has been around for 30 years, and everyone knows it’s been great. But when the technology finally reaches the point where it’s cheap enough and people find the right use, it explodes. I think that’s going to happen now.”

Already he has sold a system to Shell. “That was a wonderful shot in the arm for me because Shell is very technical, has high standards, and they do most of their work in-house,” he said. “But they did several tests of my software and bought it in a snap.” WesternGeco also has licensed the software, he said.

Oil companies will remain his primary target, however. “I’m trying to communicate the potential of this technology to the big decision-makers in oil companies because they’re the ones that make risky decisions based on imperfect seismic data,” he said. “If somebody is going to acquire a $10 million seismic survey, drill a $50 million well based on seismic data or start a $100 million enhanced recovery program, knowing how much you can believe that data adds a lot of value.”

For more information, visit www.tierrageo.com.