The recently concluded SEG Annual Meeting provided some insight into the cutting edge of geotechnology. A distinguished panel addressed new developments in the exploration of inner space.

A real "mover and shaker," Dr. Mark Zoback of Stanford University, reviewed the multifaceted EarthScope Project and the Integrated Ocean Drilling Program. He then quickly focused on the project that has consumed the last 13 years of his life - studying and analyzing the tectonic stresses that relieve themselves occasionally in the form of temblors. Understandably, his focus has been on California's infamous San Andreas Fault, which, he revealed, shifts about 2 in. per year. This may not sound like much, and it wouldn't be if the shift were gradual and continuous, but as we all know too well, the fault shifts violently and unpredictably, wreaking havoc among California coastal communities.
Zoback heads the San Andreas Fault Observatory at Depth (SAFOD), a unique phased project designed to examine the fault closely, determine how it is stressed and, more importantly, attempt to predict when it will fail.
A lot is hanging on SAFOD's ability to develop a reliable way to predict quakes, but the task demands a methodical approach. Starting years ago with near-surface studies using an array of seismometers, Zoback's team has progressed to a phased examination of the fault itself through instrumented, cased and cemented well bores. Just this summer, the fault was pierced by a deviated well drilled from a location 1.1 miles (1.8 km) west of the fault near Coalinga. For the first time, a borehole penetrated a section that exhibits repeatable micro-earthquakes with lateral displacement of about 1 cm and a period of about 2 years. Coring has been completed through the 66-ft (20-m) wide fault zone and the 820-ft (250-m) wide associated damage zone. Armed with insight derived from these measurements and analyses, the team hopes, during the third phase of the project, to drill to the source of the stress, thought to be about 9,900 ft (3 km) deep.
Taking a philosophical approach, veteran geoscientist Sven Treitel offered his perspective on inversion, which he defined as using data to create the geophysical model that would yield those data. Not quite the chicken-and-egg dilemma, but a bumpy road ahead nevertheless, he predicted. Treitel provided a list of problems that complicate inversion techniques, not the least of which is the fact that a human interpreter is involved. He noted that structural inversion is equivalent to migration.
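Treitel's definition can be made concrete with a toy example. The minimal sketch below (an illustration of the general idea, not anything presented in the talk, with entirely hypothetical numbers) uses a least-squares fit to recover a layered slowness model from synthetic travel times - that is, it searches for the model that would yield the observed data.

```python
# Toy inversion in Treitel's sense: find the model that would yield the data.
# All numbers are hypothetical; the forward model is straight-ray travel times.
import numpy as np

# Forward operator: rows are ray paths, columns are layers,
# entries are the path length (m) of each ray in each layer.
G = np.array([[500.0, 300.0,   0.0],
              [400.0, 400.0, 200.0],
              [250.0, 350.0, 450.0]])

true_slowness = np.array([1/1500.0, 1/2500.0, 1/3500.0])        # s/m
data = G @ true_slowness                                         # synthetic travel times
data += np.random.default_rng(0).normal(0.0, 1e-4, data.shape)   # add noise

# Inversion: least-squares estimate of the slowness model from the data.
model, *_ = np.linalg.lstsq(G, data, rcond=None)
print("estimated layer velocities (m/s):", np.round(1.0 / model))
```

Real geophysical inversion replaces this tidy linear operator with expensive, nonlinear and non-unique forward models, which is where the complications on Treitel's list begin.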
A fundamental question posed by Treitel is, "Can mathematics ever adequately describe nature?" He thinks not. What is needed, he says, is the right physics. The right physics may include several things, but at its heart is resolving uncertainty - a point loudly seconded by an ad hoc panel of 10 geoscientists queried by Treitel. Turning to the electromagnetic (EM) issue, for example, he asked, "How do you weight the influence of seismic and EM data when you integrate them into a model?" No one knows.
Treitel's Top 10 wish list to attack uncertainty begins with four biggies: much faster forward problem solvers, better solution sensitivity tests, better reality checks and constrained inversion. "The real problem," he said, "is that Mother Nature doesn't read our papers!"
ExxonMobil's Dr. Leonard Srnka, a proponent of controlled-source EM surveying, made a strong case for the technique. While acknowledging that there is room for improvement in many aspects of EM surveying, Srnka said that the fundamentals of the physics are sound and are far from "newfangled ideas," having been pioneered by the Schlumberger brothers as early as 1934. Now enabled by modern acquisition technology and advanced computing capability, a total of 160 surveys have been run, 37 of them by ExxonMobil over a 3-year period. Srnka positioned EM surveying in the frequency spectrum between surface seismic and natural magnetotellurics. Hydrocarbons are indicated when the amplitude variation with offset deviates from the predictable exponential decay curve seen when no hydrocarbons are present, he said.
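Srnka's detection criterion lends itself to a simple numerical sketch. The example below uses entirely hypothetical numbers (not ExxonMobil data or workflow): a background exponential decay is fitted to the near offsets, and far offsets whose amplitudes sit well above the extrapolated trend are flagged as potential hydrocarbon indicators.

```python
# Sketch of Srnka's criterion: flag offsets where EM amplitude departs from
# the exponential decay expected with no hydrocarbons. Hypothetical numbers.
import numpy as np

offsets = np.linspace(1000.0, 10000.0, 10)       # source-receiver offsets (m)
background = 1e-8 * np.exp(-offsets / 2500.0)    # assumed no-hydrocarbon decay

measured = background.copy()
measured[6:] *= 3.0       # far-offset boost, as a resistive layer might cause

# Fit the exponential trend (linear in log amplitude) to the near offsets,
# then compare every offset against the extrapolated background curve.
slope, intercept = np.polyfit(offsets[:5], np.log(measured[:5]), 1)
predicted = np.exp(intercept + slope * offsets)

for x, ratio in zip(offsets, measured / predicted):
    flag = "  <-- anomalous" if ratio > 1.5 else ""
    print(f"offset {x:7.0f} m   measured/background = {ratio:4.1f}{flag}")
```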
Topping Srnka's wish list as EM surveying gains traction in the international marketplace is seamless integration with seismic and log data to create synergy. Published case histories are needed to build an EM database. Acquisition costs must come down, and interpreters must improve the usefulness and accuracy of the data. Srnka maintains that controlled-source EM surveying is applicable anywhere, but he recognizes some problems that currently limit its abilities in carbonates and in the transition zone. These problems are resolvable, he said.
Looking ahead, Srnka envisages that the industry will be able to deliver low-cost reconnaissance EM surveys, acquisition simultaneous with surface seismic, combination with wellbore seismic and 4-D EM solutions using permanent monitors. Processing will be facilitated by full-tensor data, using seismic-like techniques and allowing full integration with formation models. On the business side, the service is expected to provide identification of opportunities and hazards, intelligent field development using 4-D techniques and hydrocarbon characterization.
Self-described prospector Mike Forrest of Rose & Associates postulated that risk analysis is the vehicle that will finally smash the barriers that have existed for years between geologists and geophysicists. Risk analysis - deciding whether or not to drill - is in fact a form of integration, he said, because both disciplines share this common goal. Forrest described the desirable case, sustained flowable hydrocarbons, whose probability he labeled Pg (in probability terms, a P99 case). He then showed how geoscientists can modify their initial estimate of Pg by incorporating rock physics, seismic and expert advice in a software program to derive a "revised Pg" that can be introduced into a feedback loop to further refine the case. Seismic and rock physics data quality is key to amplitude analysis, he said.
Forrest presented statistical data from 118 wells to illustrate his point. Amazingly, the data he presented indicated that if Pg is 35% or greater, the well is usually a success. Looking at it another way, Forrest defined ΔPg as the difference between the initial Pg and the revised Pg, and stated that any ΔPg over 20% indicated a successful well. Of the 188 wells surveyed, 58 were dusters: 42% of these were wet sands, 24% contained low-saturation gas (indicating a lack of seal integrity), 26% had no reservoir at all and 5% were dry for miscellaneous reasons. In performing the analysis, Forrest said the biggest mistakes were due to people using non-representative datasets for calibration and failing to integrate geological and geophysical data.
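The arithmetic behind these screening rules is straightforward. The sketch below applies the two thresholds Forrest cited to a few hypothetical prospects, invented purely for illustration and not drawn from his dataset.

```python
# Forrest's screening arithmetic applied to hypothetical prospects:
# a well looks favorable if revised Pg >= 35%, or if the revision itself
# (delta Pg = revised Pg - initial Pg) exceeds 20%.
prospects = [
    {"name": "A", "initial_pg": 0.20, "revised_pg": 0.45},
    {"name": "B", "initial_pg": 0.30, "revised_pg": 0.32},
    {"name": "C", "initial_pg": 0.10, "revised_pg": 0.38},
]

for p in prospects:
    delta_pg = p["revised_pg"] - p["initial_pg"]
    favorable = p["revised_pg"] >= 0.35 or delta_pg > 0.20
    print(f"Prospect {p['name']}: revised Pg = {p['revised_pg']:.0%}, "
          f"delta Pg = {delta_pg:+.0%}, favorable = {favorable}")
```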
Colorado School of Mines professor Roel Snieder presented an example from Eugene Island Block 330 that illustrated fault imaging. He postulated that geoscientists can create an image using a virtual source - seismic interferometry. To illustrate, he showed that using quasar signals received by ground stations, together with very long baseline interferometry, it can be proved that the United States and Germany are actually drifting apart, albeit slowly. He invited the audience to visit www.lupus.gcfc.nasa.gov/brochure/bintro.html to learn more.
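The central trick of seismic interferometry - turning one receiver into a virtual source for another by cross-correlation - can be sketched in a few lines. The synthetic traces and delays below are invented for illustration and have nothing to do with the Eugene Island data.

```python
# Bare-bones interferometry: cross-correlating the records at two receivers
# yields a trace whose peak lag is the travel time between them, as though
# receiver A had fired as a (virtual) source. Synthetic, hypothetical traces.
import numpy as np

rng = np.random.default_rng(42)
dt, n = 0.004, 2000                      # sample interval (s), trace length
source = rng.normal(size=n)              # noise-like source wavefield

lag_a, lag_b = 50, 125                   # arrival delays (samples) at A and B
rec_a = np.zeros(n); rec_a[lag_a:] = source[:n - lag_a]
rec_b = np.zeros(n); rec_b[lag_b:] = source[:n - lag_b]

# The cross-correlation peak approximates the A-to-B travel time.
xcorr = np.correlate(rec_b, rec_a, mode="full")
peak_lag = int(np.argmax(xcorr)) - (n - 1)

print(f"recovered travel time A->B: {peak_lag * dt:.3f} s "
      f"(expected {(lag_b - lag_a) * dt:.3f} s)")
```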
BP's Brian Hornby took vertical seismic profiles (VSPs) beyond time-to-depth conversion when he presented 3-D VSP imaging and reservoir monitoring. He showed numerous examples in which seismic and VSPs were used to image the flanks of salt diapirs and identify localized associated fault patterns. Hornby described how one company, relying on a trial-and-error approach with surface seismic alone, drilled an unnecessary well that could have been avoided had the 3-D VSP technique been used. Following the theme of Snieder's presentation, Hornby illustrated direct imaging of salt flanks using interferometry.
Revealing a glimpse of the road ahead, Hornby said that seismic single-well imaging is in the offing - although it requires both a downhole source and an in-well geophone array. Currently, the problem can be addressed using walkaway VSPs, but these can be a source of errors. Again, interferometry can be used to re-datum the sources to the receiver array. He presented an example of a single-well image obtained without a downhole source, using VSP data to calculate the distance to the salt flank. More computationally expensive algorithms, combined with increased computer power and speed, may chart the future, he said.
A technique that bears further study is the migration of multiples, Hornby said. All too often, multiples are overlooked or discarded in standard processing. However, VSP multiples have been shown to image much better than surface seismic in many cases and can be used to effectively fill in gaps.
Finally, the implementation of permanent in-well, three-component fiber-optic seismic arrays in the Valhall field was announced for Q4 2005, along with a similar new project under the sponsorship of DEMO 2000. The objective is to fully understand the benefits and limitations of the technique in time-lapse seismic and micro-seismic monitoring. Within 5 years, we should see very large (60+ sensor) downhole arrays used in reservoir monitoring, Hornby said. And in 10 years, virtual cross-well imaging should be a reality, he predicted.