If you think magazine editors spend all of their time at trade shows being wined and dined by potential sources, you’re only partially correct. We also work exceedingly hard trying to sniff out the newest and coolest of all of the technologies being displayed.

Two recent shows — the Offshore Technology Conference (OTC) and the European Association of Geoscientists and Engineers (EAGE) — proved to be extremely useful in this pursuit. While OTC doesn’t have a lot of exploration-related technology on the show floor, there was a technical session devoted to permanent monitoring that had an interesting surprise — four of the seven presentations focused on fiber-optic technology.

This is of interest to me because I’ve been following the development of fiber optics in the oilpatch fairly closely. Fiber-optic pressure and temperature gauges are now fairly routine, and a system that measures distributed temperature is available as well (see our September issue for more information). But the technology’s use in seismic has been hampered by signal-to-noise issues and by the need to multiplex large numbers of sensors onto a small number of fibers.

StatoilHydro gave two presentations, one on a fiber-optic cable system it has developed for permanent monitoring and another, a case study of the Snorre field, on the decision-making process for choosing the best monitoring system for the field. The systems considered were towed streamers, steerable streamers, redeployable ocean-bottom seismic (OBS) systems, and permanent electrical or fiber-optic seismic systems.

The primary needs of the license holders were to optimize drilling locations and improve their understanding of drainage. The research and development group wanted to maximize repeatability to see small 4-D changes and also to have a robust system with low health, safety, and environment costs.
The permanent fiber-optic OBS system was chosen. A 6-mile (10-km) cable was tied to the platform, and a Linux system was installed in Trondheim, where the data were received within 30 seconds of the shots being fired.

PGS has also developed a permanent fiber-optic system and is installing it on the Ekofisk platform in the North Sea. The sensors can be deployed in very deep water and maintain a stable response across varying pressure and temperature conditions. Stingray Geophysical, meanwhile, discussed the optimal architecture for fiber-optic sensors to overcome size limitations and noise issues.

At EAGE most of the news revolved around software. Landmark gave a one-year update on its project with StatoilHydro to develop a next-generation interpretation system. Launched at last year’s EAGE, the program is expected to take three years and cost US $13 million, the largest such partnership in Landmark’s history. The goal is an intuitive basin-scale simulation tool.

The tool will be all-inclusive and will scale from the basin to the prospect level while truly integrating available data, including satellite images, bathymetry, geological surveys, etc.

According to Kenny Laughlin, senior product manager for geological and geophysical technologies, the move from paper maps to computers in the 1980s bypassed the point in the interpretation process that brings in the interpreter’s geological knowledge. “We’ve lost the creative ability to interpret in between the hard data,” he said. The tool allows the user to zoom out as far as possible to understand the geological processes at work.

Another software development that’s just getting underway is happening at FFA. The company has been focusing on volume processing and computing 3-D seismic attributes, deriving geobodies by imaging faults and structural elements. Recently it teamed with Hewlett-Packard and NVIDIA to take advantage of the new compute power available to the industry through NVIDIA’s graphics processing units (GPUs), which can be used to process seismic data.

“The computer power available to the industry has taken a step forward,” said Stephen Purves, technical director for FFA. “When we’re computing on a workstation with NVIDIA graphics, we can do a lot more processing on the desktop.”

The company plans to parallelize its volume processing engine and techniques on the new hardware, which will allow rapid seismic data analysis.

Finally, EMGS has launched Clearplay, which the company bills as the first fully integrated electromagnetic (EM) system, providing seamless end-to-end EM services and products. The ultimate goal is to improve exploration efficiency and performance while reducing finding costs and risks.

I’ll fill you in on the dinners next month.