The convergence of next-generation graphics computers with remote sensors and real-time analysis and graphics is the next revolution in business performance and optimization.

Volume visualization of seismic data was first commercially introduced to the energy industry in the early 1990s via the transformation of a market-leading medical imaging product into a volume visualization and interpretation product. This software fundamentally changed the way data are interpreted and modeled. From those humble, risky and experimental beginnings sprouted multiple products built on this revolution, along with a 100,000-fold productivity improvement in the process. Before volume visualization, seismic projects were tens to hundreds of megabytes, interpreters sampled perhaps 10% of the data, and producing a few final maps consumed many months. Today, data volumes have grown to tens to hundreds of gigabytes, all of the data are included in the interpretation and modeling, and final products are delivered in weeks.

Importantly, as volume visualization spread throughout the geophysical community, a few innovators discovered that this same methodology could also deliver tremendous value to reservoir models traditionally described by two-dimensional maps and cross-sections. The goal was to render a fully populated earth model complete with geological, geophysical, petrophysical, reservoir, production and facilities properties. Several products emerged with varying features, levels of complexity and geometrical integrity, but this market lagged behind seismic visualization for many years.

That is all poised to change. Newer volume reservoir products, especially those that include interactive well planning features, have demonstrated dramatic, near-term, measurable value to the drilling and producing process. Returns on investment of 100:1 are documented for projects using these new volume tools. On the heels of these successes is the advent of aggressive new uses for visualization in optimizing the efficiency and effectiveness of reservoir management and production.

The next application of visualization, serving the reservoir management and producing processes, offers significant new challenges: integrating inherently and traditionally isolated work processes and driving the displays in new purpose-built rooms with advanced, centralized, distributed graphics servers. In addition, these processes will incorporate increasing amounts of real-time dynamic data into higher-resolution static reservoir decision and operation models, creating the need for advanced, robust, performant, distributed and secure data management systems along with high-performance computational systems. All of these systems must support each other to optimize the data/analysis flow and the work process, yet remain sufficiently flexible, adaptable, scalable and configurable to adjust in real time to whatever business process change the operation can throw at them. These systems must ensure that both year-long and hour-long projects can take fullest advantage of data, processing, visualization and people, with the tightest bottleneck still wide enough to do the job.

Many wells are instrumented, and more are being instrumented every day, with traditional SCADA and other analog devices and newer fiber-optic and wireless digital devices. Surface sensors are being augmented with downhole sensors, both as point measurements and as arrays. Especially in newer, remote and more expensive locations, wells are using the most modern technology available, communicating with the reservoir management office via wireless or fiber-optic transmission. New revolutions in sensors are on the horizon, most notably in wireless, battery-powered nano-technologies (Smart Dust), whose versatility, adaptability and affordability will deliver detailed data from all parts of the operation, the reservoir as well as the facilities, all in real time. Incorporating these data into real-time decision models will set the stage for much more fully constrained operating models and resultant step-function improvements in both operating cost and recovery efficiency.

The hardware - where we've been, where we're headed

The Reality Center, driven by dedicated large graphics computers, has evolved into the standard for large presentation centers through widespread industry deployment. Ultra-large datasets, principally seismic, are interpreted and modeled in these centers at dramatically accelerated rates, and the results are presented to management in a clear, easily understood way in support of exploratory wells. Nearly 200 of these systems, ranging from posh theaters to simpler workrooms, have been installed in the energy industry. More than 600 have been installed around the world in technology-laden industries such as automotive, aerospace, military, pharmaceuticals, and research and development, as well as oil and gas. In oil and gas, these systems have been dominated by the geophysical domain. However, visualization technologies are now migrating rapidly toward the operations work process, which brings its own, different set of demands.

The revolution in graphics is riding in the ruts carved deep by the revolution in high-performance computing, leveraging commodity components to create new and exciting capabilities at unprecedented affordability. Many studies have investigated the feasibility of graphics clusters (a large number of PCs, each with its own graphics card, operating system, memory, etc.) for this new frontier. To date, no cluster has succeeded in replicating the realism, pixel density, frame rate and experience of the supercomputers that have dominated and continue to dominate this space. But next-generation versions of these supercomputers, with the speed and affordability engineered by the gaming industry built into the supercomputer's architecture, will drive not only the visualization centers but also the new revolution in visualization - the Remote Operations Command and Control Decision Centers.

Distributed visualization

Large-room visualization centers, the standard of today, are rapidly being augmented with live connections to remote locations. The connected locations are desktops, other visualization centers and even handheld wireless devices that bring together all the relevant people in the decision process. The biggest impacts of this strategy to date have been in well planning and in geosteering wells while drilling.

The challenges of connectivity center on the robust presentation of large, interactive volumetric data, affordable bandwidth and the versatility of the graphics computers generating the images. Software products that synchronize low-resolution 2-D images over TCP/IP networks are widely used and are even being popularized in the business desktop sector. But technology to interactively present large volumetric models has proven much more elusive. This capability is now being delivered and deployed into operational settings, however. These systems allow asset teams to be virtually connected across long distances to ensure key decisions focus on the most relevant data and include all the right people. Bandwidth is rapidly dropping in price, bringing the low end of the spectrum into the affordable range. New tools that take advantage of this are entering the market, offering connectivity of graphics, computing, and data storage and management.
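One reason distributed volumetric sessions can fit within modest bandwidth is that the sites need not exchange pixels at all: each site can hold its own copy of the volume and synchronize only the view state, a few hundred bytes per update. The sketch below illustrates that idea in Python; it is a minimal, hypothetical example, and the port, message fields and camera parameters are invented rather than drawn from any product discussed here.

```python
import json
import socket
import threading

# Minimal sketch of view-state synchronization: rather than streaming
# rendered frames, each site re-renders its local copy of the volume,
# and only the camera state crosses the wire. All names and message
# fields here are hypothetical.

HOST, PORT = "0.0.0.0", 5005

clients = []
clients_lock = threading.Lock()

def broadcast(state: dict) -> None:
    """Send one newline-delimited JSON camera update to every site."""
    msg = (json.dumps(state) + "\n").encode()
    with clients_lock:
        for conn in list(clients):
            try:
                conn.sendall(msg)
            except OSError:
                clients.remove(conn)  # drop sites that have disconnected

def serve() -> None:
    """Accept connections from remote sites and register them."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((HOST, PORT))
    srv.listen()
    while True:
        conn, _ = srv.accept()
        with clients_lock:
            clients.append(conn)

if __name__ == "__main__":
    threading.Thread(target=serve, daemon=True).start()
    # The driving site publishes an update whenever the interpreter
    # rotates or slices the volume; each receiver re-renders locally.
    broadcast({"eye": [0, 0, 5], "target": [0, 0, 0], "up": [0, 1, 0],
               "slice_z": 1250})
```

Because each update is tiny, the approach scales to many connected sites without the bandwidth cost of streaming imagery; the trade-off is that every site must store and render the volume itself.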

Next-generation graphics computers will become increasingly important in this process. These systems bring a revolutionary level of flexibility, real-time configurability and affordability that will make them the de facto backbone of visualization centers and distributed collaborative sessions, while at the same time driving the multitude of connected and disconnected displays in the remote operations decision centers. Their super-cluster, truly linearly scalable architecture enables a single machine with large shared memory and a single operating system to drive all these environments and morph to the hourly work process variations of the operations center as needed.

Sensors and real-time analytical systems

A significant amount of information is already gathered from oil and gas reservoirs. For example, it is known how much fluid comes out of, and is pumped into, every well. However, potentially significant information on the dynamic subterranean world of producing oil and gas reservoirs goes uncollected because of the difficulty of installing sensors and of getting their data into visualization and analytical systems. Downhole fiber-optic arrays and remote, battery-powered wireless nano-sensors are paving the way to the next level of information gathering, analysis and visualization. As oil and gas are produced, the reservoir undergoes dramatic changes, but we have precious little real information on how it is changing, at what rate and in what areas. By extension, we also do not know how these changes are affecting today's production. Even more importantly, our ability to predict future production depends on our understanding of these reservoir changes. Increasing the resolution of physical measurements in time and space will add tremendous fidelity to our models.

At the heart of the sensor excitement is the development of Smart Dust, wireless sensor nodes scaling down to the size of an aspirin. These wireless sensor nodes (also called "motes") are, in essence, specialized microcomputers tasked with sensing, calculating, listening and transmitting. When deployed, the motes create self-configuring and self-healing wireless networks, allowing data to be transferred from mote to mote across large, geographically distributed sensor arrays. These ad hoc mote networks offer unprecedented capabilities for monitoring processes above and below the surface of the Earth through miniaturized sensors and infrastructure-free communication. The application of wireless sensors to chemical processing and oil refining is well established, stripping substantial cost out of very expensive processes and increasing both yield and margin. The spectrum of operational areas where motes will have significant impact is broad; imagination is the only barrier to how far they can be spread to improve the subsurface operation.
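To make the self-configuring, self-healing behavior concrete, the following toy model in Python links motes that fall within radio range, routes each reading toward a gateway along the fewest-hop path, and re-routes automatically when a mote dies. The positions, radio range and mote names are invented for illustration; production mote networks use far more sophisticated routing protocols.

```python
import math
from collections import deque

# Toy model of a self-healing mote network: motes within radio range
# form links, and readings hop mote to mote toward a gateway along the
# shortest available path. Positions and range are illustrative only.

RADIO_RANGE = 30.0  # assumed radio reach, in meters

motes = {  # mote id -> (x, y) position
    "gateway": (0, 0),
    "m1": (20, 0), "m2": (40, 0), "m3": (60, 0), "m4": (20, 15),
}

def neighbors(alive):
    """Links between every pair of live motes within radio range."""
    order = sorted(alive)
    return {m: [n for n in order
                if n != m and math.dist(motes[m], motes[n]) <= RADIO_RANGE]
            for m in order}

def route(src, alive):
    """Breadth-first search for the fewest-hop path to the gateway."""
    links = neighbors(alive)
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == "gateway":
            return path
        for nxt in links[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # network partitioned: no live path to the gateway

alive = set(motes)
print(route("m3", alive))   # ['m3', 'm2', 'm1', 'gateway']
alive.discard("m1")         # a mote dies...
print(route("m3", alive))   # ...and traffic re-routes: ['m3', 'm2', 'm4', 'gateway']
```

The self-healing property falls out of the topology itself: no mote holds a fixed route, so losing a node simply changes which paths the next search can find.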

Surface facilities - both upstream and downstream - are the most obvious and least costly place to deploy Smart Dust. Sensors measure flow, vibration, pressure, fluid composition, temperature and other attributes. Especially in thermal recovery operations, where water slugs and non-laminar flow in steam pipes can create production issues, these sensors can effectively provide real-time information on the state of the geographically distributed system.
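As a hedged illustration of what real-time monitoring of such a system might look like, the sketch below watches a stream of flow readings and flags any sample that falls far outside a rolling baseline - the kind of excursion a water slug in a steam line could produce. The window size, threshold and synthetic readings are illustrative assumptions, not field-calibrated parameters.

```python
from collections import deque
from statistics import mean, stdev

# Illustrative slug detector: flag flow readings that deviate from a
# rolling baseline by more than SIGMA standard deviations. Window size
# and threshold are assumptions for the sketch, not field values.

WINDOW, SIGMA = 20, 3.0

def slug_alerts(readings):
    """Yield (index, value) for readings far outside the rolling baseline."""
    window = deque(maxlen=WINDOW)
    for i, flow in enumerate(readings):
        if len(window) == WINDOW:
            mu, sd = mean(window), stdev(window)
            if sd > 0 and abs(flow - mu) > SIGMA * sd:
                yield i, flow  # candidate slug: sudden flow excursion
        window.append(flow)

# Synthetic steady flow around 120 (arbitrary units) with one
# slug-like dip injected at sample 30.
flows = [120.0 + 0.5 * ((-1) ** i) for i in range(60)]
flows[30] = 80.0
print(list(slug_alerts(flows)))  # -> [(30, 80.0)]
```

Even this crude screen shows the value of the data stream: an excursion is flagged the moment it arrives, rather than discovered after production has already been affected.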

The convergence of next-generation graphics computers, set in a distributed asset team environment, configurable to meet high-frequency changes in the work, and coupled with remote sensors and real-time analysis and graphics, is the next revolution in business performance and optimization. Nearly all the parts exist or are on the horizon, leaving aggressive yet sensible implementation as the biggest challenge in reaping the benefits of these new technologies.