Computer breakthroughs promise to revolutionize the oil and gas industry.

The upstream oil and gas industry has long advocated for, and depended upon, advanced technologies as a major contributor to its success. Whether it be 3-D seismic processing made possible by fast and reliable high-performance computing and storage, or the ubiquity of PCs, servers, handheld devices and mobile communication devices, data and information drive the industry. Forecasts show increasing demand for oil and gas, both as an energy source and as a feedstock for petrochemical-based products. This, combined with the certainty that oil and gas is a finite commodity, increases the reliance on the manipulation and management of data and information to grow reserves and production while reducing costs and improving the economic threshold of portfolio assets.

This article outlines some of IBM's current thinking on future technologies, identifies key themes and speculates on their impact on the upstream industry.

Technology themes and trends

Moore's law has yet to be repealed - information technology (IT) continues to develop rapidly as underlying technologies such as semiconductors, materials science and networks evolve at exponential rates. In characterizing this ongoing explosion of innovation, five key themes will continue to add business value for the next decade and beyond.

Faster, better, cheaper. In seismic processing and reservoir simulation, the main need is for faster, more accurate processing of increasing volumes of seismic data, together with larger, more detailed reservoir models, preferably analyzed in real time. The hope is that better analysis and modeling will improve exploration success and increase recovery rates. This fuels an ever-growing requirement for computing power and storage within the context of fixed or shrinking IT budgets.

So, what does US $1,000 buy in terms of compute capability? What is it likely to buy in 10 years' time? As seen in Figure 1, $1,000 will continue to buy exponentially more computational capability - the trend of faster, better and cheaper will continue.
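
To make the scale of this trend concrete, here is a minimal back-of-the-envelope sketch, assuming compute capability per dollar doubles roughly every 18 months (the doubling period and the baseline are illustrative assumptions, not figures taken from Figure 1):

```python
# Rough illustration: projected compute capability per $1,000, assuming a
# doubling period of 18 months. Both the doubling period and the baseline
# are assumptions for the sketch, not data from Figure 1.

DOUBLING_PERIOD_YEARS = 1.5    # assumed Moore's-law-style doubling
BASELINE_CAPABILITY = 1.0      # today's $1,000 purchase, normalized to 1

def capability_per_1000_usd(years_from_now: float) -> float:
    """Projected capability of a $1,000 purchase, relative to today."""
    doublings = years_from_now / DOUBLING_PERIOD_YEARS
    return BASELINE_CAPABILITY * 2 ** doublings

for years in (0, 5, 10, 20):
    factor = capability_per_1000_usd(years) / BASELINE_CAPABILITY
    print(f"In {years:2d} years: ~{factor:,.0f}x today's capability per $1,000")
```

Under that assumption, $1,000 buys roughly 100 times more capability after 10 years and roughly 10,000 times more after 20 years.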

Figure 1 also positions the requirements for high-performance computing from geophysical applications. These applications "push the envelope" in terms of required performance, which will be a continuing future trend.

During the next 20 years, the growth of compute power will correspond to hundreds of millions of years of evolution. Deep Blue, the chess-playing computer that beat Garry Kasparov in 1997, had a compute power of 8 teraflops, equivalent to that of a lizard brain. At the current rate of progress, it is estimated that by 2015 a supercomputer will have the compute power (but not the intelligence) of a human brain.

It is a truism that data storage needs always outstrip capacity. For example, in 2001 a major national oil company had 180 terabytes of disk storage, a 200-fold increase since 1993. However, in 2002 the organization acquired an additional 160 terabytes! It has been estimated that the immediate storage requirement for a typical oil company is approximately 9 petabytes of data. Taking an extreme example, it has also been estimated that to model a reservoir of 40 sq miles (100 sq km) to a depth of one-half mile (1 km) at a resolution of 3 ft (1 m) would produce a yottabyte (10^24 bytes) of data. So what is the future of storage? Will we see similar trends?

Magnetic storage density has experienced enormous growth rates of 100% per year in the mobile hard-disk drive (HDD) segment. Again, Moore's law continues to operate, as recent IBM research projects demonstrate. One such project, termed Collective Intelligent Bricks (CIB), is a concept of using simple storage "bricks," each containing a microprocessor, a small number of disk drives and network communications hardware, to provide a data storage system that scales to petabytes of storage.
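
The attraction of the brick approach is that capacity scales linearly with the number of bricks. The following back-of-the-envelope sketch illustrates the idea; the drive count and capacities are assumptions for the illustration, not CIB specifications:

```python
# Illustrative sizing for a brick-style storage system. The per-brick
# drive count and per-drive capacity are assumptions, not CIB figures.
import math

DRIVES_PER_BRICK = 12        # assumed small number of disks per brick
DRIVE_CAPACITY_TB = 0.25     # assumed 250 GB-class drives (circa early 2000s)

def bricks_needed(target_petabytes: float) -> int:
    """Bricks required to reach a target raw capacity."""
    brick_capacity_tb = DRIVES_PER_BRICK * DRIVE_CAPACITY_TB
    return math.ceil(target_petabytes * 1024 / brick_capacity_tb)

# Bricks for the ~9-petabyte 'typical oil company' figure cited above.
print(bricks_needed(9))   # -> 3072
```
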
Intelligent devices - Smart Dust. We are all familiar with the increasing trend toward miniaturization and modularization. With improvements in microchip design and increasing transistor density on a chip, this trend will continue. Gartner predicts that by 2010 there will be three embedded devices for every person worldwide. In 2001 the University of California, Berkeley, completed the Smart Dust project to explore whether an autonomous sensing, computing and communication system could be packed into a cubic-millimeter mote (a small particle or speck) to form the basis of integrated, massively distributed sensor networks. The concept was successfully demonstrated in a defense environment, and the researchers believe the technology will be ubiquitous by 2010.

From an oil and gas perspective, this type of technology could have significant applications in the lifetime management and maintenance of facilities and pipelines. For example, the coating on a pipeline could sense and communicate its own status and integrity, as could other physical assets.
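
As a hedged sketch of how such a sensor network might report pipeline integrity (the mote interface, readings and thresholds below are hypothetical illustrations, not drawn from the Smart Dust project):

```python
# Hypothetical sketch: motes embedded in a pipeline coating classify and
# report their own condition. Names and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class CoatingMote:
    mote_id: str
    position_km: float        # distance along the pipeline
    strain: float             # normalized strain reading, 0.0-1.0
    corrosion_index: float    # normalized corrosion estimate, 0.0-1.0

    def status(self, strain_limit: float = 0.7, corrosion_limit: float = 0.5) -> str:
        """Classify the local coating condition from the mote's own sensors."""
        if self.strain > strain_limit or self.corrosion_index > corrosion_limit:
            return "ALERT"
        return "OK"

def integrity_report(motes):
    """Aggregate per-mote status into a report for the maintenance system."""
    return [f"{m.mote_id} @ {m.position_km:.1f} km: {m.status()}" for m in motes]

motes = [
    CoatingMote("mote-001", 12.4, strain=0.2, corrosion_index=0.1),
    CoatingMote("mote-002", 12.5, strain=0.8, corrosion_index=0.3),  # high strain
]
print("\n".join(integrity_report(motes)))
```
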
Analysis, integration, federation. Data and information are the lifeblood of the oil and gas industry, forming the basis for all value-adding decisions. Statistics show that geoscientists spend 50-60% of their time finding and validating data. For an industry that relies heavily on good data collection and management, many challenges still exist:

• Exponentially increasing data volumes, including machine-generated data;
• Ensuring data integrity;
• Collating, combining, assimilating and analyzing numerous heterogeneous data types; and
• Effectively, efficiently and securely transmitting data within the office and around the globe.

It is a paradox that it is now easier to find and access a web file created by a child in New Zealand than to access a file created on your colleague's desktop. One of the major trends (and problems) in business is the deluge of information sources available to any computer user. The question is how to find the information that's needed and utilize it.

Information available to businesses is increasing rapidly, with online data growing at nearly 100% per year and medium and large corporate databases at nearly 200% per year. We are also undergoing a change in the type of data we collect and use, moving from transactional, structured data (i.e., data traditionally captured in a database) to text and other authored, unstructured data, such as that from sensors or multimedia, which are not amenable to traditional database architectures.

"Smart wells," "electric fields," "intelligent oil fields," the "Digital Oilfield of the Future" - these are all concepts currently being piloted by a number of oil companies. These initiatives exemplify the trend toward capturing and making decisions based on machine-generated data. Their goal is to link the field development process - i.e., the subsurface characterization - to actual production in order to maximize business value. Using existing sensor and automation technologies, large volumes of data can be collected from the wellbore and other production facilities. As identified above, the key challenge is to turn this continuous data stream into meaningful information.

The quantity and information density of machine-generated data will require more intelligent access methods. For instance, a production manager will want to know whether temperature and pressure variations in a producing well indicate that sand breakthrough is imminent and remedial action is required. A reservoir engineer would like to aggregate and analyze well performance in real time, looking for trends and patterns of behavior that can help optimize reservoir performance. Neither wants to search terabytes of data; instead, algorithms need to be developed that quickly surface the most noteworthy data for further analysis.
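
As an illustration of the kind of screening algorithm implied here, the sketch below flags readings that depart sharply from the recent trend using simple rolling statistics; the window size and threshold are assumptions, and a real sand-detection method would be far more domain-specific:

```python
# Minimal sketch: flag unusual readings in a streaming well feed using a
# rolling mean and standard deviation. Window size and the 3-sigma
# threshold are illustrative assumptions only.
from collections import deque
from statistics import mean, stdev

def flag_anomalies(readings, window=60, threshold=3.0):
    """Yield (index, value) for readings far outside the recent trend."""
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                yield i, value   # candidate event for engineering review
        recent.append(value)

# Example: a slowly rising pressure trace with a sudden departure from trend.
trace = [200.0 + 0.01 * i for i in range(300)] + [185.0, 184.0, 183.5]
for index, value in flag_anomalies(trace):
    print(f"Sample {index}: {value:.1f} deviates from the recent trend")
```

Such screening would sit upstream of more specialized engineering models, but it illustrates how a continuous data stream can be reduced to a short list of events worth an engineer's attention.
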
Resilient technology. Increasing reliance on technology will require robust, scalable, secure and flexible infrastructures that protect businesses from security breaches while facilitating easier business, partner and network interconnectivity. In achieving this, these computing infrastructures will grow beyond the human ability to manage them. Consequently, there will be a trend toward autonomic computing - self-management technologies that provide a reliable, continuously available and robust infrastructure.

Specifically, IT autonomic systems will be:

• Self-optimizing: designed to automatically manage resources so that servers meet enterprise needs in the most efficient fashion.
• Self-configuring: designed to define themselves "on the fly."
• Self-healing: designed to automatically determine and resolve problems (a minimal sketch of this idea follows the list).
• Self-protecting: designed to protect themselves from unauthorized access anywhere.
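
The self-healing bullet can be illustrated with a minimal watchdog sketch; the service names and the health-check and restart behavior are hypothetical, and real autonomic systems are far richer than a polling loop:

```python
# Toy illustration of the self-healing principle: detect a failed component
# and restore it without operator intervention. Service names and the
# health-check/restart behavior are hypothetical.
import time

class ManagedService:
    def __init__(self, name: str):
        self.name = name
        self.healthy = True

    def health_check(self) -> bool:
        return self.healthy

    def restart(self) -> None:
        print(f"[autonomic] restarting {self.name}")
        self.healthy = True

def watchdog(services, interval_seconds=5.0, cycles=3):
    """Poll each service and trigger automatic recovery on failure."""
    for _ in range(cycles):
        for service in services:
            if not service.health_check():
                service.restart()   # problem determination + resolution
        time.sleep(interval_seconds)

services = [ManagedService("seismic-queue"), ManagedService("data-store")]
services[1].healthy = False          # simulate a failure
watchdog(services, interval_seconds=0.1, cycles=1)
```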

Real-time business - real-time decisions. The technology trends summarized above will support and drive businesses and enterprises to become "real" real-time organizations. Businesses will become on-demand enterprises; strategic business decisions will be based on a continual flow of new information. This will require business processes to be integrated end-to-end across the company and with key partners and suppliers, enabling the enterprise to respond rapidly to any customer demand, market opportunity or external threat.

This will be facilitated by better models and algorithms supported by faster hardware, making real-time integrated data available and ultimately leading to universal connectivity for immediate decision-making.

Technologies such as web services and GRID computing will facilitate complex sets of distributed services that will appear as though they exist and run on a single "machine" - a virtual computer. New applications will be written for these virtual operating system, compute and data engines, which will provide easy, rapid integration of business processes and information. How are these on-demand technologies impacting the upstream industry now?

Reservoir characterization blends science with graphic art, drawing on the insights and experience of a range of professionals. The ability to share a common view of the reservoir, together with the assumptions used in its characterization, leads to better decisions about its likely behavior and the optimal ways to maximize production. This environment will be supported by remote visualization systems that can operate over bandwidth-constrained networks.

GRID is a distributed computing platform that operates over a network (e.g., the Internet), linking servers, clients and storage to dynamically form virtual servers and storage pools and supporting the creation of virtual organizations, both ad hoc and formal. GRID is based on open standards such as the Open Grid Services Architecture (OGSA), which help protect infrastructure investment from shifts in technology and changes in business models. In the upstream environment, GRID can deliver uniform computing systems and optimize IT assets for all sites currently performing geoscience, engineering, technical and traditional back-office business computing.
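
A simplified sketch of the virtualization idea follows; the node names and the processing task are illustrative assumptions, and real grid middleware (for example, OGSA-based toolkits) handles the scheduling, security and data movement that this toy pool does not:

```python
# Simplified sketch of grid-style execution: a pool of nodes is treated as
# one virtual computer that farms out independent work units. Node names
# and the 'task' are illustrative assumptions.
from concurrent.futures import ThreadPoolExecutor

NODES = ["site-aberdeen", "site-houston", "site-stavanger"]   # hypothetical pool

def process_trace_block(block_id: int) -> str:
    """Stand-in for an independent unit of seismic processing work."""
    node = NODES[block_id % len(NODES)]   # naive placement, for illustration
    return f"block {block_id} processed on {node}"

# The caller sees a single 'virtual computer'; placement is hidden behind it.
with ThreadPoolExecutor(max_workers=len(NODES)) as virtual_computer:
    results = list(virtual_computer.map(process_trace_block, range(6)))

print("\n".join(results))
```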

Conclusions

Technology has dramatically changed the face of the upstream business environment and will continue to do so, evolving from providing "back office" support functions to meeting mission-critical business requirements that deliver real business value and support changing business models. IBM believes the high-level trends described here will facilitate a transformation of this industry; the questions are when this will occur and how long it will take. The inhibitors to transformation will not be technological but cultural, organizational and, above all, human.