Global energy demand is expected to increase by 50% over the next 25 years. While nuclear, solar, and other alternative sources are gaining traction as ways to meet this demand, fossil fuels will remain a significant source of the world’s energy for the foreseeable future.

Meeting this rapidly growing demand is a real challenge for oil and gas companies, but it is not the only one. Volatile supply, tightening compliance and risk regulations, and heightened environmental concerns are exerting major pressure on current practices in both the upstream and downstream oil and gas markets. Forward-thinking companies are implementing change across the value chain. Their goal is threefold: get to first oil faster, increase recovery rates, and sense and solve problems before they start.

Data analytics aid in reservoir management. (Image courtesy of IBM)

This quest has gone far beyond installing new machinery or exploring for better sites. Technology- and data-driven functions are being applied to drilling and completion, reservoir and production management, and maintenance. Yet gathering data is only the first step. A single oil field can generate the equivalent of 200 DVDs’ worth of data a day, and little of that data is connected. Making sense of all this information is critical for better decision-making about exploration, production, and management.

Fortunately, this is not some future state. The systems that underlie our oil fields are becoming smarter. Advances in deep computing are enabling scientists to discern fields that previously were invisible. Autonomic sensing technologies and data analytics are being used to identify viable reserves, increase the amount of oil being extracted, improve productivity, and anticipate problems. These technologies are enabling an evolution to smarter oil and gas across three key areas: smarter exploration, smarter production, and smarter reservoir management.

Smarter exploration
Smarter exploration means integrating and processing geophysical and other relevant data to develop 3-D models of reservoirs. Developments in deep computing are pushing 3-D seismic modeling into the next generation, enabling scientists to discern previously inaccessible oil and gas reserves embedded beneath difficult terrain or the deepest ocean waters.

Seismic imaging can be used either to find petroleum or to better characterize a producing reservoir. For example, an oil company might use seismic analysis every few months to monitor the progress of oil production. Once the analysis of an area yields enough data to improve the reservoir model used for forecasting future petroleum yield, algorithms can be applied to the data to help locate the oil, or high-performance computing can accelerate processing and increase the efficiency of existing algorithms. Using basin models, an entire area’s geological history can be modeled, providing an estimate of how much oil was generated and where it might have migrated. Oil companies then can combine these calculations with physical processes and geochemistry findings, reducing the risks associated with finding oil.
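As a rough illustration of the forward-modeling step that underlies this kind of seismic analysis, the sketch below builds a synthetic seismic trace by convolving a layered reflectivity series with a Ricker wavelet. The layer velocities, densities, interface positions, and wavelet frequency are illustrative values only, not data from any particular field.

```python
import numpy as np

def ricker_wavelet(freq_hz, dt, length_s=0.128):
    """Ricker (Mexican-hat) wavelet, a common stand-in for a seismic source."""
    t = np.arange(-length_s / 2, length_s / 2, dt)
    a = (np.pi * freq_hz * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def reflectivity_from_layers(velocities, densities):
    """Normal-incidence reflection coefficients from acoustic impedance contrasts."""
    impedance = np.asarray(velocities) * np.asarray(densities)
    return (impedance[1:] - impedance[:-1]) / (impedance[1:] + impedance[:-1])

# Illustrative three-layer earth model (m/s, kg/m^3) -- not real field values.
velocities = [2200.0, 2600.0, 3100.0]
densities = [2100.0, 2300.0, 2450.0]
dt = 0.002  # 2 ms sample interval

# Place the two layer interfaces at arbitrary two-way travel times.
trace = np.zeros(512)
reflect = reflectivity_from_layers(velocities, densities)
trace[150], trace[300] = reflect[0], reflect[1]

# Synthetic trace = reflectivity series convolved with the source wavelet.
synthetic = np.convolve(trace, ricker_wavelet(25.0, dt), mode="same")
print(synthetic.shape, synthetic.max())
```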

Smarter production
Smarter production means capturing information about the volume and quality of oil and gas reservoirs before a new well is drilled. Through the use of modeling and data analytics, companies can minimize the drilling footprint and exploration risk while improving the safety and reliability of operations. One US-based firm is using seismic data and rock physics inversion to create a comprehensive, integrated view of potential resources. Much as a sonogram does, inversion helps scientists determine the configuration of the earth at a given location. Multiple models are run and compared to reduce mismatches between model predictions and observations, improving the accuracy of field management decisions.
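The model-comparison loop described above can be pictured as a simple misfit minimization: generate synthetic observations from candidate earth models, compare them with the measured data, and keep the candidate with the smallest mismatch. The sketch below does this with a toy one-parameter rock-physics relation; the forward model, parameter range, and measurements are invented for illustration and stand in for the far richer physics used in practice.

```python
import numpy as np

def forward_model(porosity, depth_m):
    """Toy rock-physics relation: predicted P-wave velocity from porosity
    and depth. Invented for illustration only."""
    return 5500.0 - 4000.0 * porosity + 0.3 * depth_m

# Hypothetical observed velocities (m/s) at a few depths.
depths = np.array([1000.0, 1500.0, 2000.0, 2500.0])
observed = np.array([5050.0, 5200.0, 5350.0, 5500.0])

best_porosity, best_misfit = None, np.inf
for porosity in np.linspace(0.05, 0.40, 200):     # candidate models
    predicted = forward_model(porosity, depths)
    misfit = np.sum((predicted - observed) ** 2)  # mismatch with observations
    if misfit < best_misfit:
        best_porosity, best_misfit = porosity, misfit

print(f"best-fit porosity: {best_porosity:.3f}, misfit: {best_misfit:.1f}")
```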

Smarter reservoir management
Smarter reservoir management makes use of sensors embedded across pipes, pumps, and entire fields to generate data that can be compared against historical trends and applied to help optimize well performance. An intelligent field can even monitor itself while being run by a virtual team of offsite experts around the world. This can significantly reduce strain on the workforce by reducing the amount of time employees spend under tough environmental conditions in the field.
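A minimal sketch of the "compare against historical trends" idea follows, assuming a stream of wellhead pressure readings checked against a rolling historical baseline. The window size, threshold, and readings are hypothetical and only illustrate the pattern, not any vendor's monitoring product.

```python
import statistics
from collections import deque

def detect_anomalies(readings, window=20, threshold_sigma=3.0):
    """Flag readings that deviate from a rolling historical baseline by more
    than threshold_sigma standard deviations."""
    history = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(readings):
        if len(history) >= window:
            mean = statistics.mean(history)
            stdev = statistics.stdev(history)
            if stdev > 0 and abs(value - mean) > threshold_sigma * stdev:
                anomalies.append((i, value))
        history.append(value)
    return anomalies

# Hypothetical wellhead pressure readings (bar); the final spike is synthetic.
pressures = [210.0 + 0.5 * (i % 5) for i in range(60)] + [245.0]
print(detect_anomalies(pressures))
```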

Deep computing
Core to the success of advanced analytics and modeling is deep computing technology. In modeling, for example, as newer data acquisition techniques are combined with more compute-intensive seismic algorithms, computing requirements grow exponentially. Continued innovation in processor speed is needed to drive universal use of modeling technologies in the oil and gas industry.

The advanced analytics driving the transformation to smarter oil and gas are fueled by innovations and technology transfers from other industries. KTH Royal Institute of Technology in Sweden is using IBM’s streaming analytics technology to gather real-time information from GPS devices on nearly 1,500 taxi cabs in Stockholm. The institute soon will expand to gathering data from delivery trucks, traffic sensors, transit systems, pollution monitors, and weather information to provide residents with real-time information on traffic flow, travel times, and the best commuting options. MareNostrum, the same supercomputer used to compute 3-D seismic images for the oil and gas industry, also has been used for human genome research, protein research, weather forecasting, and the design of new drugs.

Technology transfer can be a two-way street. For example, MareNostrum is outfitted with a special high-end processor used in Sony’s PlayStation 3 game console to hunt for oil in the Gulf of Mexico (GoM), and the results are fed back to help develop the next-generation gaming machine.

This hunt for oil by Madrid-based Repsol, in partnership with scientists from around the world, is using advanced seismic imaging technology to reveal oil and gas deposits that traditional imaging techniques cannot see. For its exploration activities in the deep waters of the GoM, a region known for complex geological conditions and salt structures that create noise in the seismic data, Repsol needed far more sophisticated algorithms to drill accurately and avoid dry holes, each of which can cost more than US $125 million. The solution was to apply a far more powerful algorithmic approach known as reverse time migration, which previously had been too compute-intensive for oil companies to use. Repsol worked with IBM and other companies in the chemicals and petroleum industry to build a powerful new system capable of running the next generation of more accurate seismic algorithms. Public benchmarks show that the processors used in this solution perform the necessary computations 40 times faster than leading brand processors. By leveraging advanced multicore technology and optimizing algorithm code for maximum performance, Repsol now can spot likely opportunities for oil and gas discovery more accurately and bring those discoveries to market faster.
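Reverse time migration is so compute-hungry because it propagates the full two-way wave equation forward and backward through large 3-D earth models. The 1-D finite-difference kernel below is a drastically simplified sketch of that inner loop, with an invented velocity model and grid; production RTM runs kernels like this over billions of grid points per shot, which is why accelerator hardware of the kind described above matters.

```python
import numpy as np

# Illustrative 1-D grid and constant velocity model (not a real survey).
nx, nt = 400, 1000
dx, dt = 10.0, 0.001            # 10 m grid spacing, 1 ms time step
velocity = np.full(nx, 2500.0)  # m/s

wavefield_prev = np.zeros(nx)
wavefield_curr = np.zeros(nx)
wavefield_curr[nx // 2] = 1.0   # impulsive source in the middle of the model

# Second-order finite-difference solution of the 1-D acoustic wave equation:
# the heart of the forward-propagation step in reverse time migration.
for _ in range(nt):
    laplacian = np.zeros(nx)
    laplacian[1:-1] = (wavefield_curr[2:] - 2.0 * wavefield_curr[1:-1]
                       + wavefield_curr[:-2]) / dx**2
    wavefield_next = (2.0 * wavefield_curr - wavefield_prev
                      + (velocity * dt) ** 2 * laplacian)
    wavefield_prev, wavefield_curr = wavefield_curr, wavefield_next

print("peak amplitude after propagation:", np.abs(wavefield_curr).max())
```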

Norway’s Statoil is using analytics to achieve smarter reservoir management. Even with a daily equity production estimated at 1.95 MMboe, Statoil is continually searching for ways to offset the natural trend toward declining production levels. Statoil sought to increase its recovery rate to 55% for subsea platforms and 65% for fixed platforms, compared with a worldwide average of 35%. The solution is to link real-time sensing capabilities in the field with collaborative analytics systems.

Integration infrastructure is another important area for innovation with data at Statoil. The TAIL Integrated Operations research project was initiated to improve operations at fields approaching the end of their lifetimes. To connect subsurface data among disparate offshore platforms, IBM extended its WebSphere integration software with manufacturing domain adapters and an industrial semantic model based on a linkage of key oil and gas standards. Recently Statoil and IBM announced a three-year project to implement this Integration Infrastructure Framework across the enterprise, enabling the exchange of information across Statoil installations for faster, more accurate decision-making.

IBM also is working with Shell to explore advanced techniques for reconciling geophysical and reservoir engineering field data. By applying improved algorithms, analytics, and accelerated simulations, Shell will be able to extract natural resources with more certainty and efficiency, better optimizing the recovery of oil and gas. The complex process of reconciling often differing views of oil and natural gas fields can take several months to complete and involves measurements of production volumes, flow rates, and pressures, as well as time-lapse seismic data from subsurface rock formations, well and laboratory data, and seismic data covering the wide spaces between wells. Shell and IBM will reformulate and automate the task of reconciling the different data sources and create an enhanced mathematical optimization solution that has the potential to make the data-inversion process significantly more cost-effective.
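At its core, this reconciliation task is a mathematical inversion: adjust reservoir-model parameters until simulated production and seismic responses agree with what was measured. The sketch below illustrates that idea with a toy exponential-decline production model fitted to hypothetical flow-rate measurements via a least-squares solver; the model form and numbers are illustrative assumptions, not Shell's method.

```python
import numpy as np
from scipy.optimize import least_squares

def simulated_rate(params, t_days):
    """Toy exponential-decline production model: initial rate and decline factor."""
    initial_rate, decline_per_day = params
    return initial_rate * np.exp(-decline_per_day * t_days)

# Hypothetical measured flow rates (bbl/day) over two years of production.
t_days = np.array([0.0, 90.0, 180.0, 365.0, 540.0, 730.0])
measured = np.array([1200.0, 1090.0, 1000.0, 830.0, 700.0, 590.0])

def residuals(params):
    """Mismatch between the model's prediction and the field measurements."""
    return simulated_rate(params, t_days) - measured

# Reconcile the model with the data by minimizing the residuals.
fit = least_squares(residuals, x0=[1000.0, 0.001])
print("reconciled parameters:", fit.x)
```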

These are just a few examples of oil and gas companies thinking and acting in new ways to take advantage of a more instrumented, interconnected, and intelligent world. If petroleum geologist Wallace Pratt was right and oil truly is found in the minds of men, the complexities of this industry make it clear that oil fields must become smarter too. Advanced analytics and deep computing can help ensure that data can be turned into actionable insights, helping oil and gas companies transform the hydrocarbon present for a more renewable future.