Use of analytics is best realized as part of an enterprise framework, where its power can be brought to bear in ways that become clear only as its use grows. (Image courtesy of SAS)

Over the last decade, the computerized application of statistical and other types of analytics to a broad range of industrial disciplines has taken what were previously “arts” (pursuits governed by the intuitive insights of experienced individuals) and made them something more closely resembling empirical science.

The impact of analytics is already clearly seen in the process, petroleum refining, and manufacturing industries. But analytics also are increasingly being applied to what remains very much an art: improving production, recovery, and efficiency in upstream oil and gas operations.

“Introducing an engineering team to a resource always increases recovery by 3% to 4%,” said Patrick Pouyanné, Total senior vice president, strategy business development, engineering R&D. “But with all the complex interactions involved in determining recovery rate, veteran engineers need to be involved.”

Yet at the recent 2009 TIBCO Spotfire User Forum, one of the more interesting presentations came from Erin Van Volkenburgh, a young Chevron reservoir engineer and Spotfire whiz, who has been using analytics to optimize nitrogen gas injection across a gas-capped oil field to improve recovery through pressure maintenance.

TIBCO Spotfire is a vendor of enterprise business intelligence and analytics applications. Other analytics vendors, including SAS Institute, also are applying analytics to oil and gas field challenges.

In petroleum field exploration, Petrobras is using SAS Analytics to identify rock breaks that produce oil or gas. Petrobras says the use of analytics has significantly increased output and helped it recover supplemental hydrocarbons. It also is using analytics software to evaluate and prolong the life of mature fields.

In addition, more traditional oilfield vendors, such as Halliburton’s Landmark, Schlumberger, and Baker Hughes, are well aware of the power that analytics can bring to bear, especially when applied within integrated production or asset models, a key element of the “digital oil field.”

Setting the stage

Defining exactly what is meant by “analytics” can be a bit tricky because there is a considerable gap between specialists’ use of the term and what it connotes in the marketplace. “In the marketplace, analytics are tools that deliver insights, as opposed to just looking at reports,” said Brad Hopper, TIBCO Spotfire senior director, industry solutions. “This could mean anything from the ability to configure a visualization to statistical tools that work in conjunction with that visualization.”

In this view, analytics are differentiated from business intelligence, a capability optimized to answer “known” questions, e.g., sales by region. Use of something like an online analytical processing (OLAP) cube takes it a step further, allowing detailed examination of the known question. Analytics, on the other hand, brings with it the ability to answer ad hoc questions — the questions you didn’t know you needed to ask until you came upon them.
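
As a minimal sketch of that distinction, assuming pandas and entirely hypothetical production data (the field names and figures below are illustrative only), a BI-style report answers the fixed question, while analytics answers the follow-up question the report provokes:

    import pandas as pd

    # Hypothetical production data; fields and figures are illustrative only.
    df = pd.DataFrame({
        "field": ["A", "A", "B", "B", "C", "C"],
        "month": ["Jan", "Feb", "Jan", "Feb", "Jan", "Feb"],
        "oil_bbl": [1200, 1150, 900, 980, 400, 150],
        "water_cut": [0.10, 0.12, 0.30, 0.28, 0.15, 0.55],
    })

    # The "known" question, answered by a fixed BI report:
    # total oil production by field.
    print(df.groupby("field")["oil_bbl"].sum())

    # The ad hoc question the report provokes: which field-months pair
    # low oil output with a high water cut?
    print(df[(df["oil_bbl"] < 500) & (df["water_cut"] > 0.5)])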

In the oil and gas industry, analytics are often tied to the use of integrated production models that simulate well and reservoir behavior in pursuit of increased recovery.

According to Keith Richard Holdaway, principal solutions architect, SAS Global Oil and Gas, statistical and spatial analytics applied to historical and real-time data can provide valid approximations of environments that could, in principle, be derived more exhaustively from first principles.

“Analytics has become a more focused area within oil and gas,” Holdaway said, “because extant reservoirs are coming to the end of easy oil, leading to reservoir characterization projects. To look at the reserves, we combine historical context — data accumulated over many years — with real-time data. Use of spatial analytics gives us a ‘realization’ of the reserves that helps us better understand possible recovery methods.”

Most reservoir characterization projects include data sets collected by geoscientists from several “siloed” disciplines, which makes for very robust models and multivariate analyses. But combining these data sets, along with well logs, seismic traces, and core samples, within the model presents its own challenges. For one, each has a different scale.

“You need to aggregate and cleanse the data, manage those scales, and map to the reservoir capabilities,” Holdaway said. “This leads to use of things like neural networks and fuzzy logic.”
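
A minimal sketch of that preprocessing step, assuming scikit-learn and synthetic stand-in data (the column choices are illustrative, not Holdaway’s actual workflow): standardize measurements that arrive on very different scales before a neural network sees them.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for measurements on very different scales:
    # a gamma-ray log (tens to hundreds of API units), seismic amplitude
    # (near zero), and core porosity (a small fraction).
    X = np.column_stack([
        rng.uniform(20, 150, 200),
        rng.normal(0, 0.05, 200),
        rng.uniform(0.05, 0.30, 200),
    ])
    # A made-up target standing in for a reservoir property of interest.
    y = 0.5 * X[:, 2] + 0.001 * X[:, 0] + rng.normal(0, 0.01, 200)

    # StandardScaler puts every input on a common scale, so no single
    # measurement dominates the neural network's training.
    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
    )
    model.fit(X, y)
    print(model.predict(X[:3]))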

More use cases

Chevron’s use of Spotfire identified low-pressure, under-injected areas within the reservoir Van Volkenburgh was optimizing and predicted the expected impact of operational changes. The resulting injection modifications flattened the field decline rate for markedly improved oil and natural gas recovery.

Van Volkenburgh said she was able “to determine the efficiency of increased injection by analyzing gas injection over time and across the field to identify patterns and potential improvements.” The analyses also were able to quantify the risk to recovery from an associated aquifer and demonstrate relationships between injectors and producers.
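
A back-of-the-envelope sketch of that kind of screening, with hypothetical pattern data and thresholds (not Chevron’s actual method): compare each injection pattern against field medians to flag low-pressure, under-injected candidates.

    import pandas as pd

    # Hypothetical injection patterns; the rates, pressures, and the 0.75
    # threshold are all illustrative only.
    inj = pd.DataFrame({
        "pattern": ["P1", "P2", "P3", "P4"],
        "n2_rate_mscfd": [850, 420, 910, 300],
        "reservoir_psi": [2100, 1750, 2150, 1680],
    })

    # Flag patterns whose injection rate and reservoir pressure both fall
    # well below the field median: candidates for increased N2 injection.
    low_rate = inj["n2_rate_mscfd"] < 0.75 * inj["n2_rate_mscfd"].median()
    low_pres = inj["reservoir_psi"] < inj["reservoir_psi"].median()
    print(inj[low_rate & low_pres])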

Van Volkenburgh added that the use of Spotfire solutions was instrumental in demonstrating to unit managers and partners the reservoir’s performance characteristics, although it took time and a number of presentations for them to gain confidence in what they were being shown.

Additional presentations at the Spotfire event revealed other interesting applications.

HighMount Exploration and Production developed a production analysis tool using Spotfire to help optimize production on more than 6,500 wells in the Sonora Gas field.

John Argo, senior drilling engineer, HighMount Exploration & Production LLC, said the tool has aided the recovery of production lost to downtime and well loading. It also helps track the impact of new drills versus base gas, recompletions, pumping unit installs, and other special projects.

Hess Corp. is using Spotfire to analyze trends across various data types taken from its Peloton WellView information management system. According to Kimm Cashiola, a Hess senior drilling tech, this previously was done using pivot tables within Excel. In Cashiola’s opinion, Spotfire is easier to use and has been adapted to allow Hess to use its own calculations.

Expert testimony

At the recent Digital Energy Conference, an executive panel convened to address the status and relevance of digital oil field strategies was asked which technology among those typically associated with the digital oil field concept could best be characterized as “transformational.”

David Latin, vice president, E&P technology, BP, answered that the most transformational are “Integrated asset models and the whole issue of data analytics and turning that into something that can be used. Corporate knowledge and intellectual property are embedded in its data. Using data to drive physical models — empirical physics — can get you to an answer that can be used, faster than basic theory.”

Nodding in agreement were Mike Hauser, i-field manager, global upstream gas, Chevron, and Russell Spahr, associate engineer, Exxon Mobil, who said, “The integration of full asset models as a system, as we think through what real time is for each separate function, will be significant.”

A brief survey of the exhibition floor at the conference turned up information from Halliburton’s Landmark on its DecisionSpace for Production; Schlumberger Information Solutions’ Petrel seismic-to-simulation software; and Baker Hughes’ soon-to-be-announced foray into the production modeling space.

Dan Vesset, vice president, business analytics research, IDC, said, “IDC research has shown that the return on investment in advanced analytics, including data mining and predictive analytic tools, outpaces that of information access tools. Advanced analytics facilitate better understanding of past complex relationships in order to better predict the future.”

The impact of quantitative analyses already extends beyond industrial applications to many diverse aspects of modern life. In sports, it has uncovered the unique talents of role players whose contributions don’t always show up on traditional stat sheets. It has led to the defeat of even the grandest of chess grandmasters by computers like IBM’s legendary Deep Blue. And on Wall Street the so-called “quants” have made billionaires of individuals formerly derided as nerds, while at the same time, in failing to foresee today’s “unique” economic landscape, have given new credence to the old adage: “garbage in, garbage out.”

While interesting in themselves, these examples point to the fact that analytics are best applied in closed systems with a defined, albeit capacious, set of variables, like a chessboard or a basketball court, as opposed to the global economy. By the same token, the circumscribed world of a refinery is more susceptible to statistical analysis than the bespoke environment of an oil reservoir.

Understandably then, though progress is being made, skepticism remains as to how much can be done, analytically speaking, in the upstream petroleum sector. Other issues, such as the amount of investment needed to keep models current with changing conditions, also remain to be addressed.