The digital oil field (DOF) has been a major area of investment for all the majors and a fair few others since the middle of the last decade. It was one of those times when all the issues swirling around the ether suddenly coalesced and screamed a common and overwhelming answer. Whether the issue was developing difficult plays, attracting skilled labor, improving safety, mitigating environmental impact, improving day-to-day running and monitoring, or maximizing recoverable reserves, the answer was IT and lots of it.

And from a historical perspective, this was long overdue. The concept of "shop floor data capture" began in the 1970s. And in the decades since, IT has been used to monitor and record at increasing levels of detail across motor manufacture, high tech, food processing, and apparel. Even at the "craft" end of these industries, the move to digital has been inexorable: The cylinders in your Aston Martin are no longer finished by hand using apricot stones, but then they are also likely to last 321,000 km (200,000 miles).

So there are a lot of data, and they are used so that the industry can run faster, longer, and cheaper. Near real-time adjustments can be made from control centers thousands of miles away; the technology is there. But what else is there? Writing in Strategy and Business in 2008, Steinhubl and Klimchuk offered the following definition: "The digital oil field is a suite of interactive and complementary technologies that let companies gather and analyze data."

Spotfire can graphically display well relationships in a number of ways. (Images courtesy of Spotfire)

Unknown casualties in data explosion

Unfortunately, those two things – gathering and analyzing – can operate against each other. The more data that are gathered, the less data can be analyzed. That is partly a matter of volume, but more importantly it is a matter of source. Recording and generating new data types from new systems have been necessary consequences of the DOF. But our business intelligence systems are still analyzing the old world. These systems are difficult and expensive to implement and arguably harder and more costly to change. Accommodating entirely new data sources means changing and building new structures of data – aggregations, dimensions, and hierarchies.

All of this is not impossible, but is it reasonable? If it takes a year to reconfigure corporate reporting structures, how fundamental will the changes be in the elapsed time? Is the industry creating a never-ending task, chasing its own data tail?

What the Gartner Group calls "data discovery" tools (and what others call "analytics") offer a more fleet-footed alternative. Data are loaded into memory with no preconceived notions of joins, hierarchies, or aggregations. Users can then design graphical visualizations of the data to explore relationships and correlations, drilling down and filtering to whatever level of detail they require.
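As a rough illustration of that pattern – not Spotfire's implementation, and with a hypothetical file and column names standing in for a real well-data export – the sketch below simply loads raw measurements into memory and lets an analyst filter and drill down without any pre-built dimensions or aggregations:

```python
import pandas as pd

# Hypothetical raw export of daily well measurements; no schema design,
# no pre-built hierarchies or aggregations -- just load it into memory.
wells = pd.read_csv("daily_well_measurements.csv",
                    parse_dates=["measurement_date"])

# Ad hoc drill-down: filter to one field, then summarize by well and month.
field = wells[wells["field_name"] == "Field A"]
monthly = (field
           .groupby(["well_id", field["measurement_date"].dt.to_period("M")])
           .agg(oil_bbl=("oil_rate_bbl_d", "sum"),
                water_cut=("water_cut_pct", "mean")))

print(monthly.head())
```

The point of the pattern is that the grouping and filtering choices are made at analysis time, not baked into the reporting structure up front.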

Mighty mash-up

The source of data gives us another problem – one of combining or "mashing up" data sources from disparate systems in order to gain insight. The data may be coming from corporate systems but increasingly may also be sourced from service partners in the field. Data discovery tools such as Spotfire allow users to quickly mash up data without waiting weeks for assistance from programmers. Chevron, for example, combines its geologist-produced water injection modeling data with operating partners' real-time production data so that the water flow can be adjusted accordingly.
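A minimal sketch of the mash-up idea follows; the file names, feed format, and columns are hypothetical stand-ins for an in-house injection model and a partner's production feed, not Chevron's or Spotfire's actual workflow:

```python
import pandas as pd

# Hypothetical sources: an in-house water injection model and a partner's
# real-time production feed, keyed on well and date.
injection_model = pd.read_csv("injection_model.csv", parse_dates=["date"])
partner_production = pd.read_json("partner_production_feed.json",
                                  convert_dates=["date"])

# Mash-up: join the two sources so modeled injection targets can be
# compared with what the operating partner is actually delivering.
combined = injection_model.merge(partner_production,
                                 on=["well_id", "date"], how="inner")

# Flag wells where actual injection drifts from the modeled target.
combined["injection_gap"] = (combined["target_injection_bbl_d"]
                             - combined["actual_injection_bbl_d"])
alerts = combined[combined["injection_gap"].abs() > 500]
print(alerts[["well_id", "date", "injection_gap"]])
```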

This is a sophisticated example of where data mash-up can produce real efficiencies, and this is becoming critical in an industry where experienced people are being lost and the risks associated with outsourcing and partnering need to be mitigated. If E&P companies can no longer rely totally on their own knowledge and expertise, then they must rely more on collating and analyzing the data in a holistic and meticulous fashion.

Data are for sharing — so is analysis

If data mash-up is the key input to finding new trends in data, then it is the presentation of the analysis itself that will broaden understanding. Put simply, methods are needed that facilitate insight. Without this, any attempt to collaborate will fail. Initially, Chevron's production partners were reluctant to collaborate. It was not until they saw, and more importantly understood, the results that they were bowled over.

And this is not just true of sharing with partners. What about the disconnects in individual businesses? Back and front office collaboration certainly can be aided through a common context, and so can collaboration with downstream units wanting to secure product for their refineries and meet the demands of end customers. This contextual collaboration goes a long way to promoting a holistic view of the business, and at Chevron the use of Spotfire has grown to more than 5,000 individuals across multiple use cases.

This is an area where dedicated analytics will always win over Microsoft Excel. Excel is a "personal" productivity tool, not a tool built with collaboration in mind. An analysis can be produced, but its development over time and its versions, rationale, and formulae cannot be easily understood by anyone but the author. Tools such as Spotfire allow the data analyst to build analytic apps that can be easily run by the executive in a web browser or on a mobile device.

Spatial patterns of EUR can be viewed across any selected group of wells.

Do they understand?

This subject of accessibility is important for another reason. For decades the trend has been to run businesses more scientifically, yet there is often a disconnect between the scientific method and the business decision-maker. When it comes to the fine detail of analysis, companies start to rely on experts, folks well-versed in statistical methods. Spotfire brings the world of the statistician into the visualization of business data. Prepackaged analytic capabilities can be used with both static data and real-time feeds to predict the outcomes of actions and offer immediate suggestions for improvement, remediation, or risk mitigation.

For instance, using decline curves, Spotfire can readily estimate well production and estimated ultimate recovery (EUR) while producing an analysis that the engineer can work with – for example, being able to graphically "remove downtime" or see the expected yield of a new well in a particular field. The data analyst can create easy-to-use methods that explain the model and allow the non-statistician to interact with it in a jargon-free way.
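For readers who want a feel for the underlying arithmetic, here is a minimal decline-curve sketch using an Arps exponential fit. It is not Spotfire's prepackaged method, and the rate data are purely illustrative:

```python
import numpy as np
from scipy.optimize import curve_fit

# Arps exponential decline: q(t) = qi * exp(-D * t)
def exponential_decline(t, qi, D):
    return qi * np.exp(-D * t)

# Illustrative monthly production rates (bbl/d) for one well.
t = np.arange(24)                                          # months on production
q = 950 * np.exp(-0.035 * t) + np.random.normal(0, 10, t.size)

# Fit the initial rate qi and nominal decline D from the observed rates.
(qi, D), _ = curve_fit(exponential_decline, t, q, p0=(1000, 0.05))

# EUR down to an economic limit q_ec: cumulative production = (qi - q_ec) / D,
# converted from bbl/d-months to barrels (~30.4 days per month).
q_ec = 50.0
eur_bbl = (qi - q_ec) / D * 30.4
print(f"qi = {qi:.0f} bbl/d, D = {D:.3f} per month, EUR ~ {eur_bbl:,.0f} bbl")
```

In a tool like Spotfire the same idea sits behind an interactive curve the engineer can adjust visually, for example by excluding downtime periods before the fit is rerun.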

Keep it visual

People are turned off by columns of numbers, and in an industry as physical as oil and gas exploration the desire has always been to represent data in a graphical format – for example, geospatial representations of fields and wells. But true understanding of patterns comes when they can be overlain with representations of other data coming from the field.
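As a toy illustration of that kind of overlay – hypothetical coordinates and EUR values, not a Spotfire workflow – a spatial view of wells colored by estimated ultimate recovery might be sketched like this:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical well locations and per-well EUR estimates.
rng = np.random.default_rng(7)
easting = rng.uniform(0, 10_000, 40)     # meters
northing = rng.uniform(0, 10_000, 40)    # meters
eur_mbbl = rng.uniform(50, 400, 40)      # thousand barrels

# Overlay the EUR values on the spatial layout of the wells.
plt.figure(figsize=(6, 5))
sc = plt.scatter(easting, northing, c=eur_mbbl, s=60, cmap="viridis")
plt.colorbar(sc, label="EUR (Mbbl)")
plt.xlabel("Easting (m)")
plt.ylabel("Northing (m)")
plt.title("Well locations colored by estimated ultimate recovery")
plt.show()
```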

The importance of these visualizations cannot be overstressed. Fernanda Viegas of IBM Research said, "Basically, half our brain is hard-wired for vision. Vision is the biggest bandwidth that we have in terms of sensory information to the outside world. So visualization is taking advantage of the fact that we are so programmed to understand the world around us in terms of what we see."

The DOF is becoming the graphically visualized and analyzed oil field. There's more to exploration than hydrocarbons – there are also new relationships in all those data to explore and new business opportunities to discover.