Deloitte’s examination of the incentive to integrate sensing, communications and analytics technologies in the oil and gas industry a couple of years ago noted that “increased data capture and analysis can likely save millions of dollars by eliminating as many as half of a company’s unplanned well outages and boosting crude output by as much as 10% over a two-year period.”

Such promising statistics have obviously resonated within the industry.

Reuters recently reported on energy companies now “envisioning billions of dollars in savings.” The article included an interview with ConocoPhillips about its drilling in the Eagle Ford shale basin in Texas, where data analysis has led to “billions and billions of dollars” in savings, and noted ConocoPhillips’ use of TIBCO Spotfire to analyze data from well sites.

The coverage included Ernst & Young’s examination of 75 large oil and gas companies, which “found that 68% of them had invested more than $100 million each in data analytics during the past two years.” Further, “nearly three quarters of those firms planned to allocate between 6% and 10% of their capital budgets to digital technology.”

The need for enhanced business intelligence and operations is driving this surge, including access to new Internet of Things (IoT) methodologies for the collection and analysis of sensor data. Innovations in data science—including visual analytics, machine learning and geoanalytics—serve to unlock unprecedented value. Such IoT analytics align, aggregate and normalize sensor data to develop features that enable surveillance of operations; the resulting “in-time” interventions drive significant value to the business.

In practice, software representations of physical assets, processes and systems, known as digital twins, are used to complement the cycle of exploratory visual analytics, numerical encoding, and embeddable predictive models. This “insight-to-action” process provides statistical process control and intervention. First-order analytics implementations in energy companies target production surveillance, condition-based maintenance, environmental health and safety, and drilling optimization. These implementations enhance production by keeping systems running smoothly and preventing failures that shorten equipment life, directly impacting revenue and top-line growth while attending to safety and environmental concerns.

Consider a specific case of oil and gas production. Electrical submersible pump (ESP) systems provide reliable artificial lift and are deployed to optimize production in wells that produce the most oil. These big-production assets are typically highly instrumented. Sensors on ESPs constantly monitor electrical parameters, pump intake pressures, temperatures, surface pressures and more during operations.

Exploratory visual analysis can target degradation and non-productive time, and identify leading indicators for these problems, such as changes in pressure and current profiles in the ESP sensor data. For example, an increase in pressure and decrease in current can indicate plugged tubing or blockage somewhere in the flow.
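The pressure-up, current-down pattern described above can be sketched as a simple baseline comparison. This is a minimal illustration, not a production implementation; the window size and percentage thresholds are assumptions for demonstration, not field-calibrated values.

```python
# Sketch: flag a possible blockage when intake pressure trends up while
# motor current trends down, relative to a recent baseline window.
# Window size and thresholds are illustrative assumptions only.

def rolling_mean(values, window):
    """Mean of the trailing `window` readings."""
    return sum(values[-window:]) / min(window, len(values))

def blockage_indicator(pressure, current, window=12,
                       pressure_rise=0.05, current_drop=0.05):
    """Return True when the latest readings show the pattern described
    in the text: pressure rising while current falls."""
    if len(pressure) < 2 * window or len(current) < 2 * window:
        return False  # not enough history to establish a baseline
    # Baseline: the window of readings just before the most recent window.
    base_p = rolling_mean(pressure[:-window], window)
    base_c = rolling_mean(current[:-window], window)
    # Recent: the most recent window of readings.
    recent_p = rolling_mean(pressure, window)
    recent_c = rolling_mean(current, window)
    return (recent_p > base_p * (1 + pressure_rise) and
            recent_c < base_c * (1 - current_drop))
```

In practice such a check would run over aligned, normalized time-series features rather than raw lists, but the core comparison is the same.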

Data scientists can analyze historical data on pressure and current to develop rules and models that characterize the pressure and current profile as outlined above. These rules and models can be backtested to ensure they have a low false-positive rate and then pushed to the organization’s services architecture and streaming platform for application to incoming sensor data. As data moves through the system, violations of those rules and thresholds trigger notifications, which can range from urgent alerts to field staff indicating imminent equipment failure and immediate shutoff requirements, to less time-sensitive warnings concerning minor degradations in overall flow. The notifications can be stored and further mined to identify consistent “bad actors” in the operation, such as particular equipment components and third-party vendors, and to prioritize future strategies for further investigation.
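The rule-to-notification step might look like the following sketch: backtested rules expressed as predicates over a sensor reading, each mapped to a severity tier. The rule names, thresholds, and severity labels here are placeholders, not actual field configurations or any specific streaming platform’s API.

```python
# Sketch: apply threshold rules to an incoming sensor reading and emit
# tiered notifications. All rules and thresholds are illustrative.

RULES = [
    # (rule name, predicate over a reading dict, severity)
    ("intake_pressure_critical", lambda r: r["intake_pressure"] > 180.0, "urgent"),
    ("motor_current_low",        lambda r: r["motor_current"] < 30.0,    "warning"),
]

def evaluate(reading):
    """Return one notification per violated rule. Stored notifications
    can later be mined for recurring 'bad actors' in the operation."""
    return [
        {"well": reading["well_id"], "rule": name, "severity": severity}
        for name, predicate, severity in RULES
        if predicate(reading)
    ]
```

A streaming platform would invoke such an evaluation per event and route “urgent” notifications to field staff while queueing “warning” items for review.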

This process of remote equipment surveillance is how data-driven energy organizations can digitally manage equipment productivity, degradation and stoppages. The embedding of numerical models encoding such patterns enables continuous monitoring, decision support and automated interventions when degradation is detected, thus driving enormous operational efficiency improvement and ROI.

There are challenges. Companies must be able to access and wrangle their data in a way that is robust yet flexible to changes in their IT systems. Data pipelines must support anomaly detection on historical data as well as continuous improvement of models and rules to reduce false-positive rates while ensuring that true instances of degradation are identified. This includes models for trading off the cost of missing a degradation against the cost of investigating a notification. To manage notifications to resolution, a business process management system should be deployed to log identified issues and enable case management through to resolution.

It’s this combination of data access, data wrangling, analytics and intervention management that enables an organization to surface insights and take appropriate actions. Connecting data with analytic intelligence to target a business problem can result in massive savings to the operation. The increasingly digital energy industry is ripe for expanded implementations.

Michael O’Connell is chief analytics officer for TIBCO.