Oil and gas companies are the original big data users. Data drive every facet of the oil and gas industry – targeting geographies, land acquisition and leasing, E&P, and operations. The industry has always invested in big data, from seismic analysis to risk reduction to production optimization.

Data alone, however, do not deliver the answers companies need to ensure profitability and safety. These companies require solutions that enable them to accelerate speed to answer. Therein lies the true potential of big data for the industry and the key to achieving an intelligent enterprise.

Increasingly, companies are focused on achieving intelligent operations to enable real-time analysis of drilling and production data streams. As companies begin to examine data management, many face challenges such as disparate data sources and the lack of analytical capabilities. New approaches such as high-performance data management are emerging that enable oil and gas companies to harness the power of big data to improve speed to answer.

Data challenges abound

Oil and gas companies need to improve their ability to capture massive amounts of data, provide insights through data analysis, and make better operational, tactical, and strategic decisions. This, in turn, enables operations personnel to reduce nonproductive time (NPT), optimize output, reduce costs, and improve safety.

Business intelligence (BI) systems have been part of the oil and gas industry for many years. Companies have invested in traditional BI and analytics solutions that have enabled them to consolidate financial and operations data, replace extensive manual reports with dashboards to support all horizontal business processes, provide users with ad hoc reporting capabilities, and create a centralized financial budgeting process with increased expense visibility. There are, however, a number of issues that prevent oil and gas companies from effectively and efficiently achieving these objectives.

The first issue relates to data. Massive amounts of technical, operational, and financial data are stored in disconnected systems, precluding visibility across stakeholders and preventing a 360° view of the operation.

The second challenge involves the need for expanded analytical capabilities. Isolated metrics and key performance indicators such as production information provide a limited picture of the assets. Further, members of the asset team have a limited ability to analyze data to achieve the best insights on how to optimize drilling and production operations.

Companies also face execution challenges. They are under growing pressure to make more real-time decisions about increasingly complex wells with fewer, less-experienced personnel. In many cases data are stored in multiple systems, and stakeholders in the field and at corporate headquarters often find themselves unable to leverage enterprise data to improve the timeliness and quality of well-level decisions.

The promise of accelerating speed to answer

Accelerating speed to answer provides clear benefits in terms of operational efficiency and lower costs. For example, a company that reduces its speed to answer for production and operational data from 6.5 hours to 2 minutes can increase production and lower costs while maintaining safe operations.

Oil and gas companies have the opportunity to answer questions in seconds rather than days and in days rather than months. This acceleration, in turn, can allow businesses to answer questions that have resisted analysis, develop test-and-learn processes that quickly adapt to the operating environment, and automate complex workflows.

To capitalize on the potential of faster speed to answer, oil and gas companies need to reduce the lag time between data capture and data analysis as well as the time between data analysis and decisive action.

To become more proactive, companies must eliminate the disconnected and manual systems that prevent them from acting on the information they already collect. Companies require systems that capture and analyze operational data in real time and can support “right-time” decisions and seamless decision execution. Operators need to implement this framework across reservoir characterization, drilling, and production.

Oil and gas companies also require a rapid analysis process that empowers them to react quickly to the real-time drilling, production, and other operational data flowing into their systems. The data can be generated by sensors on a well, entered manually, or come from partners or contractors working on the well. The analysis should be further enhanced by leveraging data housed in the enterprise data warehouse, including static and real-time data across existing and historical wells as well as enterprise information such as financial data. Those data can provide context that enables the asset team to generate timely and proactive operational recommendations, such as the type of well intervention to perform. Asset teams can then upload the drilling or production data and analysis results into the data warehouse to continuously enrich the value of those data. This provides the foundation for a continuous improvement loop that enables the asset team to optimize well operations.
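As a rough illustration of this screen-and-enrich loop – a minimal sketch, not a vendor implementation, in which all tag names, thresholds, and sample values are hypothetical – incoming readings could be screened against a baseline derived from the historical archive and then folded back into it:

```python
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class Reading:
    well_id: str
    tag: str        # e.g. "tubing_pressure_psi" (illustrative tag name)
    value: float

# Stand-in for a query against the enterprise data warehouse:
# historical values for each tag across existing and historical wells.
HISTORY = {"tubing_pressure_psi": [2510.0, 2495.0, 2502.0, 2488.0, 2507.0]}

def screen(reading: Reading, history: dict, z_limit: float = 3.0) -> bool:
    """Flag a reading that deviates more than z_limit standard
    deviations from the historical baseline for its tag."""
    base = history[reading.tag]
    mu, sigma = mean(base), stdev(base)
    return abs(reading.value - mu) > z_limit * sigma

def enrich(reading: Reading, history: dict) -> None:
    """Fold the new reading back into the archive so future screening
    uses the enlarged baseline -- the continuous improvement loop."""
    history[reading.tag].append(reading.value)

r = Reading("W-101", "tubing_pressure_psi", 2950.0)
if screen(r, HISTORY):
    print(f"{r.well_id}: {r.tag} anomaly at {r.value}")  # candidate for intervention review
enrich(r, HISTORY)
```

In a production setting the baseline would come from warehouse queries rather than an in-memory dictionary, but the shape of the loop – screen against history, act, write back – is the same.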

The enterprise data warehouse should contain its own data analysis capabilities that the asset team can leverage to drive tactical and strategic decisions. Drilling operations will always use familiar master datastore data types – seismic data, logs, well tests, and pressure-volume-temperature analysis – as well as production surveillance workflows for analysis. Being able to analyze a multiwell archive of real-time data in the same environment, using the same analysis tools, opens the door to new and potentially unforeseen analyses over longer timeframes.

Historically, most producers have focused their data management improvement projects on the reservoir characterization process, and these initiatives generally make it easier for users to find the data they require. Many of these systems, however, cannot scale – in terms of volume, data model expansion, and performance – to accommodate rapid acceleration in data acquisition. The exponential growth of data volume in seismic, well logging, and interpretation results is well known. When real-time data are added to the characterization process, the problem becomes substantially more complex.
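To make the multiwell-archive idea concrete – purely as a hedged sketch, with made-up well names, rates, and a hypothetical 75% threshold – a flattened warehouse extract can be grouped per well and screened against the field average:

```python
from collections import defaultdict
from statistics import mean

# Flattened warehouse extract: (well_id, daily_rate_boed) rows across wells.
# All identifiers and values are illustrative sample data.
rows = [
    ("W-1", 820.0), ("W-1", 810.0), ("W-1", 805.0),
    ("W-2", 430.0), ("W-2", 425.0), ("W-2", 410.0),
    ("W-3", 790.0), ("W-3", 800.0), ("W-3", 795.0),
]

def underperformers(rows, threshold=0.75):
    """Return wells whose mean rate falls below threshold * field mean,
    where the field mean is taken over per-well means."""
    by_well = defaultdict(list)
    for well, rate in rows:
        by_well[well].append(rate)
    well_means = {w: mean(v) for w, v in by_well.items()}
    field_mean = mean(well_means.values())
    return sorted(w for w, m in well_means.items() if m < threshold * field_mean)

print(underperformers(rows))  # ['W-2']
```

This is the kind of cross-well comparison that only becomes routine once archived real-time data and historical well data live in the same analytical environment.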

Data model scalability issues relate to the ability of legacy systems to handle new types and formats of data such as special core analysis, microseismic, geochemical, electromagnetic, gravity, paleo, distributed temperature, and vertical seismic profile data. These datasets are typically stored in their native format on tape or locked within the siloed applications used to analyze them.

The final dimension that presents a challenge is performance. Existing systems simply do not have the computing horsepower to conduct the kind of continuous analysis an optimized enterprise would need.

The answer

The solution would be to implement a system that enables organizations to:

  • Analyze massive amounts of real-time data;
  • Make intelligent recommendations based on everything an operator knows about the asset; and
  • Deploy the solution at a low total cost of ownership using industry standards to promote internal and external collaboration.

The key is to use a commercial off-the-shelf solution that can be deployed leveraging an organization’s existing distributed control system (DCS) and SCADA investments. To provide the flexibility that operators require and to further extend technology investment, the software should be deployable on top of a rig-site aggregator (for drilling data), a historian (for production data), or directly on top of the DCS or SCADA system. In other words, the software should provide additional capabilities on top of an organization’s instrumentation layer. A solution should include:

  • The capability for real-time drilling and production data to be converted to an industry standard-based WITSML or PRODML data feed and loaded into a high-performance and highly scalable engineered infrastructure solution that will provide the ability to screen the incoming data using complex event-processing capabilities. The results of this real-time analysis can then be visualized by a business activity monitoring web application;
  • The ability for data to be stored in a data warehouse appliance based on a public petroleum data model or whichever proprietary data model the oil and gas company prefers. The data warehouse should contain all well-based and subsurface operational, technical, and financial data necessary to optimize drilling and production operations, including traditional data such as well logs or financial information;
  • The flexibility to support in-database analytical tools, which the asset team will be able to use to drive tactical and strategic recommendations;
  • Real-time database replication for data stored on a distributed database network. For example, the norm for global enterprises is to store the “master” copy of a piece of data in the country where the affiliate operates and to replicate the local database back to headquarters or a research center; and
  • A robust analytical suite that delivers “speed-of-thought” insight by leveraging in-memory analytical capabilities for extreme performance. This BI environment is where the financial aspects of the enterprise are brought together with the technical aspects to enable any engineer or operations person to make an appropriate decision.
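To illustrate what the complex event-processing step above does beyond simple thresholding – again a minimal sketch with hypothetical tag values and window size, not a reference to any particular CEP engine – consider a rule that fires only on a pattern across consecutive samples of a real-time feed:

```python
from collections import deque

class RisingTrendDetector:
    """Minimal complex-event-processing rule: raise an event when the
    last `window` samples of a tag are strictly increasing -- a pattern
    that no single-sample threshold check would catch."""

    def __init__(self, window: int = 4):
        self.buf = deque(maxlen=window)

    def push(self, value: float) -> bool:
        """Feed one sample; return True when the pattern is matched."""
        self.buf.append(value)
        if len(self.buf) < self.buf.maxlen:
            return False  # window not yet full
        samples = list(self.buf)
        return all(a < b for a, b in zip(samples, samples[1:]))

det = RisingTrendDetector(window=4)
feed = [1480.0, 1479.5, 1481.0, 1483.2, 1486.9, 1492.4]  # simulated feed
for i, v in enumerate(feed):
    if det.push(v):
        print(f"sample {i}: sustained rise detected at {v}")
```

In the architecture described above, events like this one would be raised by the engineered infrastructure as WITSML- or PRODML-based feeds stream in, and surfaced through the business activity monitoring web application.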

Intelligence is one thing, but speed to answer is another – and both are essential for success in today’s oil and gas sector. By deploying a high-performance data management environment, oil and gas companies will be equipped with the real-time, enterprise-wide information and insight they require to optimize performance and reduce risk.