The third and final installment of this series describes the technology components of the Intelligent Oilfield (IOF). Technology here means much more than computing power, although all the technology discussed hereafter relates to gathering, transmitting, using, storing and accessing data and information. For the purposes of this article, the scope of IOF-related technology begins at the end device that collects the signal for a physical variable, such as pressure or temperature. Traditional oilfield technologies, such as 4-D seismic or stimulation techniques, are considered out of scope.

In the IOF, data and information are the lifeblood of operations, with rates, cuts, pressures and temperatures as the most basic data points. Many companies today collect data, at varying frequencies and latencies, for “real time” decision-making. A genuine IOF approach demands that all valuable data streams be gathered, delivered and analyzed at a compatible frequency. Adequate data capture can be relatively simple in new “green fields,” but it is more demanding in mature “brown fields,” particularly in areas with a high density of wells.

Today, technical professionals may spend as much as 60% of their time managing data before they can focus on more critical, “value-added” work such as analysis and decision-making. Additional data may exacerbate the problem. Thus, a potential unintended consequence of focusing on only one component of the IOF framework to the exclusion of others is poorer productivity and higher costs.

Data gathering and control, data management and infrastructure, and integrated systems and applications are the key IOF technology components; as data passes through each, it is turned progressively into actionable, usable information and knowledge. The ultimate value of the collected data is realized once it affects, or is affected by, upstream work processes and individual and group behavior.

As each of these components is translated into specific projects, business cases need to be developed. It can be very difficult (and possibly invalid) to create a specific business case for each point solution. However, bundling individual projects or point solutions into a larger program or system can yield more credible and accurate estimates.

Selecting & capturing the right data

Selecting the right variables to measure can be obvious (such as production rates) or elusive (such as with complex machinery). The proper collection frequency can be as critical as the selection of the variable, as collection, transmission and storage costs all vary with it. The easy answer during the design of a new (“green field”) project is to measure everything possible at the highest frequency possible; with today’s instrumentation costs, the overall effect on large project economics is minimal, and wireless devices capable of collecting and transmitting data reduce instrumentation costs even further. Over time, however, data capture frequency should be evaluated and modified based on two criteria (see below), particularly as data volumes grow.

Data should be collected at a frequency driven by two factors: the frequency at which evaluation and action can reasonably be undertaken (the “impulse-response” time) and the frequency at which long-term analysis will yield key insights.

Still, a philosophical dilemma surrounds the data frequency decision. Even when no immediate action can be taken (e.g., choke or valve position changes, workovers), there may still be a benefit to collecting and analyzing the data and recognizing that an actionable need exists. The key is to have a process that re-evaluates the appropriate data and collection frequency at some regular interval.
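
As a simple illustration, the Python sketch below shows how a capture interval might be derived from these two criteria. The variable names and figures are hypothetical, not field guidance.

```python
# Minimal sketch (hypothetical values): choose a capture interval from the
# two criteria discussed above. All figures are illustrative only.

def capture_interval_seconds(impulse_response_s: float,
                             analysis_resolution_s: float,
                             samples_per_cycle: int = 10) -> float:
    """Return a sampling interval short enough to support both
    operational response and long-term analysis."""
    # Sample several times within the fastest "impulse-response" window
    # so an operator (or an automated rule) can still react in time.
    operational_limit = impulse_response_s / samples_per_cycle
    # Long-term studies rarely need more resolution than the finest
    # trend they are expected to resolve.
    analytical_limit = analysis_resolution_s / samples_per_cycle
    return min(operational_limit, analytical_limit)

# Example: a choke change can be evaluated within an hour, while decline
# analysis works on daily trends; the operational need dominates.
print(capture_interval_seconds(impulse_response_s=3600,
                               analysis_resolution_s=86400))  # 360.0 seconds
```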

With so much data streaming from so many end devices at such frequent intervals, it is important to ensure that all real-time data is leveraged as fully as possible as it is collected. Otherwise, it may be considered perishable and of lesser value. Through the use of smart analytics and business intelligence tools, incremental business value can be realized as the labor-intensive human factor is removed from the initial analysis. Powerful analytics are simply the collective wisdom of the organization encoded into a Supervisory Control and Data Acquisition (SCADA) or Distributed Control System (DCS). Notifications or alerts can then be reviewed and analyzed by humans once an exception is identified.
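
As one hypothetical illustration of this exception-based approach, the sketch below encodes simple operating limits and surfaces only the out-of-limit readings for human review. The tag names and thresholds are assumptions made for the example, not from any particular system.

```python
# Minimal sketch of exception-based surveillance: encode simple operating
# limits (hypothetical tags and thresholds) and surface only the exceptions
# for human review, rather than every raw reading.

LIMITS = {
    "WELL_A1.tubing_pressure_psi": (500, 2500),   # (low, high) alarm limits
    "WELL_A1.wellhead_temp_degF":  (60, 220),
}

def check_exceptions(readings: dict) -> list[str]:
    """Return alert messages for any reading outside its configured limits."""
    alerts = []
    for tag, value in readings.items():
        low, high = LIMITS.get(tag, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            alerts.append(f"{tag}: {value} outside [{low}, {high}]")
    return alerts

# Only out-of-limit readings reach the engineer's notification queue.
print(check_exceptions({"WELL_A1.tubing_pressure_psi": 2750.0,
                        "WELL_A1.wellhead_temp_degF": 180.0}))
```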

There has also been much publicity about video, but it has yet to be fully realized as a must-have surveillance tool. However, it can be used effectively to increase the reach of a limited human resource base, reduce human exposure and increase safety.

Moving and managing

Once gathered, IOF data must be transmitted, sometimes over vast distances. Increased collection frequency, including video, greatly increases bandwidth requirements, which in turn demands a robust data management approach. Much of the initial evaluation of data from end devices can be accomplished by remote terminal units (RTU) or programmable logic controllers (PLC) as part of the data gathering and control component. However, much of the data must still be transferred from the field, or data source, to a more permanent data store for more complex analysis at the field or reservoir level and over longer periods of time.

It is important to note that the transmission approach, both from the end device to the control unit and from the control unit to a central data store, must be able to accommodate these increasing data volumes. Some form of compression may be required for the vast amounts of real-time data, and not all of it needs to be retained at full resolution for longer-term storage. The good news is that expanding satellite communications and fiber-optic networks improve transmission capacity in almost all scenarios.
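
One common way to reduce long-term storage volumes is to roll high-frequency readings up into averaged intervals before they leave the field. The sketch below is illustrative only; the field names and the one-minute roll-up interval are assumptions for the example.

```python
# Minimal sketch: reduce a high-frequency stream to averaged intervals before
# long-term storage, one simple form of "compression" for historian data.

from statistics import mean

def downsample(samples: list[tuple[float, float]],
               bucket_seconds: float = 60.0) -> list[tuple[float, float]]:
    """Average (timestamp, value) samples into fixed-width time buckets."""
    buckets: dict[int, list[float]] = {}
    for ts, value in samples:
        buckets.setdefault(int(ts // bucket_seconds), []).append(value)
    return [(b * bucket_seconds, mean(vals)) for b, vals in sorted(buckets.items())]

# One-second pressure readings rolled up to one-minute averages for storage.
raw = [(t, 1500.0 + (t % 5)) for t in range(0, 180)]
print(downsample(raw))  # three one-minute averages
```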

Attaining one source of truth

Robust, repeatable decisions are enabled by data that does not depend on its storage medium or location. Currently, data sets in most upstream companies are duplicated in a variety of databases: field-based historians, application-specific databases and corporate data warehouses. Each has its own use and purpose, yet their separation inevitably leads to fragmentation and inconsistency of the data sets. These differing data sets, compounded by inaccurate measurement at the source, can yield different decisions. This decision variability destroys value, not to mention the excessive effort users expend to secure and manage these various data sets and the time wasted reconciling data between sources.

Integration frameworks such as Service-Oriented Architectures (SOA) and middleware allow companies to publish this “single version of the truth” regardless of the applications that rely upon the data. Whatever the integration methodology, it is critical to collect sensor data in a consistent and standardized way, with consistent naming standards, a standard taxonomy and ready access for all “consuming” applications in the organization that need the data.
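
As a small illustration of what such standardization can look like, the sketch below re-keys vendor- or site-specific tag names onto one canonical naming scheme before the data is published. The taxonomy and tag names are invented for the example and do not represent any industry standard.

```python
# Minimal sketch: map vendor- or site-specific tag names onto one canonical
# naming standard before the data is published to consuming applications.
# The taxonomy and tag names below are illustrative assumptions.

CANONICAL_TAGS = {
    "A1_THP":     "FIELD_X.WELL_A1.TUBING_HEAD_PRESSURE",
    "wellA1.thp": "FIELD_X.WELL_A1.TUBING_HEAD_PRESSURE",
    "A1_WH_TEMP": "FIELD_X.WELL_A1.WELLHEAD_TEMPERATURE",
}

def to_canonical(readings: dict) -> dict:
    """Re-key raw readings to canonical names so every application sees
    the same 'single version of the truth'."""
    canonical = {}
    for raw_tag, value in readings.items():
        tag = CANONICAL_TAGS.get(raw_tag)
        if tag is None:
            raise KeyError(f"Unmapped tag: {raw_tag}")  # force explicit mapping
        canonical[tag] = value
    return canonical

print(to_canonical({"A1_THP": 1820.0, "A1_WH_TEMP": 175.0}))
```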

Conclusion

The IOF can be an enormous catalyst for change; it drives maximum efficiency from an oil company’s existing technologies and processes and may even force some re-evaluation and consolidation of information-related systems. In an IOF environment, applications need to interact with each other and use the same data. Users across the global workforce and in remote operations centers need a standard set of tools. Otherwise, they may find themselves mired in a patchwork quilt of processes and technologies, with various groups using different tools and arriving at different decisions.