Over the past few years, as energy companies have attempted to plan and execute their projects, the operating environment has been anything but stable. In fact, volatility has been the one constant on which companies can rely. Given the mix of price uncertainty, environmental and regulatory hurdles, and the continuing challenge of overseas operating environments, some organizations have begun to move back into operating areas that provide a higher degree of stability. The irony is that this “rediscovery,” particularly in the US, has presented companies with a new set of challenges, especially when their operations are not geared to the realities or limits inherent in these fields.

Just as a country’s political climate or the realities of a challenging environment, such as deepwater drilling, will often dictate how a project is to be planned and executed, companies seeking more stable operating environments in the US must adapt the management of their projects. This is also true for those companies looking for a way to operate “leaner” – to execute in a more efficient manner. Unfortunately, for many companies, “lean” is often tied to cost cutting rather than improving operations to get the most for the money already allocated.

The data paradox
Technology constraints in US operations are not restricted to the limits of above- and below-ground oilfield equipment and tools, reservoir mapping techniques, and well logging technologies. Rather, US operations also have been significantly limited by something as basic as the poor use of operations-related data management processes and tools.

The Marshall-Teichert Group

Oil and gas companies are rich in data. They have abundant processes and systems in place to collect information, and operating budgets to fund IT investment. The disconnect in the use of information occurs because many operators do not take the next logical step and turn the data they are warehousing into useful business information.

Data is simply a collection of bits and bytes, important and unimportant facts and numbers. Information, on the other hand, is the result of turning all that data into something useful – something from which conclusions can be drawn and operations maximized. More specifically, it is the process of using the data to spot operating deficiencies, which is the basis for the identification of process improvements.

A common theme among a number of US operators, both supermajors and independents, is the tremendous volume of facts and data about individual wells, fields, and basins the companies have at their disposal. They have captured daily information about drilling and workover activities – sometimes to an extraordinary level of very specific detail. Moreover, many of the service companies with which they have contracted have collected additional details, often at the wellhead or rig, which remain largely untapped.

Operators have an enormous amount of information in hand, but most have not taken the next critical step of turning all that data into what could be a transformative business process with the potential to significantly optimize their operations.

New technologies, business processes, and tools have greatly improved in the upstream sector over the years. However, new technologies and tools alone do not necessarily bring the anticipated or promised benefits. Organizations often fail to give thoughtful consideration to how the tool is used and, instead, focus on the tool itself. When looking at how companies employ new tools, three observations can be made:
1. The tool or technology must first be tied to a bottom-line business objective. Data collection for data collection’s sake will not take an organization far.
2. Those using the tool or technology must see and realize the benefit. Why is the technology important to the individual user and the organization?
3. How the tool is used is more important than the tool itself. In other words, the elegance of the tool is not nearly as important as how effective the tool is in helping the organization optimize US operations.

Data warehousing to data use
If an organization is going to leverage the data tools at its disposal, key performance indicators (KPIs) that include cost, execution, and performance against targets must be developed and agreed upon by core functional groups within the organization with clear business objectives in mind.
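As a concrete illustration of what "developed and agreed upon" might look like in practice, the sketch below expresses KPIs as simple records with a target and an accountable owner. The metric names, units, targets, and functional groups are hypothetical examples, not figures from any operator; real values would come from the business plan.

```python
from dataclasses import dataclass

@dataclass
class KPI:
    name: str
    unit: str
    target: float
    owner: str  # functional group accountable for this metric

# Illustrative KPIs only -- names, targets, and owners are assumptions
kpis = [
    KPI("well_cost", "USD", 2_500_000, "Drilling"),
    KPI("cycle_time", "days", 21, "Operations"),
    KPI("production_uplift", "bbl/d", 150, "Reservoir"),
]

def against_target(kpi: KPI, actual: float) -> float:
    """Return actual performance as a fraction of target (1.0 = on target)."""
    return actual / kpi.target
```

The point of the structure is that each KPI carries its target and its owner together, so the measurement and the accountability cannot drift apart.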

Oftentimes, there is a robust set of metrics tied to HSE performance, while those tied to business objectives, such as cost, cycle time, and production uplift, are unclear or absent. The result is an organization collecting unreliable data in an attempt to make intelligent decisions, while failing to measure itself against those metrics that affect business performance.

If the business is aligned or has a system in place to measure performance, the right data fields can be identified along with the appropriate gathering criteria and protocols to populate the KPIs. Protocols are a critical element applied to data entry to ensure that the data being collected drives the KPIs. Initial training must take place at the entry level, with a heavy focus on what happens to that data once it is entered into the system. If team members cannot see how their data collection efforts are translated into data analysis, and how that data is used intelligently for future planning and execution decisions, data integrity will suffer.
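An entry protocol of the kind described can be sketched as a validation step that runs before data reaches the warehouse. The required fields and value ranges below are hypothetical stand-ins for a daily drilling report; an actual protocol would be derived from the agreed KPI definitions.

```python
# Hypothetical required fields and sanity ranges for a daily report entry;
# a real protocol would be derived from the KPIs the business has agreed on.
REQUIRED_FIELDS = {"well_id", "report_date", "daily_cost", "depth_ft"}
RANGES = {"daily_cost": (0, 5_000_000), "depth_ft": (0, 40_000)}

def validate_entry(entry: dict) -> list[str]:
    """Return a list of protocol violations; an empty list means the entry is clean."""
    errors = [f"missing field: {f}" for f in REQUIRED_FIELDS - entry.keys()]
    for field, (low, high) in RANGES.items():
        value = entry.get(field)
        if value is not None and not (low <= value <= high):
            errors.append(f"{field} out of range: {value}")
    return errors
```

Rejecting or flagging an entry at the point of input, rather than discovering bad values during analysis, is what keeps the KPIs trustworthy downstream.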

Outside of HSE, many organizations do not have the right level of accountability when it comes to project performance. If there is not a consistent process in place to examine cycle time, cost, production, and other performance metrics, the wrong decisions will be made in the attempt to maximize field life, especially in the US. If the right data is important to senior managers, it is important to their direct reports and so on, down to the field personnel that input the data. Senior management can achieve buy-in by demonstrating how the KPIs are being used to improve the planning and execution of project phases.

Ongoing training should be conducted at a face-to-face level to ensure clean data and solid understanding of the process. Often, initiatives fail over short periods of time as they are rolled out in a one-time training session with the directive of “utilize online support” when needed. Experience shows that online support is generally inadequate, if the end users can even find what they need. Moreover, there is a fundamental disconnect between the IT group and the field groups they are tasked with supporting. For many organizations, the IT group does not understand the operating realities project team members face. The fallback is data warehousing without a system to analyze information and make adjustments.

That system, including regular interval look-back reviews using the KPIs, should involve all groups associated with the outcome of the project (including vendors) to gauge performance against targets, learn from past performance, and plan for improvement. Improving business performance requires expectations to be examined against realities. A company must identify the gaps, work to close those gaps, and demand compliance from all levels to ensure sustainability of the KPI process.

Commitment to quality
Data utilization and the associated tools used to collect and disseminate it within an organization are critical to bridging the gap between planning and execution. Achieving this goal requires a commitment to examining information to evaluate performance and having accountabilities in place to ensure follow-through. If an organization is not committed to using its data to impact performance (instead of merely storing it in an in-house repository), the data will have limited impact on execution even in a "stable" operating environment.