While DNV GL’s Energy in Transition report focused on the market, demand and cost pressures facing the industry, it also offered some interesting thoughts on the role technology will play in the upstream sector.

According to the report, the world is approaching a watershed moment: energy demand is set to plateau from 2030, driven by greater efficiency and the wider application of electricity. A rapid decarbonization of the energy supply is underway, with renewables set to make up almost half of the energy mix by 2050, although gas will become the biggest single source of energy.

“We have seen impressive and important innovative efforts across the energy industry, resulting in cost saving and efficiency gains,” said Elisabeth Tørstad, CEO, DNV GL–Oil & Gas. “The oil and gas industry must continue on a path of strict cost control to stay relevant. Coming from a tradition of technological achievements, and having the advantage of existing infrastructure and value chains, this industry has the potential to continue to contribute to energy security and shape our energy future.

“Increased digitalization, standardization and remote or autonomous operations will play a central role in achieving long-term cost savings and improving the oil and gas industry’s carbon footprint. We also expect the industry to turn to innovations in facility design, operating models and contracting strategies.”

Benefits Of Digitalization

According to Tørstad, digitalization will come in many guises and bring many benefits to the industry, including greater automation and direct cost and manpower reductions. “It’s the opportunities to optimize production in a different way,” she added. “What we see in very detailed technical situations is that you can actually use digital information to make the right decisions for your operation.

“You see it very clearly in projects: oil and gas is one of the industries lagging furthest behind in using digital tools through project development. There are huge opportunities for improvement by having all the suppliers and the supply chain working together on the same solutions.

“Some of the work that we are doing with Siemens on the PLM [product lifecycle management] system is enabling a much easier flow of information between partners and showing that everybody’s working on the same models, the same information, at the same time. Some of these are things we could, as an industry, have learned before, but we’re lagging behind and we need to take it on board.”

A key tool in digitalization is the digital twin: a dynamic digital representation of an industrial asset that enables companies to better understand and predict the performance of their machines, find new revenue streams and change the way their businesses operate.
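To make the concept concrete, the minimal Python sketch below shows the basic idea: a twin mirrors live sensor readings from a physical asset and flags drift between observed and predicted behavior. The asset, sensor names and thresholds are hypothetical illustrations, not DNV GL’s or Siemens’ implementation.

from dataclasses import dataclass, field
from statistics import mean

@dataclass
class PumpTwin:
    """Minimal digital-twin sketch: mirrors live sensor readings from a
    physical pump and flags drift from the model's expected behavior."""
    expected_flow_m3h: float               # design-basis flow rate (hypothetical)
    tolerance: float = 0.05                # allowed fractional deviation
    readings: list = field(default_factory=list)

    def ingest(self, flow_m3h: float) -> None:
        """Mirror one live sensor reading into the twin's state."""
        self.readings.append(flow_m3h)

    def health_check(self) -> str:
        """Compare recent observed behavior against the model's prediction."""
        observed = mean(self.readings[-10:])  # smooth over the latest samples
        drift = abs(observed - self.expected_flow_m3h) / self.expected_flow_m3h
        return "degraded: schedule inspection" if drift > self.tolerance else "nominal"

# Simulated stream: the same model can be exercised at the project stage,
# before the physical asset exists, or fed live data in operations.
twin = PumpTwin(expected_flow_m3h=120.0)
for reading in (119.4, 118.9, 112.0, 108.5):
    twin.ingest(reading)
print(twin.health_check())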

“It’s coming with some of the automation, and it is coming in two ways—the project stage and the operational stage,” Tørstad explained. “With projects, a digital twin allows you to work on the quality assurance, the modeling, the testing, before you have the actual physical asset.

“Subsea operations are probably one of the first areas where it’s really useful, relevant and being implemented; it’s a natural use as those processes are already automated,” she said. “There’s real work now on not only digital twins, but digital twins where models can work together. Very often you have one twin with software or algorithms working on that twin; now we’re looking at models that are interacting in a different way, so it’s coming.”

Delivering Value From Data

While Tørstad believes that the industry is on a path of increased digitalization when it comes to elements of project management, she feels the sector is at a crossroads when it comes to fully realizing the benefits of automation and artificial intelligence.

“It’s more complex when you get to the intuitive elements,” Tørstad said. “We’re replacing people with data when it comes to getting the information, but we’re still making the decisions on a human level. More of that will also be changed going forward where decisions are made automatically. It’s fascinating because even if we know that there’s a lot of human error involved in everything that goes wrong, we still seem to trust people more than machines right now. I think the whole price of oil is driving it; it comes as one of the key opportunities we have to be efficient and make money. When you have prices of more than $100 per barrel it’s not necessary, but now it is.”

One of the key challenges that the industry is wrestling with is how to make the best use of the mountains of data it gathers and how this can be incorporated into better decision making.

“I think it’s a fascinating area, because data acquisition from sensors and live data streams will be incorporated into decision making,” she continued. “But we’re still at a stage in the industry, and in particular on the existing operating assets, where we’ve had the data but the data is in a drawer.”

She added, “There’s a lot of bad data around, and it takes quite some effort to release this data. It continues to be trapped in old systems and old ways of gathering and collecting. Some of the work we’re doing with quite a few companies these days is to look at their data assets and see what they have and what’s priority number one. We think there is a lot of value in releasing that data, but it’s a gradual process.

“It’s easy to see that coming into new installations. We are at a point where we have a much better grip than two or three years ago on what kind of systematics, what kind of ontology, what kind of tagging in sensors and which systems need to talk together. But on the existing ones it’s a bit of a risk-reward or cost-benefit discussion on what to release and what not to release.”
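As a rough sketch of that tagging and ontology point, the snippet below normalizes legacy sensor tags onto one shared vocabulary so that systems can talk together. The tag names, mapping and record format are invented for the example; a real project would adopt an agreed industry ontology rather than an ad hoc dictionary.

from typing import Optional

# Hypothetical map from legacy tag spellings to one shared vocabulary.
TAG_ONTOLOGY = {
    "P-101-FLW": "pump.p101.flow_m3h",
    "P101_FLOW": "pump.p101.flow_m3h",  # same sensor, an older system's spelling
    "TT-204": "separator.s204.temperature_c",
}

def normalize(record: dict) -> Optional[dict]:
    """Map one legacy reading onto the shared tag vocabulary.

    Returns None for unmapped ("trapped") tags so they are surfaced
    for review rather than silently mixed into decision making."""
    canonical = TAG_ONTOLOGY.get(record["tag"])
    if canonical is None:
        return None
    return {"tag": canonical, "value": record["value"], "ts": record["ts"]}

# Example: one recognizable legacy row and one that stays trapped.
for row in [
    {"tag": "P101_FLOW", "value": 118.9, "ts": "2018-05-01T08:00:00Z"},
    {"tag": "XX-UNKNOWN", "value": 7.2, "ts": "2018-05-01T08:00:00Z"},
]:
    print(normalize(row))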