Information technology companies are stepping up to meet the challenge of handling large amounts of data, including in remote 3-D visualization environments.
Handling massive amounts of data has presented challenges in the digital oil field, where common applications such as email and databases must make room alongside highly technical applications such as seismic data processing and well planning.
Many operators are finding answers and learning to virtualize their resources with Cloud technology as IT companies tout advances such as remote 3-D visualization capabilities, the systems that support them, and other applications.
The topics were addressed recently during a Hart Energy E&P webinar, the first of three “Big Data and the Cloud” presentations bringing together oil and gas industry and IT speakers to discuss technology trends and innovations related to the challenges of complexity and data growth. The opening session of the three-part series focused on remote 3-D visualization.
“A lot of the components necessary to create an effective 2-D and 3-D visualization environment have existed separately for quite some time now,” said Stuart Lowery, business development manager of data management and infrastructure for Paradigm. He noted the presence of high-end workstations, large amounts of memory, high-end graphics cards, and fast network connections to shared network storage devices. “Sort of the game-changing event is the evolution of GPUs for both visualization and computation.”
That, coupled with flexible storage and storage virtualization, creates the building blocks to create a remote 3-D visualization environment, he said. “The new challenge is that 3-D graphics capabilities are pervasive and now seen as critical to E&P workflows. Systems and software are available to support this remote 3-D visualization.”
For example, there are widely available graphics capabilities on many platforms, powerful traditional desktops, emerging devices for mobile users, and massive compute power in data centers, he said. “Now we’re starting to see hybrid systems with coupled CPUs and GPUs and even single chips with both.”
Software is capable of combining data from multiple repositories and displaying it in 3-D on remote desktops, where it can be shared by multiple users. That could prove useful when handling large volumes of seismic data and well log images as well as raw data coming from the field. It takes on even greater importance given the industry’s mobility.
“The key is to pool the resources so they can be virtualized and tightly coupled in a secure manner,” Lowery said. “The data center of the future includes a shared data repository as reliable storage, compute power, and graphics that can be shared that are in close proximity to the storage network infrastructure that connects those. The real key here is they have to be scalable and flexible in providing a low latency that is required to support remote visualization in an interactive way for the user.”
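To put that latency requirement in concrete terms, the back-of-the-envelope budget below is a minimal sketch using illustrative figures (not numbers cited in the webinar) of how little headroom an interactive remote session leaves per frame when rendering, compression, transport, and decoding all happen away from the user’s desk:

```python
# Illustrative latency budget for interactive remote 3-D visualization.
# All per-stage costs below are assumed values for the sake of the example,
# not figures quoted by the webinar speakers.

TARGET_FPS = 30
frame_budget_ms = 1000 / TARGET_FPS          # ~33 ms available per frame

gpu_render_ms  = 10    # render the 3-D scene on the data-center GPU (assumed)
encode_ms      = 5     # compress the rendered frame for transport (assumed)
network_rtt_ms = 12    # round trip between data center and user (assumed)
decode_ms      = 3     # decompress and display on the client (assumed)

total_ms = gpu_render_ms + encode_ms + network_rtt_ms + decode_ms
headroom_ms = frame_budget_ms - total_ms

print(f"Frame budget at {TARGET_FPS} fps: {frame_budget_ms:.1f} ms")
print(f"Pipeline cost: {total_ms} ms, headroom: {headroom_ms:.1f} ms")
```

Under these assumed figures the pipeline consumes most of the frame interval, which is why Lowery stressed that the storage, compute, and graphics resources must sit in close proximity and be connected by low-latency network infrastructure.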
IT experts also are uncovering workflow trends.
The trend for GPUs in the workflow is moving from the graphics side for interpretation toward seismic processing and flow simulation, and efforts are underway to integrate the two with new Cloud technology, said Keith Cockerham, energy business development manager for NVIDIA. He noted that the hardware to handle large, expensive datasets has been secured; however, data sizes are outgrowing screen resolution.
To overcome the challenge, Cockerham said the company has performed compression and color space conversions on the hardware itself and improved its pixel-compression algorithms, among other steps. The company also has developed software to handle the growing size of datasets, giving some users their own GPUs and memory footprints while others – such as those who don’t require much graphics power – can share.
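As a rough illustration of the kind of work being pushed onto the hardware, the sketch below converts a rendered RGB frame to the YCbCr color space and subsamples the chroma planes, the first step in most pixel-compression schemes. In practice this runs on the GPU inside vendor software; the NumPy version here is only a conceptual stand-in, and the frame size and function names are assumptions for the example.

```python
import numpy as np

def to_ycbcr_420(rgb: np.ndarray):
    """Convert an RGB frame (H x W x 3, uint8) to YCbCr and 4:2:0-subsample
    the chroma planes -- the color-space conversion step that typically
    precedes pixel compression in a remote-visualization pipeline."""
    r = rgb[..., 0].astype(np.float32)
    g = rgb[..., 1].astype(np.float32)
    b = rgb[..., 2].astype(np.float32)

    # BT.601 full-range RGB -> YCbCr conversion
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128

    # 4:2:0 subsampling: keep every other chroma sample in each direction
    cb_sub = cb[::2, ::2]
    cr_sub = cr[::2, ::2]
    return y.astype(np.uint8), cb_sub.astype(np.uint8), cr_sub.astype(np.uint8)

if __name__ == "__main__":
    # A synthetic 1080p frame standing in for a rendered 3-D view
    frame = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)
    y, cb, cr = to_ycbcr_420(frame)
    raw_mb = frame.nbytes / 1e6
    ycc_mb = (y.nbytes + cb.nbytes + cr.nbytes) / 1e6
    print(f"Raw RGB frame: {raw_mb:.1f} MB, YCbCr 4:2:0: {ycc_mb:.1f} MB")
```

Even before any entropy coding, the chroma subsampling alone roughly halves the number of bytes that must cross the network for each frame.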
As data volumes grow along with the need for collaboration, user empowerment, and real-time information, the pressure is on IT to get it right, said Peter Ferri, energy industry director for NetApp. He pointed to operational challenges such as security and the time constraints posed by copying large amounts of data to local storage.
Also, “there is a proliferation of siloed infrastructure areas that is making it very difficult for companies to efficiently provision services or to make changes across multiple software architectures, with unintegrated management tools and the inability to fully leverage assets, which leads to increased costs,” he said. However, the company’s customers are moving from such rigid silos to a more service-oriented infrastructure that can adapt to change.
One solution involved working with Cisco on its FlexPod integrated system, which John Thomas, technical solution architect for Cisco, explained. The system’s features include extended memory for faster rendering, larger datasets, and more desktops per server, along with low latency and several other enhancements.
Nearly 40% of all large enterprises have virtualized their services, Ferri said.
At Hess Corp., for instance, NetApp has enabled the company to support four times as many 3-D seismic interpreters. The company increased efficiency by decreasing backup time, cutting data loading times from 20 minutes to one to two minutes, and reducing storage capacity needs by 30%, according to Ferri.
“The key enabler is virtualization,” he noted. “It’s an evolutionary process.”
The second part of the Big Data and the Cloud series, “Make more accurate prospect decisions in less time with a hybrid compute environment,” will cover GPU/CPU computing Nov. 1. The last session is set for Feb. 6, covering Big Data Analytics. To sign up, go to https://secure.oilandgasinvestor.com/webinars/?eventid=133&where=E%26P.
Contact the author, Velda Addison, at firstname.lastname@example.org.