The Parallels Fastlane Architecture provides unique Direct Assignment technology in the virtual environment, giving each virtual machine direct access to its own dedicated graphics card and network interface card and allowing graphics-intensive applications to run natively at full speed and capability.
When talk turns to “virtualization,” oil and gas industry geoscientists and engineers might be tempted to say the discussion is no concern of theirs, but rather the sole purview of the IT department, i.e., “them,” even though virtualization is a big part of today’s information technology (IT) landscape.

Yet Russ Sagert, North American technical adviser for Schlumberger Information Solutions (SIS), says the work the group has done with Intel, Hewlett-Packard (HP), NVIDIA, and Parallels Software to develop the first end-to-end parallel-architecture workstation for improved cross-platform virtual machine (VM) performance is really a story about optimizing workflows for better oilfield decision-making.

“In the oil and gas industry,” Sagert said, “because of the need to interpret, visualize, and perform uncertainty analysis on huge terabyte datasets, our software has always pushed the limits of hardware performance — computation speed, I/O, RAM, cache, and graphics. Historically, SIS software teams have worked closely with hardware vendors to represent these unique requirements, which are even more demanding in a virtual environment.”

Virtualization separates an operating system (OS) from the underlying hardware, and a virtual machine allows a single computer to be shared among several operating systems running at the same time. Thus, virtualization is an accepted means of combining on a single workstation applications that run on different operating systems. Yet for high-end applications, virtualization presents its own set of challenges.

That means that, until recently, the oil and gas industry — due to circumstances related to the evolution of computing within siloed functions — had to choose between two equally unsavory options: 1) not having the workstation performance or integration needed for optimal workflows, or 2) spending an enormous amount of money to get what was wanted.

As a result of the cross-vendor collaboration, SIS can now offer users the new HP Z-Series workstation, capable of running two separate operating systems concurrently, each running high-end petrotechnical applications — e.g., Petrel, for seismic-to-simulation modeling, on 64-bit Windows, and GeoFrame reservoir characterization software, on Linux — at near-native dedicated performance.

This kind of performance optimizes integration between the two applications, allowing for iterative real-time workflows that speed complex decision-making. And that’s something that should make geoscientists and reservoir engineers take notice.

The back story

According to Sagert, the geophysicists were the first to drive truly large-scale applications in oil and gas, followed by the reservoir engineers. “Given the performance constraints and functional organization, the constellation of software applications tended to develop in isolation,” he said.

As mentioned, the more recent trend has been to move away from serial processes toward a more collaborative, iterative approach based on best-in-class workflows.

“An example,” Sagert said, “might be work done to establish the range of possible production outcomes based on variability of the subsurface in an oil reservoir or field. Because of the many decisions and assumptions that need to be made in this type of an evaluation, there is never a single derived answer but a whole range, from the most pessimistic to the most optimistic.”

As a result, today the various domain experts are working to build literally thousands of possible “realizations.”
“It also means you’re moving away from point-specific applications toward integrated real-time risk and uncertainty applications,” Sagert said.
At the same time, to derive cost savings, companies have been moving from UNIX and Linux servers and workstations to 64-bit Microsoft boxes that also have the advantage of being on the same platform as the company’s business systems. Given a substantial installed base, the upshot is that UNIX, Linux, and Microsoft operating systems will need to co-exist for at least another five years.
“With the huge investment in project data and staff training the oil and gas companies have made,” Sagert said, “next-generation platforms haven’t replaced heritage platforms as quickly as might have been expected.”
To reduce costs, IT departments turn to virtualization as a means to allow a single workstation to run multiple operating systems.
“One reason virtualization is popular is because it allows working cross-platform,” Sagert said. “But it becomes problematic as you build workflows because only applications running on the host operating system deliver native performance. The other OS applications, running in the VM guest OS, run with compromised performance, degrading workflows.”
For example, in one instance of a geophysical application running in a virtualized environment, a frame-refresh could be attained only every 30 to 40 seconds. The challenge for the user was to find a single heterogeneous workstation solution that delivered needed interactive performance across both operating systems.
Steps taken
According to Sagert, the solution’s origins date back several years to Intel’s and NVIDIA’s realization that the oil and gas industry was testing the computational limits of their technologies and would need a VM environment that wouldn’t penalize users. They examined what kind of hardware, chipsets, graphics, and components would be needed and what kind of scalability considerations needed to be taken into account. Hardware provider HP and Parallels Software, a virtualization and automation software company, were brought in to help.
The next-generation HP Z800 workstation, equipped with Intel chip and motherboard architecture, NVIDIA graphics processors, and Parallels workstation virtualization software, enabled that same application — the one that could frame-refresh only once every 30 seconds — to refresh at 30 frames per second. “This was a thousand-fold increase in performance, keeping you in the real-time paradigm,” Sagert said. “It allows for greater use of the virtual machine environment and for companies to migrate to best-in-class applications at the pace that suits them.”
That’s when Schlumberger, as a premier industry service provider, came in to test the solution on geoscience applications and data. “They told us they had a solution to the problem and asked us to road-test it,” Sagert said. “We’ve done enough work with this now to know it really works. We have the field organization, we know how to go to market, and we have resellers in place.”
Intel Virtualization Technology for Directed I/O (Intel VT-d) is the key because it allows the virtualization software to assign any device, whatever it might be, directly to a virtual machine rather than routing its I/O through the host.
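To make the idea of directed I/O more concrete, the minimal sketch below shows how a physical graphics card can be handed directly to a guest using the open-source QEMU/KVM stack on a Linux host with VT-d enabled. It illustrates the general device-assignment technique only, not the Parallels Workstation Extreme configuration; the disk image name and PCI address are hypothetical placeholders.

# Minimal sketch: direct assignment of a physical GPU to a guest VM via VT-d,
# using QEMU/KVM on a Linux host. Illustrative only; not the Parallels product.
import subprocess

guest_disk = "geoscience-guest.qcow2"   # hypothetical guest disk image
gpu_address = "0000:03:00.0"            # hypothetical PCI address of the dedicated GPU

subprocess.run(
    [
        "qemu-system-x86_64",
        "-enable-kvm",                              # hardware-assisted CPU virtualization
        "-m", "16384",                              # 16 GB of guest RAM
        "-smp", "8",                                # 8 virtual CPUs
        "-device", f"vfio-pci,host={gpu_address}",  # pass the physical GPU straight to the guest
        "-drive", f"file={guest_disk},format=qcow2",
    ],
    check=True,
)

With the device assigned this way, the guest operating system loads its own native graphics driver and talks to the card directly, which is what makes near-native performance inside the virtual machine possible.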
The HP Z800 workstation has parallel component architecture throughout to ensure each application, running on its own OS, is not bottlenecked in any way. The Intel motherboard provides dual x16 PCI Express I/O lanes. The associated Intel Xeon 5500 series (Nehalem) CPUs have a significantly larger cache, and their QuickPath interconnect delivers much greater bandwidth and lower latency than the previous Front Side Bus architecture, pushing data through faster than earlier chipsets.
Throughput to the graphics cards also is parallel. Sagert said Schlumberger advocates two NVIDIA Quadro FX 5800 graphics cards as the optimal configuration. “This balanced hardware subsystem ensures that even if both applications are pulling data for computations, resulting in graphics generation, no one component will bottleneck or restrict the next component,” he said.
Parallels’ Workstation Extreme software provides the intelligence to effectively load-balance among multiple operating systems on the same workstation, enabling end-users to experience dedicated graphic and networking resources in a virtual environment.
Sagert concluded by noting that while big companies are making these moves for efficiencies, the solution is equally appropriate for midsize companies and engineering firms seeking effective solutions.
“They don’t have to take a back seat,” Sagert said. “They can compete based on technology.”