Hart Energy IndustryVoice™ allows marketers to reach our audiences by enabling them to create and place relevant content in our media channels—in print, online, via social media and at live events. Each IndustryVoice™ piece is produced by the marketing sponsor and any opinions expressed by IndustryVoice™ contributors are their own. For questions about IndustryVoice™ programs, email IndustryVoice@hartenergy.com
___________________________________________________________________________________________________________________________________________________
When PGS launched the Triton survey in November 2013, the company knew it would end up with the largest seismic survey it had ever collected. When acquisition of the 660-terabyte dataset finished in August 2014, PGS faced the most complex data processing and imaging challenge in its history.
Seismic imaging is a highly complex task: it demands accurate, clear images of varied geology at depths of up to 16 km, derived from data with large velocity and density contrasts and incomplete boundary conditions. Commensurately complex algorithms have evolved to turn these data into images. At the same time, a tightening market means companies like PGS must collect data and produce images "faster, cheaper, better," improving quality and increasing productivity through greater automation.
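To give a sense of why full-wavefield imaging is so compute-intensive, the sketch below (illustrative only, not PGS code) shows a single explicit finite-difference time step of the 1-D constant-density acoustic wave equation. Production imaging repeats a 3-D version of this kernel billions of times per survey; the grid size, velocity, and step sizes here are arbitrary assumptions chosen for stability.

```python
import numpy as np

def fd_step(p_prev, p_curr, c, dt, dx):
    """One second-order-in-time, second-order-in-space update of the
    1-D acoustic wave equation: p_next = 2p - p_prev + (c*dt/dx)^2 * lap(p)."""
    lap = np.zeros_like(p_curr)
    lap[1:-1] = p_curr[2:] - 2.0 * p_curr[1:-1] + p_curr[:-2]  # interior Laplacian
    return 2.0 * p_curr - p_prev + (c * dt / dx) ** 2 * lap

# Assumed toy parameters: 201 grid points, 10 m spacing, 1 ms steps, water velocity.
nx, dx, dt, c = 201, 10.0, 0.001, 1500.0
p_prev = np.zeros(nx)
p_curr = np.zeros(nx)
p_curr[nx // 2] = 1.0  # impulsive source at the center of the grid

# Propagate the wavefield 100 time steps (Courant number c*dt/dx = 0.15, stable).
for _ in range(100):
    p_prev, p_curr = p_curr, fd_step(p_prev, p_curr, c, dt, dx)

print(p_curr.shape)  # (201,)
```

Each step touches every grid point; scaling this to three dimensions, tens of billions of cells, and thousands of time steps per shot is what drives the demand for supercomputing-class hardware.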
Together, these scientific and business needs are stressing compute infrastructures across the seismic industry. PGS and similar companies have typically relied on clusters for high-performance computing (HPC). However, this new era of massive data volume, combined with tightening margins, led PGS to conclude that it couldn't meet these demands with its existing compute technology. A "radically different" solution was needed.
For decades, exploration seismology has been conducted as a repetitive, linear process, one well suited to cluster computing. But as survey complexity increases and data volume grows, the customary single-shot approach no longer produces sufficiently accurate images. To meet its business and scientific goals, PGS recognized it needed to shift its R&D approach: using 3-D operators throughout the processing flow, moving toward full-wavefield and full-survey processing and imaging, and leveraging the angle-domain image space for analysis, residual corrections and imaging conditions.
These challenges are better suited to supercomputing, but the supercomputing solution needed to be cost-effective for PGS' current applications, fit its technology pipeline, enable implementation (at scale) of new algorithms, reduce development time and keep pace as volume and complexity continue to increase. So PGS decided to switch from clusters to a Cray XC40 supercomputer. With the new system, PGS went from being unable to process the Triton survey within its production deadline to industry-leading reverse time migration (RTM) throughput of roughly 129 million traces per minute. The company can now run more and larger jobs on more complex data with more complex algorithms, and run individual jobs faster with higher-quality results. Read the full case study for details.
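At the core of the RTM processing mentioned above is a simple idea: propagate a source wavefield forward in time, propagate the recorded data backward in time, and cross-correlate the two at zero lag to form an image. The sketch below illustrates only that imaging condition with random placeholder wavefields; the array names, shapes, and data are assumptions for illustration, not PGS's implementation.

```python
import numpy as np

def rtm_image(source_wavefield, receiver_wavefield):
    """Zero-lag cross-correlation imaging condition: multiply the forward
    source wavefield by the back-propagated receiver wavefield at each
    time step and sum over time, yielding a reflectivity image.
    Both arrays are shaped (nt, nz, nx)."""
    return np.sum(source_wavefield * receiver_wavefield, axis=0)

# Placeholder wavefields standing in for real propagated data:
# 100 time steps on a 50 x 60 subsurface grid.
nt, nz, nx = 100, 50, 60
rng = np.random.default_rng(0)
src = rng.standard_normal((nt, nz, nx))
rcv = rng.standard_normal((nt, nz, nx))

image = rtm_image(src, rcv)
print(image.shape)  # (50, 60)
```

Because this correlation (and the two wave propagations feeding it) must be repeated for every shot in a survey, throughput figures such as traces per minute become the natural yardstick for comparing compute platforms.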