
tasks faster – for example, data analytics and visualisation of hundreds or thousands of wells with log values, more interactively and more iteratively. This requires real time generation of data, driven by the end users’ workflows on demand.
Oil and gas companies essentially have two options: wait for software vendors to reinvent their commercial products, relying on the ISV to innovate and to develop functionality capable of supporting the oil company’s current and future needs, or lead the way themselves. For many oil companies, off-the-shelf solutions are insufficient to address their demands anyway. Proprietary applications may be developed completely in-house by the oil company itself, but this often results in a lack of integration with other software packages necessary to get the job done. Choosing the right strategic technology partner to collaboratively develop solutions instead enables early uptake of new technology that addresses the oil company’s challenges while relying on partner expertise to deal with technical issues like integration.
When making decisions regarding software infrastructure, a critical question is which technologies should be included in the next-generation solutions. Which direction will a company choose to tackle the demands of tomorrow: the ‘tried and tested’ approach that has been around for 20 years already, or new technologies that might provide tremendous benefit and have a greater impact?
Data-driven interpretation and analysis
Having access to fast, interactive visualisation tools is critical in a world where diverse and increasingly large data types are the norm.
An effective approach to the data interaction challenge, as pioneered by Hue, a Norwegian software technology company, is for visualisation to drive interaction and compute, with a unified processing pipeline for both. Whenever a user interacts with the data, the software should automatically, and in real time, re-compute and re-visualise only the data affected by that interaction. This type of real time analysis and interaction makes it possible for an interpreter to continuously tweak attribute values and view the updated results immediately to determine progress, ultimately enabling an end-to-end interactive process with maximum accuracy.
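To make the idea concrete, the short Python sketch below shows one simplified way a visualisation-driven pipeline could recompute an attribute lazily; the RMS-amplitude kernel, the class and the region-of-interest bookkeeping are illustrative assumptions for this article, not Hue’s actual pipeline or API.

import numpy as np

def rms_amplitude(cube, window=5):
    # Trace-wise RMS amplitude over a sliding (odd-length) sample window.
    pad = window // 2
    padded = np.pad(cube.astype(np.float64) ** 2, ((0, 0), (0, 0), (pad, pad)), mode="edge")
    kernel = np.ones(window) / window
    return np.sqrt(np.apply_along_axis(
        lambda t: np.convolve(t, kernel, mode="valid"), 2, padded))

class InteractiveAttribute:
    # Recomputes the attribute only for the sub-volume the user is looking at.
    def __init__(self, cube, window=5):
        self.cube = cube                                     # seismic cube (inline, xline, sample)
        self.window = window
        self.cache = np.zeros(cube.shape, dtype=np.float64)  # last computed attribute values
        self.dirty = np.ones(cube.shape[:2], dtype=bool)     # per-trace "needs recompute" flags

    def set_window(self, window):
        # Parameter tweak from the UI: invalidate the cache, but compute nothing yet.
        self.window = window
        self.dirty[:] = True

    def view(self, i0, i1, x0, x1):
        # Return attribute values for the region on screen, recomputing lazily.
        if self.dirty[i0:i1, x0:x1].any():
            self.cache[i0:i1, x0:x1] = rms_amplitude(self.cube[i0:i1, x0:x1], self.window)
            self.dirty[i0:i1, x0:x1] = False
        return self.cache[i0:i1, x0:x1]

# Tweaking the window invalidates everything, but only the inline on screen is recomputed.
cube = np.random.rand(200, 200, 500).astype(np.float32)
attr = InteractiveAttribute(cube)
attr.set_window(9)
section = attr.view(50, 51, 0, 200)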
As an example, Roxar turned to Hue in recent years for GPU-accelerated seismic attributes. The company used these attributes to take its modelling workflows to a new level, including fast and accurate visualisation of terabyte-sized seismic data sets. Roxar experienced a 15 - 20x speed-up compared to its primary competitor, providing significant benefits to its clients.
A few leading-edge companies are seizing this rare opportunity to quickly leverage new computing technology and truly innovate in the solutions they deliver to the E&P industry. However, many companies that develop exploration and production software are failing to capitalise on this technological wave, relying on old technologies they believe are ‘good enough’ when in fact they are scarcely capable of running today’s applications, let alone meeting the demands of tomorrow.
Using advanced workstation hardware to achieve GPU-accelerated application performance
Computing technology has long been a vital, enabling part of the exploration and production workflow. However, the industry’s continued move towards full realisation of a shared earth model ratchets up the demands on this technology, requiring faster access to ever-larger data sets and the ability to rapidly compute and visualise in support of more iterations and better analysis.
Most companies look at GPUs or multiple cores as a way to accelerate their applications. Performance improvement in visual computing, by contrast, looks at the workflows – at what the user is trying to accomplish at every single point in time – and orchestrates high-performance data streaming and on-the-fly computing and visualisation to achieve greater workflow performance.
However, the enormous amount of data involved puts a significant drain on computing resources, even with GPUs handling a significant share of the workload. As a result, workstation configuration and resource utilisation are critical to achieving the needed performance improvements.
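One simplified way to picture that orchestration is a brick-based streamer that touches only the parts of an on-disk volume intersecting the current view. The Python sketch below is an illustration under assumed choices (brick size, memory-mapped storage, a small LRU cache); it is not a description of any vendor’s implementation.

from collections import OrderedDict
import numpy as np

BRICK = 64                                  # brick edge length in samples (illustrative)

class BrickStreamer:
    # Streams bricks of a large on-disk volume on demand, keeping the hottest ones in memory.
    def __init__(self, path, shape, dtype=np.float32, max_bricks=256):
        self.volume = np.memmap(path, dtype=dtype, mode="r", shape=shape)
        self.cache = OrderedDict()          # (bi, bj, bk) -> resident brick
        self.max_bricks = max_bricks

    def _brick(self, bi, bj, bk):
        key = (bi, bj, bk)
        if key in self.cache:
            self.cache.move_to_end(key)     # already resident: mark as recently used
            return self.cache[key]
        data = np.array(self.volume[bi * BRICK:(bi + 1) * BRICK,
                                    bj * BRICK:(bj + 1) * BRICK,
                                    bk * BRICK:(bk + 1) * BRICK])
        self.cache[key] = data
        if len(self.cache) > self.max_bricks:
            self.cache.popitem(last=False)  # evict the least recently used brick
        return data

    def bricks_for_view(self, lo, hi):
        # Yield only the bricks intersecting the axis-aligned view box [lo, hi).
        for bi in range(lo[0] // BRICK, (hi[0] - 1) // BRICK + 1):
            for bj in range(lo[1] // BRICK, (hi[1] - 1) // BRICK + 1):
                for bk in range(lo[2] // BRICK, (hi[2] - 1) // BRICK + 1):
                    yield (bi, bj, bk), self._brick(bi, bj, bk)

# Example: streamer = BrickStreamer("cube.raw", shape=(2048, 2048, 4096));
# zooming to a sub-volume then touches only a handful of bricks rather than the whole file.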
Recently a team of engineers from Lenovo, Magma, NVIDIA and Hue configured a solution that would enable the processing of massive amounts of seismic data needed to perform complex visualisation workflows in the field. The solution uses the Lenovo ThinkStation P900 workstation, a high-end workstation with a specially enabled BIOS that can initialise multiple NVIDIA® Tesla® GPU accelerators. Coupled to the P900 is the ExpressBox 3600 from Magma, a high speed expansion system capable of providing the needed environment for running up to nine NVIDIA GPUs. The ExpressBox 3600 expansion system supports fault tolerant operation by providing redundant power supplies and cooling, as well as a system management interface for remote access.
Working with Hue to optimise application performance, the team managed to support the rendering of terabytes and even petabytes of data in seconds, mitigating key challenges in computation acceleration such as data decomposition and GPU or system memory limitations. The right configuration of hardware components can help to handle the data decomposition among the computational units and orchestrate the overall GPU and CPU memories to be utilised as a shared resource for both algorithms and visualisation. Such an approach means less computational code needs to be written and maintained, while at the same time not compromising on performance.
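As a simplified illustration of such a decomposition, the Python sketch below splits a cube along one axis and dispatches the chunks to a small pool of workers that write into one shared host-side buffer. A thread pool of NumPy workers stands in for the GPUs purely for illustration, and the chunk size and worker count are assumptions rather than the team’s actual configuration.

from concurrent.futures import ThreadPoolExecutor
import numpy as np

def process_in_chunks(cube, kernel, n_devices=4, chunk_inlines=32):
    # Split the cube along the inline axis and process the chunks in parallel,
    # writing results into a single shared host-side buffer.
    out = np.empty(cube.shape, dtype=np.float64)
    chunks = [(i, min(i + chunk_inlines, cube.shape[0]))
              for i in range(0, cube.shape[0], chunk_inlines)]

    def worker(bounds):
        i0, i1 = bounds
        out[i0:i1] = kernel(cube[i0:i1])    # each "device" computes its own chunk
        return bounds

    with ThreadPoolExecutor(max_workers=n_devices) as pool:
        list(pool.map(worker, chunks))
    return out

# Example: a trivial attribute kernel applied across nine workers.
cube = np.random.rand(256, 200, 500).astype(np.float32)
attribute = process_in_chunks(cube, np.abs, n_devices=9)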
The team has seen tangible workflow accelerations in the 80 - 200x range compared to traditional approaches, and with much more manageable code, freeing up resources to do more computational acceleration overall. Oil and gas companies have accelerated visualisation work tasks from many hours to seconds with only a few days of coding.
Conclusion
To achieve the level of performance the industry requires, integration of visualisation, compute and large data management in a common environment is essential, combined with the ability to intelligently fetch data and apply parallel computation based on user interaction. Near real time decision making around tera- and peta-scale data, on any device or virtual computer, is available to the exploration and production industry today.
Many companies leave it up to others to take the initial risk of applying new technology configurations and instead take the role of late adopters. Companies may lack the programming skills, agility, or perhaps the vision to quickly use these new computer technologies, whose growth and uptake will continue to accelerate, leaving those who cannot quickly figure out how to use them even further behind.
Applications based on outdated architecture are unable to leverage
new technologies and compete with those that have already identified
opportunities in the new visual computing world. In order to tackle future
challenges, there is a need to rethink current technologies and methods. In
the new visual computing world, the winners will be those with the engineers
who are able to leverage available technologies to improve interactivity,
achieve new workflows and improve productivity.
Figure 2. Analysis of unstructured grids from reservoir simulation results.