Why is the Oil & Gas Industry Mining Data?
Tarun Kants
Global Technology Solutions Leader for Engineering Software
QuEST Global Services

With data being the new oil, it is apparent that Oil & Gas (O & G) companies can reap myriad benefits from mining data along with oil and gas. Real-time access to information can empower them to derive actionable insights from vast volumes of geospatial data such as seismic maps, well logs, and equipment parameters. Be it upstream, midstream, or downstream, global O & G giants are building on their capabilities through engineering data analytics, leveraging both edge and cloud platforms. With benefits ranging from optimizing reservoir throughput to autonomous drilling, predictive maintenance of equipment, and monitoring of refining and pipeline infrastructure, they are arming themselves with leading-edge technologies such as edge analytics, deep learning, and machine learning to make every drop of crude oil count.

There is an age-old business adage that has stood the test of time: 'If you can't measure it, you can't improve it.' This saying has proven true in nearly every sphere of modern life. Be it the Wall Street stock markets or oil fields around the world, data has become the omnipresent measurement tool upon which sustainable business continuity depends. Moreover, with low-cost energy being an imperative for every thriving society, the Oil and Gas (O & G) sector has always innovated with data as it searches for and taps the earth's dwindling hydrocarbon reserves.

Illuminating Dark Data
Since both natural crude oil reserves and man-made demand for refined petrochemical products have always mandated extensive information gathering and number crunching, the O & G industry has been drilling into data to arrive at concrete business insights since its inception. Weathering dynamic economic trends, geopolitical developments, and technological innovations, the industry has evolved multi-faceted and rigorous data practices. With the advent of digital technologies in the nineties, O & G companies focused on data acquisition through sensors rather than periodic human supervision in every phase of the petrochemical exploration, production, and distribution process. The data thus captured, such as 3D seismic images, machinery performance indicators, oil flow rates, and crude pressure reports, required colossal storage. This was followed by a data integration phase, characterized by efforts to centralize the data captured from various sensors into a single console for geoscientists and engineers to analyze manually. Since the dawn of the millennium, the focus has shifted to software-based acceleration of old workflows, alongside the rise of the 'data scientist' who augments automation by deriving actionable insights from the vast stores of 'dark data' captured for so long without much analysis. A single unanalyzed seismic dataset often ran into hundreds of gigabytes, growing into terabytes once processing algorithms had worked through it.

Deriving actionable insights
'Easy oil' is gone. Whatever remains demands strategic innovation at every phase of petrochemical business operations. From upstream exploration and production at the oil fields, through the midstream distribution network, to the downstream refineries, engineering data analytics has become imperative for sustainable profitability.

The three primary areas in which the O & G sector can leverage engineering analytics and Artificial Intelligence (AI) are:
  • i. Integrating large varieties and volumes of data to find more crude oil with the most suitable technology available
  • ii. Contextualizing day-to-day operational data to reduce labor, maintenance, and overhead costs without compromising worker safety
  • iii. Automating non-critical decision-making, taking the constraints at play into consideration, while reducing environmental impact

Engineering analytics across the O & G lifecycle
In the upstream, several O & G companies armed with seismic imaging technologies and geo-scientific and mechatronic sensors are accelerating digital innovation by building a 'Digital Twin' of the oil field. By applying deep learning and machine learning algorithms to these digital replicas, oil companies can optimize drilling activity by testing the drilling equipment's digital counterpart against a simulated 3D seismic map, rather than spending millions drilling down to unviable sources. For example, data analytics can empower operators to compare real-time downhole drilling data with production data from adjacent wells to adjust drilling strategy, while also forecasting downhole tool failures and instantly customizing parameters based on real-time performance compared against past performance and events. Moreover, lift systems strewn with sensors that previously only logged equipment parameters can now perceive and stall imminent failures, with predictive and prescriptive maintenance capabilities augmented through advanced analytic algorithms. This is how engineering analytics is helping increase reservoir throughput while extending equipment life and lowering the capital-intensive, high-risk nature of O & G production.
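As a rough illustration of the predictive maintenance idea, the sketch below trains a classifier to flag likely tool failures from logged equipment parameters. The data, feature names, and thresholds are entirely hypothetical stand-ins; a real system would draw on historian exports and properly labeled failure events.

```python
# Minimal sketch: forecasting downhole tool failures from sensor logs.
# All data here is synthetic; feature names are hypothetical examples.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)

# Synthetic stand-in for logged equipment parameters:
# columns = [vibration_rms, torque, mud_pressure, bearing_temp]
X = rng.normal(size=(5000, 4))
# Failures made (artificially) more likely at high vibration and temperature.
y = ((X[:, 0] + X[:, 3] + rng.normal(scale=0.5, size=5000)) > 2.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Train on historical runs; in production this would be retrained as new
# failure events are labeled in the historian.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))
```

The value lies less in the specific model than in closing the loop: probabilities like these can feed prescriptive rules that stall equipment before a forecast failure materializes.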

In the midstream, the pipeline network leading to the refineries no longer needs to remain at the mercy of periodic manual checks. The sensors inside these pipes, which earlier measured only oil pressure, flow speed, and temperature, can now be upgraded to interpolate the pipeline's corrosion coefficient, while satellite geo-imaging is factored in for structural integrity monitoring. Such advances in pre-emptive maintenance of the distribution network not only prevent leakages but also enable new-age solutions such as drone monitoring and central reporting of any support needed for quick and effective repairs.
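At its simplest, such corrosion monitoring amounts to fitting a wear-rate trend per pipeline segment. The sketch below, using invented inspection numbers, estimates a corrosion rate from periodic wall-thickness readings and projects when a segment would hit a hypothetical minimum allowable thickness.

```python
# Minimal sketch: estimating a pipeline segment's corrosion trend from
# periodic wall-thickness readings (all numbers are hypothetical).
import numpy as np

# Years since commissioning and measured wall thickness (mm) for one segment.
years = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
thickness_mm = np.array([12.70, 12.41, 12.15, 11.83, 11.57])

# Fit a linear trend: thickness ~ intercept + slope * years (slope < 0).
slope, intercept = np.polyfit(years, thickness_mm, 1)
corrosion_rate = -slope  # mm of wall lost per year

# Project when the segment reaches a (hypothetical) minimum allowable
# thickness, flagging it for pre-emptive repair well before that date.
min_allowed_mm = 9.0
years_to_limit = years[-1] + (thickness_mm[-1] - min_allowed_mm) / corrosion_rate
print(f"corrosion rate: {corrosion_rate:.3f} mm/yr, "
      f"limit reached near year {years_to_limit:.1f}")
```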

In the downstream, refineries and their equipment are being retrofitted with various sensors and laser scanned in parts to stitch together a three-dimensional representation of the refinery that serves as its 'Digital Twin', just like the digital oilfield. This 3D refinery map also augments health monitoring and failure prediction for refinery equipment that was previously disjointed and dependent on manual supervision. Moreover, to improve worker productivity and safety, immersive training content delivered through augmented reality (AR) is growing ever richer. Beyond such training aids, routine maintenance to-dos derived by engineering analytics are operationalized through smart service apps on the field-service operator's handheld devices or AR glasses such as Microsoft HoloLens or Google Glass, instructing them exactly what to do and where.

The Cloud or the Edge
In order to gain enterprise-level predictability, most companies store and process the collected data in the cloud and compare results at site-wide, region-wide, and enterprise-wide scales to identify why some assets perform better than others, thereby optimizing the entire business value chain. All this data collection and integration demands superior connectivity and processing power. While earlier this was based solely on cloud technology, the multiplicity and complexity of current data sets have necessitated communication interfaces with higher speed, processing power, and dependability. Hence, many O & G Original Equipment Manufacturers (OEMs) are using the cloud only for development, as the supervisor, while deploying the end-processing programs at the edge, as the executor. To supercharge edge analytics, many Oil Field Services (OFS) companies are partnering with end-to-end engineering service providers who have already earned the trust of industry leaders such as NVIDIA for GPU-based development and deployment of deep learning algorithms.
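A minimal sketch of this "cloud as supervisor, edge as executor" split follows: a model is trained centrally on pooled data, serialized, and shipped to an edge gateway that scores streaming readings locally. File names, features, and the anomaly rule are all hypothetical.

```python
# Minimal sketch of training in the cloud and inferring at the edge.
import joblib
import numpy as np
from sklearn.linear_model import LogisticRegression

# --- Cloud side (the "supervisor"): train on historical fleet data ---
rng = np.random.default_rng(0)
X_hist = rng.normal(size=(10_000, 4))              # synthetic sensor history
y_hist = (X_hist[:, 0] + X_hist[:, 2] > 1.0).astype(int)
model = LogisticRegression().fit(X_hist, y_hist)
joblib.dump(model, "anomaly_model.joblib")         # artifact pushed to edge

# --- Edge side (the "executor"): score live readings locally, with no ---
# --- round trip to the cloud in the critical path.                    ---
edge_model = joblib.load("anomaly_model.joblib")

def score_reading(reading: np.ndarray) -> float:
    """Probability that one frame of sensor readings is anomalous."""
    return float(edge_model.predict_proba(reading.reshape(1, -1))[0, 1])

print(score_reading(np.array([1.2, -0.3, 0.9, 0.1])))
```

The design choice mirrors the paragraph above: heavy retraining stays in the cloud, while latency-sensitive scoring runs next to the equipment.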

Challenges and the way forward
A single smart oil well may generate up to 10 terabytes of data daily. With the standard O & G technology architecture comprising an ETL (extract, transform, and load) module coupled with an MDM (Master Data Management) system, a data warehouse, and an analytics tool, such vast quantities often remain under-analyzed. Moreover, with several data transfer requirements between incompatible applications, data is often moved or converted manually through a human or digital intermediary. Such steps increase time, cost, and the chances of erroneous analysis by limiting the identification and correction of bad data sets.
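To make the ETL step concrete, here is a deliberately small sketch of such a pipeline; the file names, column names, and unit conversion are hypothetical examples of the tag-name and unit mismatches that force manual intervention today.

```python
# Minimal ETL sketch: one field-system export normalized into the warehouse.
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    # Pull a raw export from one field system (hypothetical CSV layout).
    return pd.read_csv(path, parse_dates=["timestamp"])

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Map vendor-specific tag names onto the master (MDM) schema and
    # normalize units so downstream tools compare like with like.
    df = df.rename(columns={"WHP_psi": "wellhead_pressure_kpa"})
    df["wellhead_pressure_kpa"] *= 6.89476  # psi -> kPa
    return df.dropna(subset=["wellhead_pressure_kpa"])

def load(df: pd.DataFrame, path: str) -> None:
    # Columnar storage keeps the warehouse side cheap to scan.
    df.to_parquet(path, index=False)

load(transform(extract("well_042_export.csv")), "well_042.parquet")
```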

However, the primary challenge still decelerating data analytics implementation in the O & G sector is the fragmentation and non-standardization of data collected by various proprietary technologies and tools, compounded by data sets distributed across the humongous O & G supply chain. Though engineering analytics is the key to unlocking the real value of digital transformation in the sector, analytics alone is not a magic bullet capable of solving all business and technical challenges in one go. With such distributed databases, blockchain could serve as a transparent ledger system in which all nodes communicate openly yet securely, enabling efficient reuse of data with confidence throughout the value chain.

Likewise, investment in engineering analytics should be viewed as a business-led exercise rather than a technology-led initiative. O & G leaders should be mindful that there are no ideal templates to follow in their quest to leverage data for automating low-impact actionable insights. Rather than trying to digitize everything in one go, digitization should proceed step-wise, tied to outcomes and business optimization. This includes a number of prerequisite steps: installing sensors, leveraging existing information in historians, digitizing assets (especially brownfield facilities with legacy designs) using laser scans, digitizing maintenance procedures, building the data infrastructure to harness enterprise data, defining asset monitoring and diagnostics strategies at the asset and plant level, and then linking the resultant data architecture to a customized, layered engineering analytics solution for impactful decision making. Nonetheless, insights and the resultant decisions can seldom transform the enterprise unless they are backed by a robust technology architecture that can execute the decisions and proactively engage employees through smart apps.
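As a postscript to the blockchain suggestion above, the toy sketch below illustrates the tamper-evident, hash-chained ledger idea behind it. It is deliberately minimal, with no consensus protocol or distribution, and all record contents are hypothetical.

```python
# Toy hash-chained ledger for data-exchange records; not a real blockchain.
import hashlib
import json
import time

ledger = []  # in a real deployment, each participant would hold a replica

def append_record(payload: dict) -> dict:
    """Append a data-exchange record, chained to the previous entry's hash."""
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    block = {"ts": time.time(), "payload": payload, "prev": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    ledger.append(block)
    return block

append_record({"dataset": "well_042_logs", "shared_with": "refinery_ops"})
append_record({"dataset": "pipeline_survey_q3", "shared_with": "midstream_qa"})
```

Even at this toy scale, the chaining shows why a shared ledger could let supply-chain participants reuse each other's data with confidence, since any tampering with an earlier record breaks every hash that follows it.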