OGUK Data and Digital Day – D3 Review
25th February 2020, Aberdeen
Gareth Smith
Head of Consulting, Sword Venture
After navigating the burly doormen hired to keep the protestors out (there were none), the day kicked off with an overview by David Lecore of the OGA's work on executing its digital strategy. The focus is on data accessibility, and the OGA has made great strides in opening up access, borne out by the usage stats: 189k visitors to their ‘energy platform’ in 2019, >5,900 registered users of the NDR, and >1M page views on the OGA Digital Platform. David was also open about the challenges, especially around data completeness and quality. DISKOS is the benchmark here: currently 6 PB of data and growing, vs 326 TB in the UK NDR. There’s also a wide gap in quality, with far more machine-readable data in DISKOS. These are the challenges that ‘NDR 2.0’ will need to address. More on that below.
Maja Kildedal from Equinor talked about their wide-ranging digitalisation. They see themselves as a ‘digital energy company’ and back this up with a defined digital strategy and programme of work to make it real. This includes Omnia, a cloud-based data platform providing support across the oilfield lifecycle. They’re using the Mariner asset to pioneer first-use technologies (more than 30!) and showcase the digital oilfield, including a full digital twin (Echo) supporting a wide range of operations. The other key component of the Equinor strategy involves embracing radical innovation, something the Norwegians seem to be particularly good at.
Next up was Chris Frost from Sword DataCo, with a detailed review of the work done to support the OGTC NNS missed pay project. DataCo’s remit was to prepare the data for machine learning. It turned out that stakeholder expectations at the start of the project around data availability and quality were somewhat wide of the mark, especially for data from the UK NDR. However, DataCo managed to deliver a high-quality data set using various ‘smart’ data management approaches, leveraging well-defined processes and machine learning to accelerate the work. The project flagged up the gap between the UK and Norway NDRs in terms of quality and completeness, but also demonstrated that it’s not an insoluble problem. The intention is that the cleaned-up data also makes its way back to the NDR – good news for everyone. It also highlighted the need for good quality data for machine learning – there’s no way around that one!
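For a flavour of what an automated QC pass over well data might look like, here’s a minimal sketch in Python. To be clear, this isn’t DataCo’s workflow – the file name, curve names, and thresholds are all invented for illustration:

```python
# Minimal sketch of the kind of automated QC pass a 'smart' data
# management workflow might apply before ML. File name, curve names,
# and physical ranges are hypothetical / illustrative only.
import pandas as pd

PHYSICAL_RANGES = {           # plausible bounds per log curve
    "GR": (0, 300),           # gamma ray, API units
    "RHOB": (1.0, 3.2),       # bulk density, g/cc
    "NPHI": (-0.05, 0.6),     # neutron porosity, v/v
}

def qc_report(df: pd.DataFrame) -> pd.DataFrame:
    """Per-curve completeness and out-of-range counts."""
    rows = []
    for curve, (lo, hi) in PHYSICAL_RANGES.items():
        if curve not in df.columns:
            rows.append({"curve": curve, "complete_%": 0.0, "out_of_range": None})
            continue
        s = df[curve]
        complete = 100 * s.notna().mean()          # fraction of non-null samples
        bad = int(((s < lo) | (s > hi)).sum())     # physically implausible values
        rows.append({"curve": curve, "complete_%": round(complete, 1),
                     "out_of_range": bad})
    return pd.DataFrame(rows)

logs = pd.read_csv("well_logs.csv")   # hypothetical export from the NDR
print(qc_report(logs))
```

Scaling checks like this across thousands of wells, rather than running them by hand, is exactly where the ML-assisted acceleration comes in.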
Adrien Bissett from Belmont Technologies gave us a fascinating insight into the future of deriving knowledge and insight from data using AI and machine learning. Their Sandy ‘Smart Assistant’ product, developed with funding from BP, ingests data into a knowledge graph and gives it context using ‘known facts’ and proven, physics-based concepts alongside the data itself. The result is a collection of data and knowledge that can be queried using natural language, with far more relevant and accurate results (<2% false positives). The physics-based rules counter the biases inherent in many data sets and provide more balance. They are working on integrating seismic and well data into the framework, providing an even richer knowledge base. Adrien also touched on the work they’ve done around massively accelerating the geological modelling process using AI, allowing a huge increase in the volume of models run and a decrease in computing footprint. It’s surprising how energy intensive high-end computing can be, so this is a welcome development.
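Sandy’s internals are proprietary, so to make the knowledge-graph idea concrete, here’s a generic sketch using Python’s rdflib. All the entities, facts, and the query are invented; a product like Sandy would compile a natural-language question into something like the SPARQL below:

```python
# Generic illustration of the knowledge-graph idea (not Belmont's
# implementation); wells, formations, and facts are invented.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/subsurface#")
g = Graph()

# Ingest 'data' as triples...
g.add((EX.Well_21_10a, RDF.type, EX.Well))
g.add((EX.Well_21_10a, EX.penetrates, EX.FormationX))
g.add((EX.FormationX, EX.depositionalEnv, Literal("turbidite")))

# ...plus a 'known fact' that gives the data context
g.add((EX.FormationX, EX.ageMa, Literal(56)))

# A natural-language front end might turn "which wells penetrate
# turbidite formations?" into a structured query like this:
q = """
SELECT ?well WHERE {
  ?well a ex:Well ;
        ex:penetrates ?fm .
  ?fm ex:depositionalEnv "turbidite" .
}
"""
for row in g.query(q, initNs={"ex": EX}):
    print(row.well)
```

The point is that once data and known facts live in one graph, queries traverse relationships rather than matching keywords – which is where the accuracy gains come from.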
Back to more practical matters with Stuart Beatty from Kestrel. He made an impassioned plea for the industry to collaborate on the decommissioning of the mountains of physical data still stored in warehouses and salt mines. Much of this is unnecessarily duplicated between companies, or potentially unentitled data that should not be there anyway! Stuart argued that if industry leverages its joint bargaining power, the cost barriers for the exercise should come down. It was a bold pitch, given a chunk of Kestrel’s business comes through data storage, but it surely makes some sense.
The next set of talks took us into the world of Process Engineering, Ops and Capital Projects. These are not my core areas, but it’s always interesting to see what’s happening in those spaces.
The key point made by Steve Aitken from Intelligent Plant was that we should focus on the 80% solution when developing oilfield apps, but make sure it requires little or no effort to learn and use. Way better than aiming for a complex 100% solution that puts barriers in the way of adoption.
Esther Diederen (Spirit Energy) and Luigi Grossi (AG Consulting) presented a practical example of process automation, replacing manual cut ‘n’ paste in the SAP environment with a robotic alternative built on UiPath. It demonstrated that not every digitalisation project needs to be on a grand scale to provide benefit.
Alex Woods from Aveva looked at digital twins and process automation, linking data to apps to visualisation to improve operational efficiency. As a case study, he showed how they built a 3D model of a gas processing facility in Norway in less than six weeks, which was then used to plan inspections and move away from time-consuming paper-based processes.
Matt Pybus from CyberHawk provided an overview of their iHAWK product, ‘next generation visualisation’ software for mapping and managing the construction and commissioning of assets. This type of tool is now commonplace in the upstream industry, so it was interesting to see how the approach translates to the engineering world.
Julian Pereira (TrendMiner) and Matthew Cole (Chrysaor) discussed their use of TrendMiner’s analytics tools for process monitoring and predictive analytics, using real-time process data streams and smart pattern matching to look for sensor anomalies. The idea is to break down the barriers between the engineering community and the data scientists and put predictive analytics into the hands of the many, not just the few. This is likely to be a trend we see accelerate over the next couple of years.
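TrendMiner’s pattern matching is proprietary, but the underlying idea – flagging sensor readings that stray from recent behaviour – can be sketched very simply. Here’s a minimal rolling z-score example in Python; the synthetic ‘pressure tag’, window, and threshold are invented for illustration:

```python
# Minimal sketch of sensor anomaly detection on a process stream
# (not TrendMiner's method); data and thresholds are invented.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
ts = pd.Series(rng.normal(50.0, 0.5, 1000))   # stand-in for a pressure tag
ts.iloc[700:705] += 6.0                        # inject a step anomaly

# Rolling z-score: how far is each reading from recent behaviour?
roll = ts.rolling(window=60)
z = (ts - roll.mean()) / roll.std()

anomalies = ts[z.abs() > 4]                    # flag readings > 4 sigma out
print(anomalies.head())
```

In practice you’d run something like this over live historian data and tune the window and threshold per tag – which is precisely the fiddly work that dedicated tools aim to take off the engineer’s plate.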
In the final slot, Adrian Purdy (Cantium) and Colin Frost (Energective) looked at how Cantium was formed from the acquisition of Chevron GoM assets (some dating back to the 1940s!). They define themselves as a data and technology company that happens to produce oil & gas. Their CEO came from ExxonMobil but has asserted a very strong digital and technology bias, with everything in the cloud and all information available to everyone in the business. They doubled production to 21k boepd and needed fewer staff to do so. Energective helped them get all that done.
All in all, an eclectic mix covering a wide range of the E&P value chain. This event felt a bit more grounded than some of the others I’ve been to recently; it’s good to see more of a focus on actual digitalisation case studies rather than concepts. Thanks to Dan Brown and his team at CDA for pulling this together.
Just as a closing thought, I think what we don’t hear enough about are the challenges and what doesn’t work. You can often learn more from these than from the stuff that has worked. Perhaps there’s room for an event that digs into the initiatives that have failed, as well as the success stories?
*Thanks to Neale Stidolph for his review of the last session – I was on my way to the airport!