Data-driven maintenance and asset management will improve both the customer experience and long-term sustainability. But collecting and managing data comes with costs that can be hard to swallow. Panellists at the Utility Week Live Summit, and participants in the workshop session, looked at different use cases and the best way to build a business case.

For utility companies, decisions around maintenance, capital expenditure and the longer term life cycle of their assets are increasingly fused with future-gazing on net zero and climate change. That means that utilities are increasingly adopting “value-based” decision making – assessing both the financial and carbon cost of interventions, and planning for future resilience and reliability rather than “quick fixes”.

But a focus on the longer-term lifecycle of assets has to rest on a secure foundation of accurate and up-to-date data. Ensuring a plentiful flow of data, then turning it into actionable operational insights, creates a whole new set of challenges for utility companies.

Those issues were explored at the Utility Week Live Summit panel debate on Managing, Maintaining and Planning Assets, sponsored by Capita, and the workshop session for delegates that followed it.

The debate asked: what data should be gathered in the first place? Once collected, what quality control tests does it need? How skilled is the organisation at incorporating it into operational decision making? And once you have built a knowledge base, how do you share data with other stakeholders?

Data flows, but not of its own accord
While data is a valued tool across a wide range of operational agendas, the need to collect, manage and disseminate it creates an operational agenda of its own.

For instance, asked how long it took to build a comprehensive picture of operational and embedded carbon emissions, Emma Ford, head of gas construction in the capital delivery team at National Grid, revealed that it took five years’ focused effort from the sustainability team.

In response, Colin Beaney, global vice president for asset-intensive and energy and utilities at IFS, the enterprise software company, commented that utility companies face a difficult path when trying to accurately measure asset conditions and carbon emissions.

“If we try to quantify [the picture] more accurately, it becomes a huge data challenge; we’ve got to look at condition, remaining life, maintenance history, replacement cost, the carbon footprint. It’s a hugely complex challenge, and easily understood why it can take five plus years to get a picture.”

Combining cost-saving and sustainability
In the main panel discussion, Northumbrian Water asset director Tamsin Lishman argued that the key to integrating carbon and financial agendas lies in projects’ inception stage, where there is an opportunity to assemble a coalition of decision-makers who are more likely to reach sound, sustainable solutions.

She cited a case study where the original plan to deal with a combined sewer overflow was to build a large concrete storage vessel close to the discharge. However, harnessing collective thinking resulted in a “systems” approach that addressed the problem upstream, separating the rainwater and sewage, and using the rainwater to reinvigorate a swale in a park.

“Taking rainwater out of the system allowed us to build a smaller tank, creating a nature based solution and – as an added benefit – it was a lower cost solution,” she concluded.

Ford described a daisy-chain of interlinked efficiencies once carbon takes the lead in decision making. National Grid Gas, she explains, assesses proposals against a “carbon hierarchy” – first considering building nothing, then building less, building clever and, finally, building efficiently.

“In terms of a new substation, what we’ve done is consider the reduction of roads on site. Do we need as much accessibility to the project? Through the build less approach, can we utilise kit that already exists on other sites? And can we also reuse equipment that already exists? It basically gives a different lens to managing asset health.”

Feeding the machine
Meanwhile, Beaney stressed that utilities seeking to cut costs and boost efficiency should see machine learning software as their ally. By analysing data on operational issues – from customer call centres, IoT sensors, smart meters or other connected devices – software can determine how different conditions are linked, yielding insights that can then be projected forwards. For instance, what typically precedes a burst main? What do customers do in a heatwave? That approach can focus resources on the actual problems rather than the false positives or ‘it can waits’.
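
To make the idea concrete, a minimal sketch of the kind of predictive model Beaney describes might look like the following. It is illustrative only, not IFS’s software: the file names, column names and the choice of a scikit-learn gradient-boosting classifier are all assumptions.

```python
# Illustrative only: learn which recorded conditions tend to precede a
# burst main, then project that forward to score the current network.
# File names and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# One row per pipe per month, combining sensor readings, customer
# contacts and asset attributes (assumed layout).
history = pd.read_csv("pipe_history.csv")
features = ["pressure_variance", "ground_temperature", "pipe_age_years",
            "customer_contacts_30d", "previous_repairs"]
X, y = history[features], history["burst_within_90d"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)
model = GradientBoostingClassifier().fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Score today's assets and surface the highest-risk mains, so resources
# go to likely failures rather than false positives or 'it can waits'.
current = pd.read_csv("pipe_current.csv")
current["burst_risk"] = model.predict_proba(current[features])[:, 1]
print(current.sort_values("burst_risk", ascending=False).head(10))
```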

In another example, data crunching can help to ensure that a customer’s issues can be resolved in a single visit. “We can combine all the knowledge around the fault as reported, run through the historical views of how we’ve solved it in the past, and ensure that the engineer or technician we send out has the right skills, the right documentation and the right procedural repair scenario.”
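
The “right first time” matching Beaney outlines can be pictured as a simple lookup and filter, sketched below. The data structures and fault codes are hypothetical, not a real workforce-management system.

```python
# A hedged sketch of first-time-fix dispatch: combine what is known about
# the reported fault with how it was solved before, then pick an engineer
# who holds every required skill. All data here is illustrative.
from dataclasses import dataclass, field

@dataclass
class Engineer:
    name: str
    skills: set = field(default_factory=set)

# Hypothetical knowledge base built from historical job records.
fix_history = {
    "LOW_PRESSURE": {
        "skills": {"pumps", "valves"},
        "documents": ["booster-station-manual.pdf"],
        "procedure": "Check booster pump, then verify PRV setting",
    },
}

engineers = [Engineer("A. Field", {"pumps", "valves", "telemetry"}),
             Engineer("B. Mains", {"mains repair"})]

def dispatch(fault_code: str) -> dict:
    """Return a job pack plus the first engineer qualified for every listed skill."""
    job = fix_history[fault_code]
    qualified = [e for e in engineers if job["skills"] <= e.skills]
    return {"engineer": qualified[0].name if qualified else None,
            "documents": job["documents"],
            "procedure": job["procedure"]}

print(dispatch("LOW_PRESSURE"))
```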

Another way to squeeze more efficiencies out of maintenance and operational budgets is remote assistance: giving field engineers access to a technical expert located anywhere in the world, who can coach them through a task using wi-fi and a smartphone or tablet.

“Why are we doing this?” Beaney asks. “Well, assets are getting more complex, and the workforce needs to be upskilled and reskilled. It gives us the opportunity to leverage global experts and use them more locally, and it allows us to immediately capture and feed in diagnostics and reduce the field calls. And again, it helps us to increase first-time fix rates.”

The cost of gathering data in the first place
But there was acknowledgement that data gathering, for instance from drone surveys, GIS, sensors and field engineers, comes with considerable technology and staffing costs: from maintaining battery-operated devices to hiring teams of quantitative data analysts.

In the workshop, Digital Catapult’s Niko Louvranos, commercial lead for 5G, emphasised that utilities aiming to obtain more accurate and better quality data need to look at the business case holistically.

“On the cost side, it’s not just the cost of IoT devices, such as sensors and loggers, and the software platforms. They have to take into account the cost of civil works, the digging up and re-surfacing of roads and the traffic management, to enable the connectivity between the devices in the field and their central data systems.

“And civil work costs are substantial, which will limit the number of devices that can be installed and connected cost-effectively. And it’s therefore always a question of ‘how much do I need to invest to get to a minimum viable volume and quality of data?’ And that boils down to a viable ‘cost per connected asset’,” he summarises.
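
Louvranos’s “cost per connected asset” boils down to simple arithmetic, sketched below with invented figures purely to show the shape of the calculation: the full installed cost, civils included, divided by the number of assets actually connected, set against the benefit each connected asset is expected to return.

```python
# Illustrative arithmetic only - every figure here is an assumption, not
# Digital Catapult data.
device_cost = 250            # sensor or logger per asset, GBP (assumed)
software_per_asset = 40      # platform licence share per asset (assumed)
civils_total = 180_000       # excavation, resurfacing, traffic management (assumed)
connected_assets = 600

cost_per_connected_asset = (
    device_cost + software_per_asset + civils_total / connected_assets)
print(f"Cost per connected asset: £{cost_per_connected_asset:,.0f}")

# The business case then asks whether the benefit per asset - assessed
# across the whole organisation, not one department - exceeds that cost.
expected_benefit_per_asset = 450   # assumed, aggregated company-wide
print("Viable" if expected_benefit_per_asset > cost_per_connected_asset
      else "Not viable at this device density")
```

With these assumed numbers the civil works dominate, accounting for £300 of the £590 per asset, which is exactly why the volume of devices that can be connected cost-effectively is capped.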

But they also need to quantify the potential upside of the business case. “Utility companies need to know more about the benefits of potential use cases for the data they’re collecting,” Louvranos says. “And that needs to be done holistically across an organisation, not in departmental silos. And obviously across the industry, although that’s another topic.

“At the moment, internal silos often make it hard to see the benefits that can be derived from an initiative, in comparison to the investment needed, if only a single part of the organisation has to fund it. Helping companies expose the company-wide benefits makes the cost less daunting and more palatable.”

Heading in the right direction
Meanwhile, workshop participant David Fortune, vice president of innovation at software company Innovyze, points out that the cost of installing sensors and monitors is “on a downward curve”. And he agrees with Louvranos that costs must, in any case, be assessed against benefits, and that these benefits are multiplying – thanks, in part, to the machine learning, AI-enabled software and predictive analytics that Beaney also referred to.

Fortune also stressed the value of combining the installation of data loggers for continuous monitoring with routine maintenance or inspection works. “You’ve got to take the opportunity of installing reasonably cheap metering while you’re out there doing things on the network.”

At Northumbrian Water, Lishman stressed that “data gathering” can be quite informal, and low cost. Describing a “data collecting app” created by Northumbrian’s GIS team, she says that “people out doing jobs can collect high quality information, using available technology on your mobile phone, that then avoids the need for someone to come back to the office and mark up a drawing.”

And – in a pitch to colleagues in other utilities – Lishman also suggested that utility companies should adopt a pooled approach to data gathering. “If we’re in a street works and doing something, we’ve got to be collective – whether it’s gas, water, electricity or cable, we have to get that common view of our assets and condition. We are not in competition with each other.”

A third approach lies in adopting innovative techniques such as photogrammetry, with Northumbrian trialling Matterport, a platform for creating and sharing 3D digital models of assets.

Factoring in the human element
But despite all the talk of IoT sensors, apps and data sharing, data collection still involves humans, and typically means adding a task to a work list. AnneLouise McGinn, asset management lead at Portsmouth Water, told fellow workshop participants of her first-hand experience of gathering “opportunistic asset information” in the context of heavy “business as usual” demands.

“When the guys are digging down, to repair or replace a main, they can give me all the bedding information, on the soils and the temperature – information that could be really useful for me down the track.”

But where she is unable to identify a pressing need or use case for the data, there can be resistance. “When people are under constraints and not able to fulfill their proactive maintenance schedules and I’m throwing in another couple of things to measure, they’re not going to enjoy that.”

Fortune acknowledges the problem: “When you’re out on site, you don’t see the value of individual bits of data, because on their own, they don’t have any value. It’s when they’re connected up and the whole is put together that the value’s there. So how do you demonstrate that collective value to people on site, so they can see they’re contributing towards something that is, for the company, extremely valuable?”

McGinn agrees that this is the goal. “When I was a consultant, I would often sit in on toolbox meetings, and we would show them what their data had meant and they felt a part of the end product – that’s quite effective.”

But the alternative – pursuing a strategy of fully automated data collection and analysis – often raises professional insecurities around job displacement, as Louvranos comments. And while people appreciate the argument that technology creates new jobs as fast as it makes others redundant, staff tend to be emotionally attached to the job they have, rather than to a potential future role.

“There’s a real resistance to change, to automation, because automation means less labour – and people have a vested interest in having a job, and the same job,” he says, adding that not everyone is “okay to go towards the automation journey at the expense of people’s jobs.”

Again, that problem can be addressed if staff can readily see the benefits that flow from having better data. Instead, as McGinn notes, data is too often siloed, or lacks interoperability, or sits in platforms for specialists when it should be accessible at all levels.

“We have these massive pools of big data and only a select few people understand what to do with it. It’s about making big data, and the understanding of what that’s used for, accessible to everybody, and making data digestible and user friendly, for people of all skill sets.

“And just making it easy, and not having to train people up every two minutes and refresher courses every six months, when people simply don’t have the time to do that.”

Operational difficulties around data gathering
But costs and human resources aren’t the only barriers – utility companies need to get better at using and operationalising the data they do hold or have access to, as Beaney acknowledged.

Taking the example of data gathered by drone surveys, he says: “Historically, it’s been a challenge to geo-locate the anomaly the drone identifies, then how do we roll that back into the enterprise asset management solution? Or optimize the engineers’ scheduling to go out and fix the issue?”
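
A rough sketch of that round trip, under stated assumptions, is below: the asset coordinates are invented, and the work-order dictionary stands in for whatever a real enterprise asset management (EAM) integration would create through its own API.

```python
# Hedged sketch: attach a geo-located drone anomaly to the nearest known
# asset and raise a work-order record for the scheduler. Asset data and
# the output format are illustrative only.
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6_371_000 * 2 * asin(sqrt(a))

# Hypothetical asset register extract: asset id -> (lat, lon).
assets = {"TWR-0114": (54.9701, -1.6113), "TWR-0115": (54.9712, -1.6090)}

def raise_work_order(anomaly: dict) -> dict:
    """Match a drone-detected anomaly to the nearest asset and draft a work order."""
    nearest = min(assets,
                  key=lambda a: haversine_m(*assets[a], anomaly["lat"], anomaly["lon"]))
    return {"asset_id": nearest,
            "defect": anomaly["defect"],
            "priority": "P2" if anomaly["severity"] >= 3 else "P4",
            "source": "drone_survey"}

print(raise_work_order(
    {"lat": 54.9703, "lon": -1.6110, "defect": "corrosion", "severity": 4}))
```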

And while it’s relatively easy to assemble the finance, teams and motivation for a pilot study, it’s often difficult to implement a promising project across an organisation and its workflows.

In Beaney’s words: “We’ve been involved in a number of projects and proofs of concept around the globe, promoting this type of approach [AI and machine learning], but often those projects and pilots don’t make it into production, and why is that?” he says, suggesting that projects often stall because “revenue and the benefits are often not clearly defined”.

McGinn agreed. “Being a small water company, we do quite a lot of pilot studies with new technologies, we do a lot of acoustics, and transient monitoring. The implementation of that across our whole system is obviously where we want to go, but we just don’t have the headcount.”

Fortune at Innovyze also felt that utility companies struggle to exploit their data, or have relatively weak data architecture. “Is the data actually utilized? I’m challenged to say that it probably isn’t. [Utilities] have got multiple SCADA systems, multiple asset information systems dotted around, none of which are talking to each other. Collating that information is relatively inexpensive to do. But if you’ve got to go out and survey the network, then that is hugely expensive.”
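
The collation step Fortune calls relatively inexpensive can be as plain as joining extracts from the separate systems on a shared asset identifier, as the sketch below shows. The file names and fields are assumptions about typical extracts, not any vendor’s actual schema.

```python
# Minimal sketch of collating data from systems that don't talk to each
# other. File names and fields are assumed for illustration.
import pandas as pd

scada = pd.read_csv("scada_extract.csv")       # asset_id, last_alarm, runtime_hours
asset_reg = pd.read_csv("asset_register.csv")  # asset_id, install_year, material
work_orders = pd.read_csv("work_orders.csv")   # asset_id, completed_date, cost

# Summarise maintenance history per asset.
repair_summary = (work_orders.groupby("asset_id")
                  .agg(repairs=("completed_date", "count"),
                       total_repair_cost=("cost", "sum"))
                  .reset_index())

# One joined table per asset: the starting point for the condition,
# remaining-life and maintenance-history questions raised earlier.
combined = (asset_reg
            .merge(scada, on="asset_id", how="left")
            .merge(repair_summary, on="asset_id", how="left"))
print(combined.head())
```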

For utility companies, gaining a better understanding of asset health – in order to prioritise investment needs for the next regulatory cycle, and beyond – is clearly an operational priority. It’s also clear that intelligent, effective and data-informed asset management can help in decarbonisation, by cutting out extra journeys, redundant capacity and short-term thinking. Accurate, accessible data is the common denominator – and one where utility companies will surely see paybacks on their investment.

This article is part of UW Innovate’s coverage of the managing, maintaining and planning assets challenge. See it brought to life at Utility Week Live, 17-18 May 2022, NEC Birmingham