SES Water’s asset data lead Jack Nicol explains how collaboration between both new and existing technology partners and internal teams has underpinned an unprecedented ‘self-learning’ smart water network rollout.

In March, SES Water deployed intelligent technology across its entire water distribution network – which supplies 160 million litres of water every day to over 745,000 people in parts of Surrey, Kent and south London via more than 2,000 miles of water mains – in an effort to cut leakage by 15% over the next three years, and halve it by 2045.

What’s been described as SES’ “self-learning network” flags issues in near real-time in every one of the firm’s 325 district metered areas so rapid action can be taken to ensure that customers continue to receive an uninterrupted supply of safe drinking water.

This technology also promises to cut the number of pollution incidents and unnecessary site visits, and to lower carbon emissions, by targeting where field teams look for leaks.

Speaking in March, SES’ head of asset strategy, Daniel Woodworth, described the network-wide rollout as “a game-changing milestone” for both customers and the wider industry.

“We already have one of the lowest levels of leakage in the country, but we want and need to do more and our intelligent network will significantly improve the way we manage leaks and help reduce the total amount of water lost each day,” he said.

“Not only will it minimise interruptions to supply for our customers but it will also help us reduce the amount of water we take from the natural environment.”

Creating a prediction

Speaking to Utility Week Innovate, SES’ asset data lead Jack Nicol claims that, thus far, no other water company in the UK has 100% of its distribution network covered by sensors gathering minute-by-minute data, which is then fed into software offering real-time transparency of network changes.

This data directly informs SES operational teams and speeds up their response time to reduce leaks, bursts and supply interruptions for its customers.

As a result, depending on the size of the leak, sensors could allow teams to inform customers of an issue before they notice it themselves.

“The software collects data and will look back and compare that to the last seven weeks and decides based on patterns whether to accept or reject the data,” he explains. “Only accepted data is then put into creating a prediction.

“The prediction is therefore constantly being updated to make it as accurate as possible,” Nicol adds. “As soon as the data deviates from the prediction we are notified. If the data then continues on its deviation past the limits we set, an alarm is then activated.”
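The accept/reject-then-predict loop Nicol describes can be sketched in simplified form. The z-score test, the thresholds and the function name below are illustrative assumptions, not Aquasuite’s actual algorithm:

```python
from statistics import mean, stdev

def check_reading(history, reading, accept_z=3.0, alarm_z=5.0):
    """Toy version of the loop described above.

    history: readings from the same time slot over the last seven weeks.
    Accepted readings feed the next prediction; mild deviations trigger a
    notification; deviations past the set limits raise an alarm.
    """
    mu, sigma = mean(history), stdev(history)
    z = abs(reading - mu) / sigma if sigma else 0.0
    if z <= accept_z:
        history.append(reading)      # only accepted data updates the prediction
        return "accepted"
    elif z <= alarm_z:
        return "deviation"           # team is notified, but no alarm yet
    else:
        return "alarm"               # deviation has passed the set limits
```

Because rejected readings never enter the history, a single spurious sensor value cannot drag the prediction away from the learned pattern, which matches the article’s point that “only accepted data is then put into creating a prediction.”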

Recruiting tech partners

Nicol explains that the project is rooted in close collaboration between Vodafone, Royal HaskoningDHV and Technolog.

Vodafone supports the system with its narrowband internet of things (NB-IoT) service, which is optimised for efficient communication, long battery life and lower cost.

On top of this, Technolog, with whom SES upgraded its entire logger stock in early 2020, supplied 730 network sensors at district meter and pressure regulating valve sites to help measure flow, pressure, transients and water temperature.

In the search for AI software, SES held interviews with several tech companies in 2019 after devising its intelligent networks concept, with each company given the same brief and the same amount of time to pitch themselves. Eventually, Dutch engineering consultancy firm Royal HaskoningDHV was selected to bring its data analysis technology to the table.

“They have a software package called Aquasuite which provides all the aspects of an intelligent network that we envisaged,” he says, though its “Burst Find” function – which alerts teams to the size of a leak and narrows down its location using additional pressure sensors – remains in an extended, albeit “very promising”, trial stage.

Nicol adds that technology supplied by Vodafone, Technolog and Royal HaskoningDHV is further supplemented by several operational levels within SES overseen by both a project manager and an asset strategy manager.

“We have two in-house data techs who are responsible for maintaining a reliable stream of one-minute data coming into our systems from the loggers in the network every 15 minutes,” he says. “We also have two data analysts who monitor the quality of the data coming back into the various systems we have.

“Our five 24-hour control room operators have all been trained in identifying and classifying network alarms and, depending on the severity and classification of the alarm, they can then deploy our 12 field inspectors or eight leakage technicians to respond.”
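As a rough illustration of the routing Nicol outlines, severity and classification might map to a response team along these lines. The severity levels, classification labels and team names here are hypothetical, not SES’s actual control-room procedure:

```python
def dispatch(severity, classification):
    """Hypothetical routing of a classified network alarm to a response.

    Inputs and outputs are illustrative stand-ins for whatever scheme the
    control room actually uses.
    """
    if classification == "burst" or severity == "high":
        return "field inspector"     # urgent, visible issues get an immediate visit
    if classification == "leak":
        return "leakage technician"  # suspected background leak to pinpoint
    return "monitor"                 # low severity: keep watching the data
```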

Software only as good as the data going into it

While Nicol explains that the system faces a challenging and confusing process of “learning normal” amid changing customer usage patterns off the back of pandemic lockdowns, he deems the project a success thus far.

“As we were the first to do this it’s difficult to set targets based on so many unknowns,” he says. “It would be unrealistic for us to expect the software to alert us to absolutely everything straight away, and we also don’t want a lot of false alarms.

“The system is constantly being optimised by the analysts in the control environment and the machine learning is constantly updating the prediction, so the longer it runs the better it gets. We have some of the most reliable, high-quality data coming back from our sensors.”

As far as broader lessons go, Nicol explains that work thus far has fundamentally reasserted the industry adage that “data is king.”

“Without the near real-time data you cannot have an intelligent network,” Nicol says. “The prediction won’t be good enough if you are only getting 15-minute average data every hour.

“You need the connectivity first and then you need a reliable sensor and good technical teams and support to maintain them,” he adds. “The software and the prediction is only as good as the data going into it.”
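Nicol’s point about data resolution can be made concrete with a toy comparison. The numbers below are invented for illustration: a short pressure transient that is obvious in one-minute readings all but disappears once the window is averaged:

```python
# Invented one-minute pressure readings (metres head) over a 15-minute window:
# ten normal minutes at 50, then a five-minute transient drop to 35.
minute_data = [50.0] * 10 + [35.0] * 5

# What a coarser logger reporting one 15-minute average would show instead.
fifteen_min_avg = sum(minute_data) / len(minute_data)

print(min(minute_data))   # 35.0 – the transient stands out minute by minute
print(fifteen_min_avg)    # 45.0 – averaged, the drop looks far less alarming
```

A prediction built only on the averaged series would have far less signal to deviate from, which is why minute-by-minute data underpins the approach described in the article.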