What are suppliers really like?

Energy is sometimes said to be a “homogeneous” product, the same whichever company supplies it. So, if customers don’t switch to suppliers with lower prices, this must indicate “weak customer response”.

An alternative view is that there are in fact different products and different kinds of tariff, and different qualities of customer service. With over 50 retailers active in the domestic market – and a constant flux of new entrants and exits – most customers have little or no firsthand knowledge of most of the suppliers. If customers don’t leave their present supplier, it’s because they are satisfied with its price and service or because they are unsure whether another supplier promising a lower price will offer better or worse service.

In a competitive market, companies establish reputations for better or worse service and value for money, for “never knowingly undersold” or for “pile ‘em high and sell ‘em cheap”. But this takes time: supermarkets have been in business for over one hundred years, whereas the domestic energy market has been open for only twenty years.

Can anything be done to speed up the establishment of energy suppliers’ reputations? So that customers are not panicked into switching to the cheapest supplier, or petrified into staying with their existing supplier? So that all customers can make better informed decisions as to whether to switch supplier or stay loyal to their present one?

An Overall Customer Satisfaction score

I have proposed an Overall Customer Satisfaction (OCS) score to do precisely this. Rather than try to develop an entirely new measure, the idea is to average four existing ratings, each of which measures an important aspect of a supplier’s performance that deserves to be taken into account in assessing customer satisfaction. These four component ratings are as follows.

First, a measure of customer complaints and complaint handling based on the statistics that Ofgem collects and reports each quarter, presently for about 44 companies. Three statistics on the number of complaints and the speed of handling them are combined into a single score out of 100.

Second, the annual ratings of the Consumer Association in its Which? magazine, based on its interviews with around 8,000 customers, which combine customers’ overall satisfaction with their likelihood to recommend that supplier. Typical coverage is about 30 suppliers.

Third, the quarterly ratings of Citizens Advice, which has a statutory remit to publish energy supplier performance data, and rates energy suppliers across five different metrics, as far as possible using objective data. The total number of suppliers rated has increased from 28 in March 2018 to 40 in March 2020.

Fourth, the views of customers themselves as expressed on Trustpilot, a consumer review website where customers rate the companies from one to five stars and give their views about whatever impresses or concerns them. Trustpilot calculates a time-weighted average of these customer stars to give a single TrustScore for each company, presently from one to five (in this paper expressed as a percentage). TrustScores are recalculated (and publicly available online) every time a new review is filed, so are constantly evolving. Trustpilot presently covers about 100 domestic energy suppliers, with over 500,000 customer reviews in total as of August 2020.

All four ratings relate to features that customers or their advisers feel are relevant and important. But they are quite different and mostly have little or no statistical correlation with each other. The OCS score can only be calculated for suppliers that are well enough known to have featured in all four ratings. So even just having an OCS score is an indication that the supplier has attracted a sufficient number of customers to be widely assessed. At any one time, between 21 and 30 companies have qualified for inclusion. But this is not to say that all these companies achieved a high OCS score or offered sufficient customer satisfaction.
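To make the averaging concrete, the sketch below shows how such a score might be computed. It is illustrative only: the function and figures are hypothetical, it assumes the Ofgem, Which? and Citizens Advice ratings have already been expressed out of 100, and it converts the one-to-five TrustScore to a percentage by multiplying by 20. It is not the exact calculation behind the league tables discussed here.

```python
from statistics import mean
from typing import Optional

def ocs_score(ofgem: Optional[float],
              which: Optional[float],
              citizens_advice: Optional[float],
              trustpilot_stars: Optional[float]) -> Optional[float]:
    """Illustrative Overall Customer Satisfaction (OCS) score.

    The first three arguments are one supplier's ratings on a 0-100 scale;
    trustpilot_stars is the 1-5 TrustScore, converted to a percentage here.
    Returns None if any rating is missing, since the OCS score is only
    defined for suppliers that feature in all four ratings.
    """
    trustpilot = trustpilot_stars * 20 if trustpilot_stars is not None else None
    components = [ofgem, which, citizens_advice, trustpilot]
    if any(c is None for c in components):
        return None            # supplier not covered by all four ratings
    return mean(components)    # simple, unweighted average of the four

# Hypothetical supplier: 72 on Ofgem complaints handling, 65 from Which?,
# 80 from Citizens Advice, and a 4.5-star TrustScore (i.e. 90%).
print(ocs_score(72, 65, 80, 4.5))  # -> 76.75
```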

Does size matter?

Understandably, many customers prefer a supplier they have heard of – typically one of the existing large suppliers, though a few medium-sized retailers are also familiar names. Is this consistent with the customer satisfaction scores calculated here? Yes and no, as Figure 1 shows.

Yes, insofar as the median medium supplier in the league consistently scores more highly than the median large or small supplier. No, insofar as the median large supplier consistently used to score much lower than the median small and medium supplier. But … most large suppliers have significantly improved their scores over the last year, to the extent that the median large supplier now scores more highly than the median small supplier and is not far below the median medium supplier.

So, if you have to choose by size alone, go for a medium-sized supplier – not surprisingly, because such suppliers have proved their ability to attract and retain several hundred thousand customers over a period of time.

However, why choose on the basis of size alone? There are significant differences between suppliers of similar size: any particular supplier may score much higher or lower than the median for its group. Size alone is not sufficient to guarantee a high customer satisfaction score.

OCS league tables

Table 1 shows the OCS league table as calculated on three dates over the last two years. It is divided into four divisions, reflecting a balance of quartiles and natural breaks.
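For illustration only, the sketch below shows one simple way of splitting a set of scores into four divisions by quartile. It is a hypothetical helper, not the method used for Table 1, which also adjusts the cut-offs to respect natural breaks in the scores.

```python
from statistics import quantiles

def assign_divisions(scores: dict[str, float]) -> dict[str, int]:
    """Assign each supplier to Division 1 (top quartile) down to 4 (bottom).

    A pure quartile split; a published table might move the boundaries
    to respect natural breaks in the scores.
    """
    q1, q2, q3 = quantiles(scores.values(), n=4)  # quartile cut-offs
    def division(score: float) -> int:
        if score >= q3:
            return 1
        if score >= q2:
            return 2
        if score >= q1:
            return 3
        return 4
    return {supplier: division(score) for supplier, score in scores.items()}

# Hypothetical scores for five suppliers
print(assign_divisions({"A": 85, "B": 72, "C": 64, "D": 58, "E": 47}))
# -> {'A': 1, 'B': 2, 'C': 2, 'D': 3, 'E': 4}
```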

Back in May 2018, Bulb was outstanding, with a score of over 88. Ovo, Utility Warehouse, Bristol Energy and PFP Energy were also in Division 1, with scores over 75. British Gas, with nearly 60, was the only large supplier to make Division 2; the other five large suppliers were in Divisions 3 and 4.

Over time, scores and standards generally increased: initially a supplier with a score of 60 was in Division 2, but two years later it was in Division 4. Some of the large suppliers rose up over time, notably EDF, British Gas and SSE into Division 2 and Eon to the top of Division 3. Despite increased scores, Scottish Power and Npower remained in Division 4.

Division 1 comprises the outstanding performers, with Avro, Octopus and So Energy maintaining their positions there for more than a year, and Bulb still there since the beginning. Outfoxthemarket and Pure Planet are new entrants at the top; Cooperative Energy has significantly improved its position over time.

In contrast, Ovo and Bristol have slipped from Division 1 to 2 to 3, and Ecotricity from Division 2 to 3 to 4.

Some suppliers that performed relatively poorly over time have since left the market, such as Green Star, iSupply, Solarplicity and Economy Energy. In fact, no supplier whose score has stayed below 50 has lasted in the market. Indeed, after this league table was calculated in August 2020, Tonik Energy fell further and then left the market. Together Energy is still there, having recently taken on Robin Hood Energy and related suppliers, but its score of 43.6 is very low.

What about their tariffs?

Do suppliers with high OCS scores tend to have higher or lower tariffs than other suppliers? Competition on fixed tariffs tends to be on price, and there is no obvious relationship with the OCS score. But there are two particularly interesting findings about standard variable tariffs.

Between May 2018 and February 2019, before the imposition of the default tariff cap, suppliers with higher OCS scores had, on average, lower variable tariff prices – by about £5 per year per OCS point. So a difference of 20 points in the OCS score meant, on average, a tariff that was £100 a year lower.
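As an illustration of the arithmetic, the sketch below estimates such a relationship as an ordinary least-squares slope. The data are hypothetical, constructed to reproduce a slope of roughly minus £5 per OCS point; they are not the tariffs actually observed over that period.

```python
def ols_slope(ocs_scores: list[float], tariffs: list[float]) -> float:
    """Least-squares slope of annual tariff (pounds/year) on OCS score (points)."""
    n = len(ocs_scores)
    mean_x = sum(ocs_scores) / n
    mean_y = sum(tariffs) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(ocs_scores, tariffs))
    var = sum((x - mean_x) ** 2 for x in ocs_scores)
    return cov / var

# Hypothetical data: four suppliers' OCS scores and standard variable tariffs.
print(ols_slope([55, 65, 75, 85], [1120, 1070, 1020, 970]))  # -> -5.0
# A slope of -5 means a supplier scoring 20 points higher charges,
# on average, 20 x 5 = 100 pounds a year less.
```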

The default tariff cap significantly reduced tariff dispersion. Many suppliers simply set their variable tariff equal to the cap. The above relationship no longer obtained. But a few suppliers have consistently offered variable tariffs with a saving of more than £100 on the default tariff cap. And those suppliers – principally Avro, Bulb, Octopus and So Energy – have consistently been in Division 1 of the OCS league since they first joined.

This seems an important and encouraging development. The OCS scores enable customers to identify those suppliers that offer higher satisfaction as defined by customer bodies and customers themselves. They can thereby better identify suppliers that it is worth switching to, and suppliers to whom it is worth being loyal. A bonus, then, is to find that customer loyalty to the highest scoring suppliers pays off more explicitly. The ability to switch is critical, but it is not necessary to keep changing supplier or tariff repeatedly to get the benefits of competition.

Stephen Littlechild is Emeritus Professor, University of Birmingham; Fellow, Cambridge Judge Business School; Associate Researcher, Energy Policy Research Group, University of Cambridge and was the first Director General of Electricity Supply (1989-1998).