The DSO Incentive creates an uneven playing field for different-sized DSOs

Christopher Jackson, CEO and Co-Founder of Advanced Infrastructure

Is the DSO Incentive Scheme producing winners and losers among the UK's DSOs based not purely on performance but on size as well? Reports published over the past two years suggest this might be the case. While the framework is designed to reward improvements, the financial outcomes are directly linked to the size of the customer base, a factor beyond a DSO's control. As a result, two operators delivering similar progress can experience very different financial impacts, raising questions about how equitably the incentive operates across the sector.

How the incentive is calculated

Each year, Ofgem evaluates every licensed DSO's performance across Great Britain and determines whether they have earned a reward or incurred a penalty. The results are published in the Distribution System Operation Incentive Annual Report. The scheme is often discussed in terms of scores and rewards, but it also signals a broader shift in expectations – network operators are being assessed not only on reliability, but on their transparency, flexibility, and data provision.

The incentive is measured annually and built around two main assessments. The Stakeholder Satisfaction Survey measures how well each DSO engages with and responds to a variety of stakeholders' needs. The Performance Panel Review involves an expert panel evaluating evidence submitted by the DSOs, covering the development of flexibility markets, conflict management, data transparency, and delivery of benefits.

Is satisfaction a good indicator of performance?

An interesting pattern emerging from the results is the variance between satisfaction scores, panel scores, and financial rewards. Satisfaction scores tend to cluster relatively closely, likely because respondents steer away from being overly critical. Moreover, engagement experiences can feel broadly similar across operators. As a result, satisfaction alone does not always differentiate performance strongly. The results show that every DSO improved year-on-year; penalties are only expected when scores fall below a threshold of 7.5.

Stakeholder Satisfaction Survey scores

Licensee   2024/25   2023/24
UKPN       9.59      9.06
NGED       9.03      7.77
SSEN       8.53      7.42
EWNL       8.86      7.94
NPg        8.08      7.77
SPEN       9.02      8.13

Panel challenges lack of evidence

Panel scores show greater variance because they test capabilities rather than sentiment. The panel assesses evidence and processes, and DSOs are at different stages of developing these. Small differences in approach to data, transparency, or options assessment can lead to significantly different scores, and the panel often challenges claims that lack evidence, which drives the variation.

The panel results show a clear trend, with UKPN and NGED (the two largest operators) at the very top. Notably, all six improved over the course of the last regulatory period, suggesting an overall development in delivery and evidence quality. This is also the more dependable benchmark of performance because stakeholder selection is not fully transparent and may introduce bias.

Performance Panel Assessment scores

Licensee   2024/25   2023/24
UKPN       9.36      8.91
NGED       8.45      8.24
SSEN       7.81      7.59
EWNL       6.71      6.19
NPg        7.34      6.58
SPEN       6.08      5.08

Large variations in financial rewards

Financial rewards show the greatest variance of all. They are determined by the two assessment components and the size of the customer base served by the operators. Hence, small increases in satisfaction or panel scores can be worth more for larger operators like UKPN and NGED (each serving around 8 million customers) than for smaller DSOs, which serve roughly 3 million customers. The result is an uneven playing field: smaller DSOs have to deliver broadly the same upgrades but the financial upside is lower, even when performance improves by the same amount.
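To make the scaling effect concrete, here is a minimal sketch of how a size-linked reward could play out. The actual Ofgem formula is not reproduced in the published reports, so the baseline score, per-customer rate, and proportional scaling below are all illustrative assumptions, not the real methodology.

```python
def incentive_value(score: float, customers_m: float,
                    rate_per_customer: float = 0.5) -> float:
    """Hypothetical reward in £m: score relative to an assumed neutral
    baseline, scaled by customers served (millions) and a notional rate.
    Both the baseline and the rate are invented for illustration."""
    baseline = 7.5  # assumed neutral point: penalties below, rewards above
    return (score - baseline) * customers_m * rate_per_customer

# The same 0.5-point score improvement for two differently sized operators:
large_gain = incentive_value(8.5, 8.0) - incentive_value(8.0, 8.0)  # ~8m customers
small_gain = incentive_value(8.5, 3.0) - incentive_value(8.0, 3.0)  # ~3m customers
print(large_gain)  # 2.0 (£m, under these assumed parameters)
print(small_gain)  # 0.75
```

Under any reward mechanism proportional to customer numbers, identical score improvements translate into financial gains in roughly the ratio of the customer bases, which is the asymmetry the article describes.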

There is limited clarity in the published reports as to whether the scaling mechanism adequately adjusts for the differences in size between operators. This raises an important question: does the current policy design incentivise excellence equally across operators?

(Data Sources: 23/24 and 24/25 reports)

How LAEP+ is helping DSOs improve their performance

Over the course of the next assessment period it will be interesting to see what actions DSOs are taking to improve their scores. From our experience working with UK DSO teams, as reflected in their published reports, the direction is clear: more structured and data-driven planning that can be clearly evidenced against Ofgem’s criteria.

That is exactly where our LAEP+ software tool comes in. It helps network operators and local authorities align their understanding of future energy demand and infrastructure needs. By bringing together network and demand datasets, among others, LAEP+ enables more coordinated planning and clearer evidence for investment decisions. The shared environment for scenario modelling, data integration, and stakeholder collaboration helps DSOs explain their decision-making process. Supporting National Grid Electricity Distribution, UK Power Networks, Scottish and Southern Electricity Networks, and Northern Powergrid has shown us how this approach is becoming central to delivering transparent planning and stronger performance as electrification accelerates.

www.advanced-infrastructure.co.uk


This article appeared in the April 2026 issue of Energy Manager magazine.
