Last year Magnus Jamieson wrote a blog for us asking how we could build a climate-resilient energy system. We asked him for an update on his research, and this time he has written about how much a power cut costs.
So how much does a power cut cost? Well, it really depends who’s asking. To answer, I’ll start with a simple example: if someone were to ask you what the “expected” value of a dice roll is, the answer can be very different depending on context and who is answering. If you’re playing snakes and ladders among children, to the pessimist it’s probably “the number that will take me to the mouth of the snake and lose me the game”. Playing Monopoly with economists, it’s “whichever value takes me to the tile with the most expensive rent in the game”. The “correct” answer is 3.5.
This seems counter-intuitive, but if you just add up all the values and divide by the number of possibilities, that’s what you get. This is also known as an “expectation”, and how we interpret it has profound implications depending on the context in which we apply it. In power systems, this is one way of helping to work out how much more stuff we need to build to prevent blackouts. Take a given component on the electricity grid – say an overhead line or a generator – count the number of outages it suffers over a fixed period of time, and divide by the length of that period to get an expectation of how often that device would fail under the same conditions over the same time frame.
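That arithmetic can be sketched in a few lines of Python. The die calculation is exact; the outage counts below are made-up numbers purely for illustration, not real grid statistics:

```python
# Expected value of a fair six-sided die: sum of outcomes / number of outcomes.
die_expectation = sum(range(1, 7)) / 6
print(die_expectation)  # 3.5

# The same idea for a grid component: count outages over an observation
# window and divide by the window length to get an expected failure rate.
# (Hypothetical numbers -- not real outage statistics.)
outages_observed = 12   # outages recorded on an overhead line
years_observed = 10     # over a 10-year record
failure_rate = outages_observed / years_observed
print(failure_rate)     # 1.2 expected outages per year
```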
To come back to the original question, then. In the UK this is described by a number known as the “Value of Lost Load”, typically shortened to VoLL. It is partly found by surveying lots of stakeholders and balancing factors such as how much people would be willing to pay to not lose their power, versus how much they’d be willing to pay to restore power if it was lost. In the UK, it works out at around £17,000 per MWh “lost”. For context, buying a “unit” of electricity (1kWh) costs a domestic user around 10p.
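Those two figures imply a striking ratio, which a quick back-of-the-envelope check makes concrete. The 50 MW / 4 hour outage at the end is a hypothetical scenario I’ve invented for scale, not a figure from any study:

```python
# VoLL quoted above: roughly GBP 17,000 per MWh of lost load.
voll_per_mwh = 17_000.0
voll_per_kwh = voll_per_mwh / 1_000     # 1 MWh = 1,000 kWh
retail_per_kwh = 0.10                   # ~10p per domestic "unit"

print(voll_per_kwh)                     # 17.0 -- GBP per kWh
# Lost load is valued at roughly 170x the domestic retail price:
print(voll_per_kwh / retail_per_kwh)

# Rough cost of a hypothetical outage dropping 50 MW of demand for 4 hours:
lost_energy_mwh = 50 * 4
print(lost_energy_mwh * voll_per_mwh)   # GBP 3.4 million
```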
This places heavy emphasis, then, on the accuracy of your “averages”. Depending on the time frame, the expected failure rate will naturally be of very different importance and accuracy. For instance, if you were to consider an evening during winter in the middle of a storm – say during storms Desmond or Doris – you would likely find that far more things break on the system than during a tranquil, breezy midsummer’s evening. However, if you were to take the expectation of the failure rate over the entire year, the outages that happen on the midsummer’s day would be weighted exactly the same as those during near-hurricane wind speeds and torrential rain, even though the impact of the one-day storm is vastly more significant than that of the other 360-odd days in the year which don’t experience such storms. “On average”, this is fine – but try telling that to the regulator when you lose thousands of customers and millions of pounds because a substation floods for 3 days.
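A toy calculation shows how annual averaging hides the storm days. All the numbers here are invented for illustration, chosen only so that a handful of storm days dominate the year’s outages:

```python
# Hypothetical year: 4 storm days with many failures, 361 calm days with few.
storm_days, calm_days = 4, 361
storm_outages_per_day, calm_outages_per_day = 25, 0.1

total_outages = storm_days * storm_outages_per_day + calm_days * calm_outages_per_day
daily_average = total_outages / (storm_days + calm_days)

print(round(daily_average, 2))  # 0.37 outages per day "on average"...
# ...yet a storm day actually sees 25 -- more than 60x the annual average.
print(storm_outages_per_day / daily_average)
```

The yearly expectation is technically correct, but it tells you almost nothing about the days that actually cause the damage.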
If we want to really understand how storms affect the power system, we need ways to model storms on the power system and to figure out how to mitigate or prevent these outages in a cost-effective manner. That’s where my research is focussed. I’m investigating methods which look at how storms impact the power system so that we can evaluate the different ways of reducing these impacts. Essentially, I simulate extremely stormy conditions and compare the energy lost to what we expect to happen on the system. Once we know how the system would react today, it allows us to plan for the future and optimise our solutions to prevent storm-related outages.
For instance, if there is a major storm happening between England and Scotland, we could reduce the power flow between Scotland and England and increase generation on either side in case a line is torn down by the wind. This might cost more in terms of dispatch, but there is less chance of a sudden loss of supply if a line does come down. How we find that balancing point between power flow and generation optimisation is an open question, as is the chance of a given line being downed by different storm conditions. And that’s where my work begins.
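One way to frame that balancing act is as an expected-cost comparison. This is only a toy sketch: the failure probability, energy at risk, and redispatch premium below are all invented numbers, and a real study would consider many lines and many storm scenarios at once:

```python
# Toy trade-off: pay a redispatch premium up front, or accept a small
# chance of losing load valued at VoLL prices.
voll = 17_000.0                # GBP per MWh, as quoted above
p_line_fails = 0.02            # hypothetical chance the line is downed
energy_at_risk_mwh = 500       # hypothetical load lost if it is
redispatch_premium = 50_000.0  # hypothetical extra cost of pre-emptive generation

expected_cost_do_nothing = p_line_fails * energy_at_risk_mwh * voll
print(expected_cost_do_nothing)  # ~GBP 170,000 expected loss

# Pre-emptive redispatch is worthwhile whenever its premium undercuts
# the expected loss of doing nothing:
print(redispatch_premium < expected_cost_do_nothing)  # True
```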
Essentially, I am modelling the effect of different scenarios on the UK network by feeding weather events into a model of the northern part of the GB network and comparing their effects. Using weather data from NASA, I’ve created different weather “days” representative of different conditions to feed into the model. They are shown below.
The aim is to compare the effects of different extremities of storms on different areas of the network, so that the effects of storms can be compared directly over space and time. Being able to “replay the tape” on such storms allows us to look at how we can prevent blackouts in the future and devise strategies that are cost-effective while still allowing us to watch TV on Christmas Eve if a storm takes out a line in Perthshire. If we have an idea of how many outages a given storm condition is likely to cause, we can plan developments and operation to minimise this, as well as having an idea of the kind of costs we’re going to be dealing with.
So, how much does a power cut cost? As much as you’re willing to pay to stop it or clean up after it, really.