The power failure started in the afternoon -- my buddy and I came back from a late lunch to a dark house and a beeping UPS. So the satellites wouldn't have seen the blackout spread in daylight, but as night fell over North America they would have noticed a lot of missing cities. Some parts of Toronto didn't get power back for more than a week.
It's comments like this -- "What should have been a manageable local blackout cascaded into widespread chaos on the electric grid" -- that make me a) glad I have a generator and b) wish I were totally off the grid.
How real-time can this be if it takes 10 years to surface? This would have been a great story in 2003. But 2013? It's clearly product placement, and it doesn't even pass the red-face test.
I find the excerpt above especially ironic: that it may "perhaps" avoid a blackout. They had monitoring that "perhaps" may have avoided it in 2003, and look how that worked out. Frankly, monitoring is a necessary condition, but not a sufficient one. The real problem is that we have been running the grid open-loop forever. Phasor Measurement Units are finally being deployed so we can actually see what's going on, but measurement isn't closed-loop control -- just the first step on the way there.
Of course this is a little like the orchestra on the Titanic -- doing what we know how to do instead of something that can actually change the outcome. Our entire load base has moved to direct current, but we are still grappling with how to monitor (and maybe someday close the loop on control in) the AC domain. When do we move on to the inevitable DC grid and the integrated energy storage that we really need?
While going through the detailed report published on EDN, I noticed that it mentions:
"Operators were unaware of the need to re-distribute power after overloaded transmission lines hit unpruned foliage."
So, more than the "software bug" that is cited as the root cause of the failure, the reason for the incident looks to me more like improper documentation or training for the users of the software.
Excellent observation. Monitoring from space is a possibility. With the right imaging technology, the entire US or a magnified portion of it (e.g., the Northeast US and Canada) could have been monitored, and the lights could have been seen going out in sequence. Food for thought for power companies, Homeland Security, etc.
@Some Guy -- the 2003 blackout was the first time Genscape had this system in place. There have been many more blackouts since, of course, and the point here is not just that the technology exists, but who chooses to use it and to work with companies like Genscape to refine it into a "smart" system geared toward prediction and prevention. I have not seen anyone do this yet.
Companies like Genscape have the tools, but if power companies choose not to use them or to expand upon their effectiveness, then we will continue to have blackouts.
Hello Sanjib.A -- I believe you are correct. Training of the operators is paramount if utility companies are to minimize the effects of a substation failure or some other event that can trigger a "domino effect" and a blackout. The same goes for nuclear power plant generation operators.
Like most aircraft accidents and Three Mile Island, it was an accumulation of a bunch of "little" errors and a loss of situational awareness by the operators that led to it. Behind it was also a poorly managed energy provider -- what 8D problem solving refers to as the root cause of the root cause.