Daniel Baker, director of the space weather laboratory at the University of Colorado at Boulder, warned that “for better or for worse, our space systems are moving toward commercial parts rather than radiation-hardened” components. Hence, “the changing nature of the electronics industry is moving us toward greater vulnerability in space.”
A looming problem is the susceptibility of space electronics to “deep dielectric charging,” in which high-energy electrons in the Earth’s magnetosphere penetrate shielding and bury themselves in dielectric materials. If a dielectric cannot bleed off this buried charge fast enough, the charge builds up until it discharges abruptly, causing satellite damage or failures.
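As a rough illustration of why material choice matters, the internal field in a uniformly irradiated dielectric tends toward E ≈ J·ρ, the deposited current density times the bulk resistivity, so more-resistive materials reach higher fields before leakage balances deposition. The sketch below uses purely illustrative, assumed numbers (current density, resistivities, breakdown field), not measured values:

```python
def steady_state_field(j_deposited: float, resistivity: float) -> float:
    """Internal field E = J * rho (V/m) once charge deposition balances ohmic leakage."""
    return j_deposited * resistivity

# Assumed storm-time deposited current density and breakdown threshold (illustrative).
J_STORM = 1e-10        # A/m^2
E_BREAKDOWN = 2e7      # V/m

# Two hypothetical materials: one with modest bulk resistivity, one highly insulating.
for name, rho in [("moderately conductive dielectric", 1e14),
                  ("highly insulating dielectric", 1e18)]:
    e_field = steady_state_field(J_STORM, rho)
    verdict = "discharge risk" if e_field > E_BREAKDOWN else "charge bleeds off safely"
    print(f"{name}: E ~ {e_field:.1e} V/m -> {verdict}")
```

Under these assumed numbers, the highly insulating material exceeds breakdown while the more conductive one stays safe, which is the trade-off Baker wants designers to weigh early.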
Using more shielding, or what Baker called a “brute force approach,” usually doesn’t work. Instead, he advocated that spacecraft designers focus early on the physics of the dielectric materials, an approach that would allow them to “choose wisely [while] going in with your eyes open.” Designers also need to consider how dielectric materials used in satellite electronics change over time.
On the ground, smart grids designed to add intelligence to the power grid also will be vulnerable to the effects of space weather. As recent power outages in the Midwest and East Coast have shown, power disruptions can create havoc. A disruption of smart grids “could really be bad for society,” Baker warned, especially a society as interconnected as ours.
Baker called “hardening” the smart grid the “poster child” for the age of space weather.
Indeed, the greater our reliance on networks, the more vulnerable we become to effects of space weather. A 2008 National Research Council report from a space weather panel chaired by Baker warned that given “the interconnectedness of critical infrastructures in modern society, the impacts of severe space weather can go beyond disruption of existing technical systems and lead to short-term as well as long-term collateral socioeconomic disruptions.”
Hence, agencies like the National Oceanic and Atmospheric Administration are now taking the effects of space weather into account in their forecasting models. Meanwhile, scientists like Baker are trying to learn as much as possible about the impact of space weather before the next solar maximum arrives. To that end, NASA launched a pair of space probes in August to study space weather.
The takeaway here is that chip designers, especially high-reliability (hi-rel) chip designers, will have to take the effects of solar flares in space into consideration when using vulnerable low-power materials. And they will have to account for how those materials might degrade over time.
Aside from purpose-built, short-life satellites, why on earth would any responsible satcom or science satellite builder NOT use rad-hardened systems and components? If those designers think that COTS is the solution for all applications, then they should find a different job...
If it is important, it had better be shielded. Let's also hope watchdogs are in place to properly manage the devices (i.e. keep them from going out of control) in the event that they don't experience a fatal blast, but only a temporarily disruptive one. Interestingly, I saw a recent article about NASA planning to send up some cheaper satellites controlled by Android phones (small in size and limited in scope).
This is old news for people in the business. The problem is all the "smart guys" who run some risk analyses and convince themselves that their COTS-cobbled payload (the bus guys, they're not so risk tolerant) will meet the mission success probability number, and sell that story to the program watchdogs and the insurance company, and launch brittle junk that fails as soon as one assumption is invalidated. Such as, when your once-a-decade LET=80 ion shows up the first week you're on orbit and you sold the story that parts that latch up only over LET=40 have an acceptably low probability of it happening. Then your power supply fries and your billion dollar bird commences to tumble uselessly.
You can't afford to loft enough shielding to take the critical particle rate to zero. The parts have to be good if you need certainty. But too many folks would rather believe the statistics (because it makes the cost story work), never mind that your sample size is one when you get down to the care-about and that one is damn expensive. 3/4 the cost and dead before payback is not a winner, but the jokers who make that call will be promoted, with any luck, before the proof comes back to earth in a flaming ball of space junk.
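The “acceptably low probability” argument above is easy to quantify with a back-of-the-envelope Poisson model. Assuming, purely for illustration, that an ion energetic enough to latch up a part rated only to LET=40 arrives once per decade on average:

```python
import math

def p_at_least_one(rate_per_year: float, exposure_years: float) -> float:
    """Poisson probability of seeing at least one event during the exposure window."""
    return 1.0 - math.exp(-rate_per_year * exposure_years)

# Assumed rate for illustration: one dangerous ion strike per decade on average.
RATE = 1.0 / 10.0  # events per year

p_first_week = p_at_least_one(RATE, 7.0 / 365.25)
p_ten_year_mission = p_at_least_one(RATE, 10.0)

print(f"P(hit in first week)      = {p_first_week:.4f}")       # ~0.002
print(f"P(hit in 10-year mission) = {p_ten_year_mission:.2f}")  # ~0.63
```

The first-week number looks reassuringly small, but over a full ten-year mission the same assumed rate gives roughly even odds of a strike, and the sample size, as the commenter notes, is one very expensive spacecraft.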
We saw something of a corollary to this problem during the powered descent of Curiosity on Mars on Aug. 6. The rover carried two wind sensors on "mini-booms". In the folded position on the rover's deck, one of the mini-booms was exposed to the rocket blast.
You guessed it: it eventually failed, most likely because one of its boards was damaged by Mars dust or pebbles kicked up by the rocket motors on the sky crane. The guys at JPL should have anticipated this, and put extra shielding on the exposed mini-boom.
So far, it's about the only thing they got wrong. The dust covers used to protect Curiosity's camera lenses all worked. The scientist in charge of the high-res camera said the other day that, for the engineers, the landing was "7 minutes of terror," but waiting to see whether the lens on her camera was protected from rocket blast was "30 days, or [Martian] sols, of terror...."
The take-away for me, from the NASA video, was that there's a tradeoff with CO2 (and NOx) that I was previously unaware of. An "inconvenient truth" for Al Gore, essentially.
As much as CO2 might act as a greenhouse gas inside the atmosphere, it also shields Earth from these solar storms. Which says to me, there's a balance between heat allowed in from the sun, and heat kept in.
Don't know exactly what the sensitivity of this mechanism is to CO2 levels. It would be really nice to see some of this "bigger picture" get some of the usual press hype.
As to the effects on electronics in space, these have been well known for some time, I think.