Meanwhile, spacecraft like the Solar Dynamics Observatory, launched in 2010, and orbiting instruments like NASA's SABER (Sounding of the Atmosphere using Broadband Emission Radiometry) sensor have begun to unravel at least some of the sun's mysteries by detecting and recording the tremendous flares that drive space weather.
While better space weather forecasting will help, Baker said the best engineering approach is to find ways to harden vulnerable space systems and terrestrial networks. “The best thing would be to make yourself immune” to the effects of space weather, he added. “We’ve got to get on with it.”
Hence, finding ways to harden future satellite electronics along with existing power grids and other critical infrastructure will be the central focus of preparedness efforts as solar activity grows over the next 18 months.
Another approach involves developing self-repairing computers based on FPGA designs that can tolerate higher doses of space radiation, detect faults and remain in operation during heavy space weather. These “blocking technologies” could also be used to protect power grids and other susceptible networks on Earth. The purpose of these hardened components is to reduce the chances of a single-event failure that could shut down a navigation satellite or bring down part of the power grid.
While no one knows for sure, space weather forecasters are clearly worried about the extent of the next solar maximum. Hence the growing emphasis on boosting space weather forecasting. The pair of Radiation Belt Storm Probes launched by NASA on Aug. 30 represents the next big step in understanding and preparing for inclement space weather.
The twin satellites mark the first time NASA has launched a mission specifically to investigate the Earth's Van Allen radiation belts, layered rings of charged particles, or plasma, held in place by the Earth's magnetic field.
The probes will make detailed measurements of the radiation belts and how solar flares cause them to change and affect the upper portions of the Earth’s atmosphere. "The information collected from these probes will benefit the public by allowing us to better protect our satellites and understand how space weather affects communications and technology on Earth,” said John Grunsfeld, NASA’s associate administrator for science missions.
Just the same, better keep an umbrella handy when the solar max arrives as early as next year.
The take-away for me, from the NASA video, was that there's a tradeoff with CO2 (and NOx) that I was previously unaware of. An "inconvenient truth" for Al Gore, essentially.
As much as CO2 might act as a greenhouse gas inside the atmosphere, it also shields Earth from these solar storms. That says to me there's a balance between the heat allowed in from the sun and the heat kept in.
I don't know exactly how sensitive this mechanism is to CO2 levels. It would be really nice to see some of this "bigger picture" get some of the usual press hype.
As for the effects of space weather on electronics in orbit, those have been well known for some time, I think.
We saw something of a corollary to this problem during the powered descent of Curiosity on Mars on Aug. 6. The rover carried two wind sensors on "mini-booms." In the folded position on the rover's deck, one of the mini-booms was exposed to the rocket blast.
You guessed it: It eventually failed, most likely because one of its boards was damaged by Mars dust or pebbles kicked up by the rocket motors on the sky crane. The guys at JPL should have anticipated this and put extra shielding on the exposed mini-boom.
So far, it's about the only thing they got wrong. The dust covers used to protect Curiosity's camera lenses all worked. The scientist in charge of the high-res camera said the other day that, for the engineers, the landing was "7 minutes of terror," but waiting to see whether the lens on her camera was protected from rocket blast was "30 days, or [Martian] sols, of terror...."
This is old news for people in the business. The problem is all the "smart guys" who run some risk analyses and convince themselves that their COTS-cobbled payload (the bus guys, they're not so risk tolerant) will meet the mission success probability number, sell that story to the program watchdogs and the insurance company, and launch brittle junk that fails as soon as one assumption is invalidated. Say your once-a-decade LET=80 ion shows up the first week you're on orbit, after you sold the story that parts that only latch up above LET=40 have an acceptably low probability of it happening. Then your power supply fries and your billion-dollar bird commences to tumble uselessly.
You can't afford to loft enough shielding to take the critical particle rate to zero. The parts have to be good if you need certainty. But too many folks would rather believe the statistics (because it makes the cost story work), never mind that your sample size is one when you get down to the care-about and that one is damn expensive. 3/4 the cost and dead before payback is not a winner, but the jokers who make that call will be promoted, with any luck, before the proof comes back to earth in a flaming ball of space junk.
If it is important, it had better be shielded. Let's also hope watchdogs are in place to properly manage the devices (i.e., keep them from going out of control) in the event that they don't experience a fatal blast, but only a temporarily disruptive one. Interestingly, I saw a recent article about NASA planning to send up some cheaper satellites controlled by Android phones (small in size and limited in scope).
Aside from purpose-built short-life satellites, why on earth would any responsible satcom or science satellite builder NOT use rad-hardened systems and components? If those designers think that COTS is the solution for all applications, then they should find a different job...