Consider that the main priority is for the comm system to be reliable and understandable. A 2 kHz bandwidth is neither, although you do hear a lot of signals that narrow on the amateur radio bands, so it is probably a bit wider than 2 kHz. It is indeed probably compressed a bit to help it be clearer, and it may also be digitized and multiplexed. One other thing, far more important than fidelity: reliability.
In addition, by the time it gets to the broadcast people, it has probably been abused a whole lot on the trip down from space. Did you ever listen to digitized single-sideband? Not even close to natural. Then consider how rugged it has to be, and think: could you drive a nail with your cell phone as the hammer?
If we can get HD TV back from a geosynchronous bird, we could get decent audio from the shuttle--if we cared. The bottom line is that it does not affect the--you guessed it--bottom line.
It's just like the audio at the fast-food drive-up, which has been atrocious for 60-odd years. When will fast-food chains spend the money to improve it? When people quit patronizing their restaurants because they can't stand the crappy audio at the drive-up. And when will people quit patronizing their restaurants because they can't stand the crappy audio at the drive-up? Looks a lot like never, at this point.
How many congressmen, senators or NASA executives have been seriously inconvenienced by the quality of audio from manned missions? When they are, the sound will improve.
It might be interesting to know that phone manufacturers (my employer, in fact) have to inject "comfort noise" into their audio path. If we didn't, the idle pauses where no one is speaking would be very uncomfortable.
So there is something to having "noisy" audio - it does indeed affirm reality in the communication path.
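The comfort-noise idea above can be sketched in a few lines. This is a hypothetical illustration, not any real phone codec's implementation: the function names and the 0.01 noise level are my own assumptions, and a real system would use spectrally shaped noise signaled by the transmitter rather than plain white noise.

```python
# Sketch (assumed design): when the voice-activity detector suppresses a
# silent frame, the receiver fills the gap with low-level noise instead of
# dead silence, so the listener still hears a "live" channel.
import random

def comfort_noise(n_samples, level=0.01, seed=0):
    """Low-level white noise in [-level, +level]; a stand-in for shaped noise."""
    rng = random.Random(seed)
    return [rng.uniform(-level, level) for _ in range(n_samples)]

def receive(frames, frame_len=160):
    """Play received speech frames; substitute comfort noise for silent gaps.

    `None` marks a frame the transmitter skipped because no one was speaking.
    """
    out = []
    for frame in frames:
        if frame is None:
            out.append(comfort_noise(frame_len))  # fill the pause, don't go dead
        else:
            out.append(frame)
    return out
```

With one speech frame followed by a suppressed frame, `receive([[0.5]*160, None])` returns the speech untouched and a quiet-but-nonzero second frame, which is exactly the "affirm the path is alive" effect described above.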
And if you believe that NASA thought about this when they designed their interstellar radio, then there's this bridge I know...
No - I think it was just serendipity.
I have not heard it lately. It originated from the Motorola VHF equipment, which was naturally narrowband, exactly like all the other PM communications at the time. It is still the same, and might still be analog, but instead of VHF antennas on Earth, we now use TDRS. Apollo definitely used PM or FM communications, which were separate from the Unified S-band signals.
My question: why do we have such poor-quality cell phones?
The modern NASA network is highly digital. The only actual "radios" on the ISS or space shuttle are used for landing (in the case of the shuttle) or emergencies.
The audio quality (or lack thereof, if you wish) is all in the microphone. Any aerospace application will tend to use mil-spec, tried-and-true tech whenever possible, and the (mechanical) noise-cancelling microphone dates back to the 1950s or 1960s. These mics are inherently narrowband and have low dynamic range. They must be placed on the lips to work properly (move them just a few mm away and you can't hear the speaker). The good news is they work and are relatively cheap (emphasis on "they work").
There are several audio sources and paths on the space shuttle. Some audio sources are inherently better than others, but in this case it is the transmission path that determines the audio fidelity. The best path now is the high-bandwidth Ku-band satellite link. Recently it was upgraded with a high-fidelity handheld microphone, so the audio sounds as good as any media outlet on the ground; you have to watch closely for the microgravity effects in the picture to tell it is not taking place on the ground. However, this link is not always available everywhere on orbit.

The TDRS link is much more available. Included in that link are the air-to-ground intercom communications. They are indeed low quality and audio-bandwidth limited to 2 kHz, because the orbiter uses a 1970s-technology digital multiplexer and transceiver set that has limited data bandwidth. The intercom audio is actually digitized and multiplexed with other non-voice data and transmitted to TDRS or the ground via S-band radio. Due to the overall limited data capacity, the voice data had to be limited to the lowest rate possible. In the 1970s, psychoacoustic digital audio compression was neither common nor inexpensive to implement with limited flight hardware and processing resources. The easiest solution was to limit the audio sampling rate to about 4 kS/s, which results in poor audio fidelity while retaining intelligibility.

Nowadays, the same bit rate could yield higher-fidelity audio by using compression, a technology now found in every cell phone and MP3 player. However, the costs of testing and qualifying a change to the critical voice channel in flight hardware and software are at this point prohibitive. Hopefully NASA will embrace current audio compression technology and retain full-bandwidth audio for all channels, including intercom, on the next-generation spacecraft.
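The numbers in the post above hang together via the Nyquist limit: a ~4 kS/s sampler can represent at most ~2 kHz of audio. A quick sketch of that arithmetic follows; the 8-bit sample depth is my assumption for illustration (the post only gives the sampling rate), so the resulting bit rate is indicative, not an official figure.

```python
# Sketch: why a ~4 kS/s sampler caps intercom audio at ~2 kHz, and what
# raw PCM at that rate costs in bit rate. Sample depth is assumed.

def nyquist_limit_hz(sample_rate_sps):
    """Highest frequency a sampler can represent: half the sampling rate."""
    return sample_rate_sps / 2

def pcm_bit_rate_bps(sample_rate_sps, bits_per_sample):
    """Uncompressed (PCM) bit rate for one voice channel."""
    return sample_rate_sps * bits_per_sample

legacy_rate_sps = 4_000   # ~4 kS/s, per the post
assumed_bits = 8          # assumed sample depth, not from the post

print(nyquist_limit_hz(legacy_rate_sps))                 # 2000.0 -> the 2 kHz ceiling
print(pcm_bit_rate_bps(legacy_rate_sps, assumed_bits))   # 32000 b/s uncompressed
```

Under that assumed depth, the channel carries roughly 32 kb/s of raw PCM; a modern speech codec delivers noticeably wider audio bandwidth in well under that bit rate, which is the post's point about compression buying fidelity at the same data rate.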
Even if you designed every aspect of a high quality system for _FREE_, they would not upgrade it. The cost of the documentation alone is prohibitive. It's not for technical reasons, but bureaucratic. Stop thinking like an engineer :-)
The shuttle is based on 1960s technology with periodic upgrades to systems like avionics. As long as Houston has its telemetry down links, I doubt they worry much about audio quality. Remember, also, that many of the Apollo astronauts were amazed at how good comms was during lunar trips. About the only time there were problems with voice communications was during maneuvers like extracting the LM from the Saturn third stage en route to the moon. NASA has bigger fish to fry (like developing heavy-lift rockets) than upgrading audio.