There is no denying that consumers, whether individuals or businesses, are driving the massive expansion of mobile data consumption supported by enhanced HSPA+/LTE networks and services. Any performance issue at the device or network level (and the corresponding customer dissatisfaction) quickly finds its way into widely circulated media coverage. Given the increased complexity of LTE (with its multitude of new enhanced features and requirements), rigorous testing of all equipment before market introduction or deployment is becoming even more important.
Comprehensive lab-based performance and interoperability testing of devices and network equipment, where real-world environments and scenarios are replicated, enables suppliers to develop and launch products that meet end-user expectations. It is no surprise that performance testing for a complex LTE environment has to be a multifaceted process. Any rigorous testing regime needs to consider interoperability, data rate and throughput testing, impact of signaling, audio quality, as well as antenna and radio performance.
Interoperability: Different countries have unique frequency band availability for LTE. Additionally, LTE devices must continue to interoperate with legacy standards when necessary. LTE equipment will need to undergo performance testing to ensure that it can work effectively in noncontiguous frequency bands and across several networks and countries, including support for data roaming.
Data rate and throughput: LTE-A standards will enable data rates of up to 3 Gbit/s per sector, delivering the equivalent of a fixed-line broadband experience to the end user. Achieving data rates up to 3 Gbit/s over a mobile network poses a huge challenge for the wireless industry and will require thorough pre- (and post-) deployment testing to ensure maximum throughput and end-user satisfaction.
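As a rough illustration (not from the article), the headline LTE-A figure can be approximated from first principles. The carrier width, modulation order, and layer count below are common example values for a single 20 MHz carrier, not a definitive configuration:

```python
# Back-of-envelope LTE downlink peak rate for one 20 MHz carrier.
# Illustrative figures only; real rates depend on coding rate,
# control overhead, and the exact UE category.
subcarriers_per_rb = 12   # subcarriers per resource block
resource_blocks = 100     # a 20 MHz LTE carrier
symbols_per_ms = 14       # OFDM symbols per 1 ms subframe (normal CP)
bits_per_symbol = 6       # 64QAM
mimo_layers = 4           # 4x4 MIMO spatial multiplexing

peak_bps = (subcarriers_per_rb * resource_blocks * symbols_per_ms * 1000
            * bits_per_symbol * mimo_layers)
print(f"raw peak ~ {peak_bps / 1e6:.0f} Mbit/s per 20 MHz carrier")  # ~403 Mbit/s
```

Roughly speaking, the often-quoted 3 Gbit/s LTE-A peak then comes from aggregating several such carriers and allowing up to eight spatial layers, before protocol overhead is subtracted.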
Signaling: The number of subscribers and applications keeps growing, which in turn generates huge amounts of data and signaling traffic between the handset and the network. With so many users and applications, it is imperative that service providers understand the potential impact of mobile applications on the performance of a device.
Audio quality: LTE has been predicated on the promise of high-speed mobile broadband access, but subscribers will still expect to make voice calls on their devices, so audio quality needs to be assured. The success of services like HD voice and voice over LTE (VoLTE) depends on device and network performance, and audio quality performance testing is a critical component of LTE testing.
Multiple antenna and radio configurations: With the advent of LTE, many new multiple input/multiple output (MIMO) antenna configurations have become part of the standard and, therefore, the testing regime. All these configurations must be tested under varying RF conditions and parameters to ensure quality of service.
With the rapid development of LTE standards and equipment, the market is focusing on performance testing in addition to LTE-specific conformance and interoperability testing. Many operators have cited performance as a defining element in the deployment of their network. If performance testing programs are designed to consider a wide range of performance issues, then LTE -- with its promise of superior levels of quality and performance -- will have a much better chance of successful deployment.
Sticklers will no doubt object to my claim that AM radio is 5 kHz. In the US, the RF channel width is 10 kHz, but AM broadcast stations transmit double sideband. That limits audio quality to 5 kHz in theory, and often to no better than about 3.6 kHz in practical receivers.
Also, during the day, it's possible for AM radio stations to expand their channel width. But in practice that buys you very little. Most AM receivers aren't set up to use the wider band. And at dusk, all of that goes out the window anyway.
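The arithmetic behind those AM figures is simple enough to sketch. The 3.6 kHz value is the rough practical receiver passband mentioned above, not a standard:

```python
# Nominal US AM broadcast bandwidth arithmetic (illustrative only).
channel_khz = 10.0                       # US AM channel spacing
theoretical_audio_khz = channel_khz / 2  # double sideband: audio is half the RF width
typical_receiver_khz = 3.6               # rough practical receiver passband

print(theoretical_audio_khz)  # 5.0
```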
As one who finds all voice telephony quality to be pathetic, whether landline or cellular, analog or digital, I'm also interested in seeing drastic improvements. I can only imagine how much more useful teleconferencing would be if we weren't tied to traditional telephony standards.
Since the days of analog telephones, voice bandwidth has been limited to a 4 kHz channel, which means a practical passband of about 3.3 or perhaps 3.5 kHz. That's as bad as most AM radio in practice, and worse than AM radio in principle (where audio bandwidth is 5 kHz in the US).
That's why voice telephony sounds like the other guy is covering his mouth all the time.
When digital telephony came to be, that 4 kHz channel width was deemed good enough, and the standard 8 kHz sampling rate used in digital telephony made darned sure that we wouldn't get anything better.
HD voice doubles that passband to 7 kHz, which should improve matters a lot. Skype, unhampered by any of this, can sound even better than HD voice (assuming a decent broadband connection and the latest codec on both sides).
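To make the sampling-rate point concrete, here is a small sketch of the Nyquist arithmetic. The passband figures in the comments are the conventional nominal values, not taken from this thread:

```python
def nyquist_khz(sample_rate_hz):
    """Theoretical audio bandwidth ceiling (kHz): half the sampling rate."""
    return sample_rate_hz / 2 / 1000

narrowband = nyquist_khz(8_000)   # classic digital telephony
wideband = nyquist_khz(16_000)    # HD voice codecs such as AMR-WB

print(narrowband, wideband)  # 4.0 8.0
# In practice the usable passbands sit below these ceilings:
# roughly 0.3-3.4 kHz for narrowband and about 0.05-7 kHz for wideband,
# which is where the "7 kHz" HD voice figure comes from.
```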
I am wondering about the differences in voice quality between voice over LTE and voice over 3G. Can anyone give us any pointers? And what about HD voice? How is that different from other voice?