Transcoding falls into three main types:
- Transcoding (in the traditional sense) -- involves the conversion from one compression format to another, such as converting MPEG-2 to H.264. This method involves the most changes to the original content: codec tools, image size, frame rate, and bit rate.
- Transrating of content -- using the same compression format, but lowering the bit rate of the original content to allow it to be transmitted, stored, or used by a less capable device.
- Transcoding of image parameters -- using the same compression format (same profile, different level), but reducing the original image size and frame rate to allow for playback on less capable devices.
Transcoding delivers several reductions, namely the following:
- HDD storage: For an STB with PVR functionality, transcoding MPEG-2 content to H.264 can cut the storage required by roughly 50%. MPEG-2 attains good quality at compression ratios of 30:1 to 50:1, while H.264 obtains similar quality at 60:1 to 100:1. Hence a 2x improvement in compression efficiency results in a 50% reduction in HDD storage space. Although the consumer will eventually fill any HDD, transcoding effectively doubles its capacity relative to MPEG-2, allowing the consumer to store and access more content on the PVR over time.
- Bandwidth utilization: For bandwidth-limited pipes, transcoding either allows more channels to be transmitted or enables transmission that would not otherwise be possible.
- Need for storing multiple files: AV media servers often store multiple copies of the same content at different image resolutions and bit rates so that viewers can access the version that suits their device's capabilities and available bandwidth. A familiar example is movie trailers, where the user is asked to select small, medium, or large as the desired viewing size. Transcoding in real time eliminates the need to store multiple files, reducing file storage on the server and sparing the server the complexity of deciding which file to transmit.
- Need for supporting multiple formats: Over time, new compression formats continue to emerge, both standards-based and proprietary; it is a moving target. With PC applications tending toward proprietary codecs and broadcast applications tending toward standards-based ones, a wide range of codecs is already in use. That range will keep growing as newer standards evolve, such as China's AVS and the ITU's follow-on to H.26x.
These reductions are even more pronounced with HD, where the savings from transcoding are substantial and, in cases such as bandwidth-limited pipes, enable services that would otherwise be impossible.
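The HDD-storage arithmetic above can be sketched as a small calculation. This is only an illustration of the compression-ratio reasoning, not an actual STB sizing tool; the function name and the 2-hour/4 GB example recording are assumptions for the sake of the example.

```python
# Illustrative sketch: estimate the H.264 file size for content currently
# stored as MPEG-2, using the compression-ratio ranges quoted above
# (MPEG-2 ~30:1-50:1, H.264 ~60:1-100:1 at similar quality).

def h264_size_gb(mpeg2_size_gb, mpeg2_ratio=40, h264_ratio=80):
    """Estimate H.264 size from an MPEG-2 file size.

    mpeg2_ratio / h264_ratio are compression ratios (raw:compressed);
    the defaults are the midpoints of the ranges cited in the article,
    i.e. roughly a 2x efficiency gain for H.264.
    """
    raw_size_gb = mpeg2_size_gb * mpeg2_ratio   # reconstruct the raw size
    return raw_size_gb / h264_ratio             # recompress at the H.264 ratio

mpeg2_gb = 4.0                    # a hypothetical SD recording stored as MPEG-2
h264_gb = h264_size_gb(mpeg2_gb)  # -> 2.0 GB: half the space, double the capacity
```

At the midpoint ratios the H.264 copy takes exactly half the space, which is the 50% HDD reduction (or 2x capacity gain) described above.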
A/V Container Formats and Network Protocols
But it's not just about transcoding. Devices may also use incompatible container formats to store and transmit their AV content. So even after proper transcoding, some devices may fail to interoperate because their container formats are incompatible, even though the AV media inside the container is compatible. Hence there needs to be a method to convert among the incompatible container formats of different devices, producing a common container format that one or the other can understand.
Yet there is still more to this problem. There are often incompatibilities in the transmission of the AV media, where the network protocols supported by different devices do not match. In particular, a media client connecting to a media server must be able to understand the format of the data they send and receive. Obviously the two must support the same protocol (e.g., HTTP, RTSP), and one or the other has to switch (i.e., protocol rollover) to the lowest common denominator of the protocols available on the two connecting devices. Yet even within the same protocol there can be incompatibilities, where certain modes or features are unsupported, preventing the content from being received and played back and resulting in a dissatisfied consumer.
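The protocol-rollover idea above can be sketched as a simple negotiation: each side advertises its supported protocols in preference order, and the session falls back to the best protocol both understand. This is a minimal illustration, not an actual DLNA/UPnP API; the function and protocol lists are assumptions.

```python
# Hypothetical sketch of protocol rollover between a media server and client.

def negotiate_protocol(server_prefs, client_prefs):
    """Return the server's most-preferred protocol that the client also
    supports, or None if the two devices share no protocol at all."""
    client_set = set(client_prefs)
    for proto in server_prefs:
        if proto in client_set:
            return proto        # roll over to the first common protocol
    return None                 # no common denominator: content cannot flow

# Server prefers RTSP streaming but can fall back to plain HTTP transfer;
# this client only speaks HTTP, so the session rolls over to HTTP.
session = negotiate_protocol(["RTSP", "HTTP"], ["HTTP"])   # -> "HTTP"
```

Note that even after this negotiation succeeds, per-protocol mode or feature mismatches (the second problem described above) can still block playback; a real implementation would negotiate those as well.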
Adaptive device functions
So to provide the consumer with maximum utility across all of his or her AV media devices, an adaptive device is needed that provides three primary functions:
- Transcoding -- to resolve incompatibilities in compression formats, display resolutions, memory capacity, and processing power of different devices.
- Adaptive container format capability -- to create the desired container format supported by the client device that is performing the video display and audio playback.
- Adaptive network protocol capability -- to allow for compatibility in transmission/reception protocols and modes among different devices to ensure the accurate and reliable delivery of AV media content from one device to another.
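The three functions above can be tied together in a sketch that, given a source media description and a client's capabilities, decides which adaptation steps the device must perform. The field names and capability values are illustrative assumptions, not a real device profile format.

```python
# Hypothetical sketch: plan which of the three adaptive functions are needed
# to deliver a given piece of content to a given client device.

def plan_adaptations(source, client):
    steps = []
    if (source["codec"] not in client["codecs"]
            or source["width"] > client["max_width"]
            or source["bitrate_kbps"] > client["max_bitrate_kbps"]):
        steps.append("transcode")          # codec, resolution, or bit-rate mismatch
    if source["container"] not in client["containers"]:
        steps.append("rewrap_container")   # same media, different container format
    if source["protocol"] not in client["protocols"]:
        steps.append("switch_protocol")    # roll over to a protocol the client speaks
    return steps

# An HD MPEG-2 broadcast stream headed for a small, H.264-only portable client:
source = {"codec": "MPEG-2", "container": "MPEG-TS", "protocol": "RTSP",
          "width": 1920, "bitrate_kbps": 15000}
client = {"codecs": ["H.264"], "containers": ["MP4"], "protocols": ["HTTP"],
          "max_width": 720, "max_bitrate_kbps": 2000}
plan_adaptations(source, client)  # -> all three steps are required
```

In this worst-case pairing all three functions are exercised at once, which is exactly the scenario a single adaptive device must handle.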
Figure 5 provides an illustration of how these interrelate to each other.
Figure 5: Interoperability among AV devices
The rest of this article addresses these three functions and describes how they can be implemented in a single silicon device. In particular, using Texas Instruments' DaVinci family of processors, a single system-on-a-chip (SoC) can enable a much-needed capability for future consumer products to enjoy a long life in the dynamic environment of the home. We will first look at a model of an AV media client and server and see how it is architected in a typical system.
Next: AV Media Client/Server Model, Mux and Demux