Analog Compression Standard for European HDTV
With these systems, a scene could be recorded, played and edited immediately, and then transferred to film. As a consequence, many of the intermediate delays in conventional film production were eliminated. The new medium also offered a number of possibilities for special effects not possible in conventional film production. Following the introduction of HDTV to the film industry, interest began to build in developing an HDTV system for commercial broadcasting.
Such a system would have roughly double the number of vertical and horizontal lines compared to conventional systems. Now, the most significant problem faced with HDTV is exactly the same problem faced with color TV in 1954. There are approximately 600 million television sets in the world, and approximately 70% of them are color TVs. A critical consideration is whether the new HDTV standard should be compatible with the existing color TV standards, supplant the existing standards, or be simultaneously broadcast with the existing standards (with the understanding that the existing standards would be phased out over time). There is precedent for both compatibility and simultaneous broadcast.
In 1953, the US chose compatibility when developing the color TV standard. Although the additional chrominance signal caused some minor carrier interference problems, both monochrome and color TVs could, to a large extent, read the same signal. As an example of simultaneous broadcast, consider Britain. Monochrome broadcast began in Britain in 1936 with a 405 line standard.
In 1967, the 625 line PAL color standard was introduced. The color and monochrome standards then operated in parallel for nearly two decades. In 1986, when the 405 line service was finally terminated, so few 405 line monochrome receivers remained that it was seriously suggested that Parliament simply purchase 625 line receivers for the remaining 405 line viewers, as that would have been considerably cheaper than maintaining the 405 line service. (This amusing idea did not happen, however, due to possible political repercussions!)
II. Basic ideas for HDTV

The basic concept behind high-definition television is actually not to increase the definition per unit area, but rather to increase the percentage of the visual field contained by the image. The majority of proposed analog and digital HDTV systems are working toward approximately a 100% increase in the number of horizontal and vertical pixels. (Proposals are roughly 1 MB per frame, with roughly 1000 lines by 1000 pixels per line.) This typically results in a factor of 2-3 improvement in the angle of the vertical and horizontal fields. The majority of HDTV proposals also change the aspect ratio from 4/3 to 16/9 -- making the image more 'movie-like'. The following table summarizes a few of the more conventional analog HDTV proposals in comparison with existing TV systems [2].
(Note: the Grand Alliance and other fully digital proposals are not included in this table.)

Name    Prog. or   Total   Active   Vert.   Horz.   Opt. view   Asp.    Vert.    Horz.    Freq.
        inter.     lines   lines    res.    res.    dist. (H)   ratio   field    field    (MHz)
NTSC    p          525     484      340     330     5           4/3     12 deg   16 deg   4.2
PAL     i          625     575      290     425     6           4/3     10 deg   13 deg   5.5
SECAM   p          625     575      400     465     4.3         4/3     13 deg   18 deg   6

NOTE: The aspect ratio of the picture is defined to be the ratio of the picture width W to its height H. The optimal viewing distance (expressed in picture heights, H) is the distance at which the eye can just perceive the detail elements in the picture.
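As a rough check on the viewing-angle columns, the tabulated angles are consistent with simple flat-screen geometry: a screen of height H viewed from D picture heights subtends a vertical angle of about 2*arctan(1/(2D)) and a horizontal angle of about 2*arctan(AR/(2D)) for aspect ratio AR. The short sketch below (plain Python written for this article, not taken from any standard, and assuming that simple geometric relation) reproduces the table entries to within about a degree.

```python
import math

def view_angles_deg(dist_in_heights, aspect_ratio):
    """Vertical and horizontal angles subtended by a screen of height 1,
    width = aspect_ratio, viewed from dist_in_heights picture heights."""
    vert = 2 * math.degrees(math.atan(1.0 / (2 * dist_in_heights)))
    horz = 2 * math.degrees(math.atan(aspect_ratio / (2 * dist_in_heights)))
    return vert, horz

# Optimal viewing distances from the table above (in picture heights, H).
for name, dist in [("NTSC", 5.0), ("PAL", 6.0), ("SECAM", 4.3)]:
    v, h = view_angles_deg(dist, 4.0 / 3.0)
    print(f"{name}: vertical ~{v:.0f} deg, horizontal ~{h:.0f} deg")
# Comes out within about a degree of the 12/16, 10/13 and 13/18 deg table entries.
```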
III. Issues in HDTV

A. Bandwidth limitations

At best, one cycle of an analog video frequency can provide information to two pixels. (NOTE: This is at best -- it can easily be argued that one cycle only provides full video information to one pixel!) A conventional NTSC image has 525 lines scanned at 29.97 Hz with a horizontal resolution of 427 pixels. This gives 3.35 MHz (assuming 2 pixels per video cycle) as the minimum bandwidth needed to carry the video information without compression.
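The arithmetic behind these figures is simple enough to show directly. The sketch below is plain illustrative Python (not from any broadcast specification): multiply lines, pixels per line and frame rate, then divide by the number of pixels carried per cycle. Note that the roughly 18 MHz figure quoted for the HDTV raster discussed next corresponds to the pessimistic one-pixel-per-cycle case raised in the note above.

```python
def min_video_bandwidth_mhz(lines, pixels_per_line, frame_rate_hz, pixels_per_cycle=2):
    """Rough minimum analog bandwidth needed to deliver every pixel of every frame."""
    pixels_per_second = lines * pixels_per_line * frame_rate_hz
    return pixels_per_second / pixels_per_cycle / 1e6

# Conventional NTSC: 525 lines x 427 pixels at 29.97 frames/s, 2 pixels per cycle.
print(min_video_bandwidth_mhz(525, 427, 29.97))                        # ~3.36 MHz
# A 1050 x 600 HDTV raster at the same frame rate, assuming only 1 pixel per cycle.
print(min_video_bandwidth_mhz(1050, 600, 29.97, pixels_per_cycle=1))   # ~18.9 MHz
```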
If one decides to move to an HDTV image that is 1050 lines by 600 pixels (keeping the same frame rate), then this means a bandwidth of roughly 18 MHz. Clearly we have a problem here -- as the current terrestrial channel allocations are limited to 6 MHz! (As an aside, the word 'terrestrial' as used by TV people means conventional wireless TV transmission. This is to differentiate it from satellite or cable.) The options for terrestrial broadcast (assuming a 20 MHz bandwidth) are roughly as follows:

1. Change the channel allocation system from 6 MHz to 20 MHz.
2. Compress the signal to fit inside the existing 6 MHz bandwidth.
3. Allocate multiple channels (two with compression or three without) for the HDTV signal.

Options 1 and 2 are virtually incompatible with current NTSC service. About the only possibility for maintaining compatibility is simultaneous broadcast of NTSC information over certain channels and HDTV information over other channels. Option 3 does allow compatibility -- as the first 6 MHz of the signal could keep to the standard NTSC broadcasting and the remainder could carry an additional augmentation signal for HDTV. Typically, in this type of augmentation system, an existing VHF channel would be tied to one (or two) UHF channels.
The VHF channel would carry information similar to the current NTSC signal, and the UHF channel (or channels) would carry the augmented high resolution information.

B. Distribution -- terrestrial? satellite? cable?

Advocates for HDTV systems fall into two major categories. There are those who feel that these systems will ultimately be successful outside the conventional channels of terrestrial broadcasting. Equally vehement are those who think HDTV can and must use existing terrestrial broadcast channels. NTSC terrestrial broadcast channels are essentially 6 MHz wide. Service in a given area (roughly a 50 mile circle around the broadcast station) is typically offered on every other channel in order to avoid interference effects.
A relatively small range of channels is available (channels 2-69, occupying 54-88 MHz, 174-216 MHz, and 470-806 MHz). In 1987, the FCC issued a ruling indicating that the HDTV standards to be issued would be compatible with existing NTSC service, and would be confined to the existing VHF and UHF frequency bands. In 1990, the FCC announced that HDTV would be simultaneously broadcast (rather than augmented) and that its preference would be for a full HDTV standard (rather than the reduced resolution EDTV). These two decisions are very interesting, as they are almost contradictory. The 1987 decision clearly leans toward an augmentation-type format -- where the NTSC service continues intact and new channels provide HDTV augmentation to the existing service. The 1990 decision is a radical and non-conservative approach -- one which basically removes the requirement for compatibility by allowing different HDTV and NTSC standards to exist simultaneously for a period of years.
The NTSC service would then be gradually faded out as HDTV takes over. Now, the FCC does not have jurisdiction over channel allocation in cable networks. Thus, there is the rather interesting question of what the cable TV companies will do. They have a number of interesting options. They can continue to broadcast conventional NTSC, they can install 20 MHz MUSE-type HDTV systems (or other types of HDTV systems), or they can go with the digital Grand Alliance system. This presents the interesting possibility of two different HDTV standards, one for terrestrial broadcast and one for cable broadcast.

C. Interlaced versus non-interlaced
The maximum vertical resolution promised by a particular TV system is greater than the actual observed resolution. The reduction in resolution is due to the possibility of a picture element (pixel) falling 'in between' the scanning lines. Measurement gives an effective resolution of about 70% of the maximum resolution (the Kell factor) for progressively scanned (i.e. non-interlaced) systems. If the image is interlaced, then the 70% factor only applies if the image is completely stationary. For non-stationary interlaced images this resolution falls to about 50%. Interlacing also produces serrated edges on moving objects, as well as flicker along horizontal edges (glitter) and misaligned frames.
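A quick sketch of how these factors translate into delivered resolution (plain Python for illustration; the 0.7 and 0.5 factors are the figures quoted above, and the 484 active lines come from the NTSC row of the earlier table):

```python
# Effective vertical resolution = active scanning lines x resolution factor.
active_lines = 484             # NTSC active lines, from the table above
kell_progressive = 0.70        # progressive scan (or a perfectly stationary interlaced image)
kell_interlaced_moving = 0.50  # interlaced scan with motion in the image

print(active_lines * kell_progressive)        # ~339 lines -- matches the ~340 in the table
print(active_lines * kell_interlaced_moving)  # ~242 lines for moving interlaced material
```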
As a consequence of the many problems associated with interlacing, a number of the HDTV proposals are for progressively scanned (not interlaced) service. Notice that these apply both to new ideas for HDTV and to upgrades of the existing NTSC, PAL and SECAM systems as well. (Although initiating progressive scanning on conventional service does create compatibility problems, some of these techniques offer improved performance for NTSC/PAL/SECAM without the associated problems of moving to 'true' HDTV.)

D. Compression

Even if extra channel space is available -- it is usually not enough for the very wide bandwidths of HDTV. As an example, the current NHK satellite broadcast system in Japan (the only 'in-service' HDTV system) requires 20 MHz, but only has 8.15 MHz available per channel from direct satellite broadcast. Thus, some type of compression is typically required. Interestingly enough, although these compression schemes result in analog signals -- they are digitally implemented.
Thus, the line between 'digital' and 'analog' HDTV begins to blur.

1. Signal compression in the MUSE system

The MUSE system currently used for satellite HDTV service in Japan is a modification of the NHK HDTV standard for direct broadcast satellite service. The wide bandwidth of the NHK HDTV system is too large for the 8.15 MHz DBS service.
As a consequence, the signal must be compressed. The NHK HDTV signal is initially sampled at 48.6 Ms/s (megasamples per second). This signal feeds two filters, one responsive to stationary parts of the image and one responsive to moving parts. The outputs of the two filters are combined and then sampled at the sub-Nyquist frequency of 16.2 MHz.
The resulting pulse train is then converted back to analog with a base frequency of 8.1 MHz [3]. What is happening here is that the subsampling results in successive transmission of signals representing every third picture element. Thus, three adjacent picture elements in the receiver actually represent three successive scans of the same line. Stationary objects are not bothered by this, and appear at their full resolution. However, moving objects do not reoccur in their proper positions and create a smearing effect. This is not a real problem for moving objects within the scene (as the human eye is not very sensitive to this either).
However, it does present a problem during camera panning, where the overall image suffers about a 50% drop in resolution -- while the human eye, which tracks the pan and sees an essentially stationary scene, does not.

2. Signal compression in the MAC system

The MAC system was originally proposed as the analog compression standard for European HDTV. Under the original plans, HDTV broadcasts using MAC would have been standard by 1995. However, for a variety of reasons, MAC did not make it in Europe [4]. In fact, MAC has died so thoroughly that Europe may simply wait until the US develops an all digital HDTV standard and then use a 50 Hz modified version of it.
(As an aside, an interesting situation occurs with European HDTV systems. Peripheral vision is much more sensitive to contrast and movement than foveal vision. As a consequence, the 50 Hz field rate (25 Hz frame rate) has been found to be too slow. The edges of a 50 Hz HDTV image will flicker.
Thus, most European HDTV systems advocate 100 Hz.) However, in spite of the political death of MAC, the technological aspects of the compression are very interesting and worth knowing about. Basically, the MAC (multiplexed analog components) compression system fits the luminance and chrominance information into the horizontal line scan in a sequential way. In other words, the R-Y information is sent on one scan, and the B-Y on the next scan. The color difference and luminance information is sent in a time multiplexed fashion. Looking at the signal in time, the first part of the signal is audio information, followed by chrominance (R-Y or B-Y), followed by luminance [5]. Getting the signal into this form requires some serious digital processing.
Initially, the luminance, R-Y and B-Y signals are sampled and stored digitally. The luminance is sampled at 13.5 MHz and the color difference signals at 6.75 MHz. Then a 3/2 time compression is performed on the luminance and a 3/1 time compression on the chrominance. Finally, the three signals are read out to produce pulse trains, and then converted back into analog form.
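To make the time multiplexing concrete, the sketch below works through the line-timing arithmetic implied by those compression ratios. Only the 64 µs line period and the 3/2 and 3/1 ratios come from the text; the 52 µs active-line duration is an illustrative assumption.

```python
LINE_PERIOD_US = 64.0   # total horizontal scan time (from the text)
ACTIVE_LINE_US = 52.0   # assumed analog active-line duration before compression

# Time compression: reading samples out faster than they were written in
# shortens the duration of each component by the compression ratio.
luma_us   = ACTIVE_LINE_US / 1.5  # 3/2 compression on luminance   -> ~34.7 us
chroma_us = ACTIVE_LINE_US / 3.0  # 3/1 compression on chrominance -> ~17.3 us
audio_us  = LINE_PERIOD_US - luma_us - chroma_us  # what is left for audio/data

print(f"luminance burst:          {luma_us:.1f} us")
print(f"chrominance burst:        {chroma_us:.1f} us")
print(f"remaining for audio/data: {audio_us:.1f} us")
# Audio, chrominance and luminance together fit inside the 64 us line.
```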
The time compression resulting from this read-out allows the three components to be time-domain multiplexed so that they fit within the 64 µs horizontal scan time.

IV. MUSE -- or how the Japanese have gone toward HDTV

As of today, Japan is the only country actually broadcasting HDTV services. Approximately 30,000 receivers and 100,000 converters have been sold to customers of this service [6].
It is widely believed that the establishment of this analog broadcast service essentially eliminates the possibility of starting a digital satellite HDTV service in Japan. The history of this begins in 1968, when Japan's NHK began a massive project to develop a new TV standard. The resulting 1125 line system is an analog system which uses digital compression techniques. It is a satellite broadcast system which is not compatible with current Japanese NTSC terrestrial broadcast. (This actually makes a lot of sense for Japan, which is a compact group of islands easily covered by one or two satellites.)
The MUSE system as originally developed by NHK was a 1125 line, interlaced, 60 Hz system with a 5/3 aspect ratio and an optimal viewing distance of roughly 3.3 H. The pre-compression bandwidth for Y was 20 MHz, and the pre-compression bandwidth for chrominance was 7 MHz. As time has passed, this standard has been altered and upgraded. The various standard MUSEs are summarized below [7].

Name                  Lines per   Field   Y           C bandwidth   C bandwidth   Aspect
                      frame       rate    bandwidth   (wide)        (narrow)      ratio
NHK-1980              1125        60 Hz   20 MHz      7 MHz         5.5 MHz       5/3
MUSE 1986             1125        60 Hz   20 MHz      6.5 MHz       5.5 MHz       5/3
SMPTE 1987 (studio)   1125        60 Hz   30 MHz      30 MHz        30 MHz        16/9

In considering how to broadcast this signal, Japanese engineers immediately rejected conventional vestigial sideband broadcasting (i.e. broadcasting methods similar to NTSC). They jumped instead to the idea of satellite broadcast (no doubt helped by the geography of the Japanese islands, which economically supports satellite broadcast). The Japanese initially explored the idea of FM modulation of a conventionally constructed composite signal. (This would be a signal similar in structure to the Y/C NTSC signal -- with the Y at the lower frequencies and the C above.) Approximately 3 kW of transponder power would be required in order to get 40 dB of signal to noise for a composite FM signal in the 22 GHz satellite band [8]. This was virtually impossible for satellite broadcast!
So, the next idea was to use separate transmission of Y and C. This drops the effective frequency range and dramatically reduces the required power. Approximately 570 W of power (360 W for Y and 210 W for C) would be required in order to get 40 dB of signal to noise for a separate Y/C FM signal in the 22 GHz satellite band [9]. This is much more feasible! There is one more power saving, which comes from the character of the human eye. The lack of visual response to low frequency noise allows a significant reduction in transponder power if the higher video frequencies are emphasized prior to modulation at the transmitter and de-emphasized at the receiver. This method was adopted, with crossover frequencies for the emphasis/de-emphasis at 5.2 MHz for Y and 1.6 MHz for C. With this in place, the power requirements drop to 260 W (190 W for Y and 69 W for C) [10].
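The basic emphasis/de-emphasis idea can be illustrated with a toy digital filter pair: boost the signal's high frequencies before the noisy link, and apply the exact inverse at the receiver, so the wanted signal is recovered unchanged while noise added on the link is attenuated at the frequencies where the receiver rolls off. The sketch below is purely illustrative Python; the real MUSE emphasis networks (with the 5.2 MHz and 1.6 MHz crossovers quoted above) are analog filters, not this simple first-order stand-in.

```python
import numpy as np

def pre_emphasis(x, a=0.8):
    """Boost high frequencies at the transmitter: y[n] = x[n] - a*x[n-1]."""
    y = np.copy(x)
    y[1:] -= a * x[:-1]
    return y

def de_emphasis(y, a=0.8):
    """Exact inverse at the receiver: x[n] = y[n] + a*x[n-1]."""
    x = np.zeros_like(y)
    x[0] = y[0]
    for n in range(1, len(y)):
        x[n] = y[n] + a * x[n - 1]
    return x

signal = np.sin(2 * np.pi * 0.02 * np.arange(200))  # a low-frequency test tone
round_trip = de_emphasis(pre_emphasis(signal))
print(np.allclose(signal, round_trip))  # True -- the receiver recovers the original
```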
As mentioned earlier (see the section on compression) -- the problem of fitting the combined Y/C signal into the 8.15 MHz satellite bandwidth was solved by digital compression. Summarizing the previous discussion, the NHK HDTV signal is initially digitally sampled at 48.6 Ms/s. The resulting pulse train is then converted back to analog with a base frequency of 8.1 MHz [11].

V. The Grand Alliance -- all digital HDTV and where the US is going from here

A. History

In 1987, the FCC issued a ruling indicating that the HDTV standards to be issued would be compatible with existing NTSC service, and would be confined to the existing VHF and UHF frequency bands. By the end of 1988, the FCC had received 23 different proposals for HDTV or EDTV standards. These were all analog (or mixed analog/digital systems like MUSE) and explored a variety of different options for resolution, interlace and bandwidth.
On May 31, 1990, General Instrument Corp. submitted the first proposal for an all digital HDTV system. By December 1990, ATRC had announced its digital entry, followed quickly by Zenith and AT&T, then MIT. Thus there were four serious contenders for digital HDTV, as well as a modified 'narrow' MUSE and an EDTV proposal. During the following year, these systems were tested.
In February 1993, the FCC made the key decision for an all digital technology -- but could not decide among the four contenders. Therefore, after some fuss, a recommendation was made to form a 'Grand Alliance' composed of AT&T, GI, MIT, Philips, Sarnoff, Thomson and Zenith. This Grand Alliance would take the best features of the four systems and develop them into an HDTV standard. Most of the remainder of 1993 was devoted to establishing the features of this new standard. During 1994, the system was constructed, and 1995 is slated for testing. If all goes well, the FCC may be setting this standard by the end of 1995.

B. The basic standard

The Grand Alliance standard differs from all existing TV standards in three major ways.
First, it is an all digital standard -- to be broadcast with packet transmission. Second, it supports multiple formats. Third, it is designed to be primarily compatible with computers rather than with existing NTSC televisions. The various formats are summarized below [12].

Active lines   Active horizontal pixels   Aspect ratio   Scan          Frame rate (Hz)
720            1280                       16/9           progressive   24, 30 or 60
1080           1920                       16/9           interlaced    60
1080           1920                       16/9           progressive   24, 30

Spectrum reports that 'all the formats are supported with NTSC frame rates, 59.94 Hz, 23.97 Hz, and 29.97 Hz.'

C. Compression

The compression algorithms use both motion compensation and a discrete cosine transform (DCT) algorithm.
The motion compensation exploits temporal redundancy; the DCT exploits spatial redundancy. MPEG-2 syntax will be used -- because it is already well established, will aid in world-wide acceptance, and will smooth the road to computer and multimedia compatibility.
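As an illustration of how the DCT exposes spatial redundancy, the sketch below (generic 8x8 DCT-II arithmetic in Python/NumPy written for this article, not the Grand Alliance encoder itself) transforms a smooth image block and shows that nearly all of its energy collapses into a few low-frequency coefficients -- which is what makes coarse quantization of the remaining coefficients cheap.

```python
import numpy as np

N = 8
# Orthonormal DCT-II basis matrix: C[k, n] = alpha(k) * cos(pi * (2n + 1) * k / (2N))
n = np.arange(N)
C = np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
C[0, :] *= 1 / np.sqrt(2)
C *= np.sqrt(2 / N)

# A smooth 8x8 block (a gentle horizontal ramp), typical of natural image areas.
block = np.tile(np.linspace(100, 140, N), (N, 1))

coeffs = C @ block @ C.T             # 2-D DCT of the block
significant = np.abs(coeffs) > 1.0   # coefficients carrying visible energy
print(int(significant.sum()), "of", N * N, "coefficients are significant")
# Only a handful survive -- the rest can be quantized to zero with little visible loss.
```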
Audio will be supported by Dolby AC-3 digital audio compression, including full surround sound. The core of the Grand Alliance concept is a switched packet system. Each packet contains a 4-byte header and a 184-byte data payload, and carries either video, audio, or auxiliary information. For synchronization, the program clock reference in the transport stream provides a common time base. For lip sync between audio and video, the streams carry presentation time stamps that tell the decoder when the information occurs relative to the program clock [13].
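A sketch of what one such packet might look like. The 4-byte header plus 184-byte payload is from the text and matches the MPEG-2 transport packet; the specific header bit fields used below (sync byte, PID, continuity counter, and so on) are the standard MPEG-2 transport fields, used here as a plausible stand-in rather than a detail given above.

```python
import struct

PACKET_SIZE = 188   # 4-byte header + 184-byte payload, as described above
SYNC_BYTE = 0x47    # standard MPEG-2 transport-stream sync byte (assumed)

def build_packet(pid, continuity_counter, payload, payload_unit_start=False):
    """Pack one fixed-length transport packet carrying video, audio or auxiliary data."""
    if len(payload) > 184:
        raise ValueError("payload must fit in 184 bytes")
    header = (SYNC_BYTE << 24
              | (1 if payload_unit_start else 0) << 22
              | (pid & 0x1FFF) << 8
              | 0x10                      # adaptation_field_control = payload only
              | (continuity_counter & 0x0F))
    body = payload.ljust(184, b"\xff")    # stuff unused payload bytes
    return struct.pack(">I", header) + body

pkt = build_packet(pid=0x100, continuity_counter=0,
                   payload=b"example video data", payload_unit_start=True)
print(len(pkt))   # 188 bytes: one fixed-size packet of the switched packet stream
```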
The terrestrial transmission system is an 8-level vestigial sideband (VSB) technique. The 8-level signal is derived from a 4-level AM VSB signal; trellis coding is used to turn the 4-level signals into 8-level signals. Additionally, the input data is modified by a pseudo-random scrambling sequence which flattens the overall spectrum. Cable transmission is by a 16-level VSB technique without trellis coding.
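A simplified sketch of two of the steps just described: randomizing the data with a pseudo-random sequence, and mapping bit groups onto 8 signalling levels. The particular pseudo-random generator, the symmetric ±7 level set, and the omission of the trellis coder are illustrative simplifications for this article, not details taken from the Grand Alliance documents.

```python
import random

# Step 1: flatten the spectrum by XOR-ing the data with a fixed pseudo-random sequence.
def randomize(data: bytes, seed: int = 0xF180) -> bytes:
    prng = random.Random(seed)   # stand-in for the real scrambling-sequence generator
    return bytes(b ^ prng.randrange(256) for b in data)

# Step 2: map each 3-bit group to one of 8 amplitude levels of the VSB carrier.
LEVELS = [-7, -5, -3, -1, 1, 3, 5, 7]   # 8 evenly spaced levels (illustrative)

def to_symbols(data: bytes):
    bits = "".join(f"{b:08b}" for b in data)
    bits += "0" * (-len(bits) % 3)       # pad to a multiple of 3 bits
    return [LEVELS[int(bits[i:i + 3], 2)] for i in range(0, len(bits), 3)]

symbols = to_symbols(randomize(b"HDTV payload bytes"))
print(symbols[:12])   # first few 8-level symbols sent on the vestigial sideband carrier
```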
Finally, a small pilot carrier is added (rather than the totally suppressed carrier that is usual in VSB). This pilot carrier is placed so as to minimize interference with existing NTSC service [14]. The Grand Alliance system is clearly designed with future computer and multimedia applications in mind. The use of MPEG-2 will permit HDTV to interact with computer multimedia applications directly. For example, HDTV could be recorded on a multimedia computer, and CD-ROM applications could be played on HDTV systems.