June 2008
I hope this explains what's really going on in HDTV.
You don't need to know any of this to enjoy a great picture and sound, but you will need to understand this to wade through the snake oil if you're shopping for equipment.
Feel free to skip ahead if anything gets too boring.
Off-Air Signals (antennas)
Few people use antennas. If you have cable or satellite, skip to Formats.
Standard (Analog NTSC) TV
Believe it or not, the on-air analog signal in the USA today is perfectly compatible with every TV ever made. The FCC mandates that, thank God. Have a set from 1941? It works perfectly from an antenna or plugged into your cable box.
There's just one catch: These signals go off the air on Tuesday, 17 February 2009. (People don't realize this, but these signals were originally slated to go off the air in 2001!)
Traditional TVs will get nothing but static from an antenna when real TV goes off the air in 2009. No big deal, since there are two ways around this:
1.) Almost no one gets TV from an antenna. Even my reactionary mom, who still refuses to have a microwave oven, cordless or touch-tone (pushbutton) phone in her house, has cable TV. (Mind you she only got it when New York City's World Trade Center, on top of which all the TV transmitters sat, was unexpectedly demolished in 2001.)
Most people have at least basic cable, which doesn't change. If you watch cable or satellite, nothing changes.
2.) If you really do watch with an antenna, there are "converter boxes" which will convert the new digital off-air signal so that you can watch it on your old TV. There are government hand-outs if you really want one. I'm told you can ask at Wal-Mart and Radio Shack and they know what's going on. (Low-power translator stations will stay on, so it beats me how the Feds are going to sell off the public airwaves as planned.)
There's a huge potential problem with converter boxes: the picture may no longer fit the screen and half your picture may be black, depending on what you're watching. You'll have to try it. Sorry.
3.) I'm serious that few people use antennas. When I was helping a station in Hawaii get on-air with digital, they admitted that if it weren't for the FCC mandate to run a digital transmitter, it would be cheaper for them to buy cable TV for the few people who would watch a digital off-air signal than to spend millions and millions of dollars on a new transmitter and pay the electric bills for the thousands of watts radiated from the TV tower.
Digital ATSC TV
Believe it or not, we've been broadcasting digital TV signals for free in all the major markets since the 1990s. No one knew, since very few sets can tune these in!
In a major blunder, the FCC still hasn't mandated all sets to have digital tuners.
In another blunder that set the industry back years, the FCC didn't bother to specify the format (resolution). It simply left stations to guess what sorts of TVs people had. Worse, it left all of us in TV studio equipment and program production lost, since we had a chicken-and-egg problem of figuring out what resolution to design the equipment for. This is why we've been on the air with digital for almost ten years, and few people know or care.
More about what you get on digital TV at my Source Material page.
A very important point is that digital broadcast TV doesn't mean high definition. Digital TV simply means digital, and stations may broadcast any formats and resolutions they wish.
The broadcast Digital TV (DTV) format provides for up to four standard definition signals (subchannels) to be broadcast on each channel.
29.97 vs. 30 FPS
Black-and-white TV ran at 30 FPS.
Due to some intricacies with harmonics of the new subcarrier introduced for color, the frame rate was reduced slightly to 29.97 FPS.
Thus in TV, 24 FPS is really 23.98, 30 FPS is really 29.97, and 60 fields per second are really 59.94 Hz.
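If you're curious, the arithmetic is simple: each "real" rate is the round number scaled by 1000/1001, the factor that came in with the color subcarrier. A quick sketch:

```python
# All the "real" NTSC-era rates are the round numbers scaled by 1000/1001,
# the factor introduced along with the color subcarrier.
ratio = 1000 / 1001

for nominal in (24, 30, 60):
    print(f"{nominal} is really {nominal * ratio:.3f}")
# 24 is really 23.976
# 30 is really 29.970
# 60 is really 59.940
```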
Don't laugh: in Hollywood we usually use the round numbers when speaking amongst ourselves, and we know what the real rates are. Outsiders from large corporations trying to break into TV have gone home and developed products that run at a literal 24 FPS, which were useless since the rate really is 23.98 FPS!
Consumers can forget about this. I'll use both versions throughout these articles depending on how tired my fingers get. Honestly, though, if you work in Hollywood you need to pay rapt attention to this, since things like drop frame or not can make or break a job.
Interlace (i) and Progressive (p)
Movies and TV don't move. Each is a series of still images. When enough pictures are seen in succession, things appear to move, like drawing pictures on pages of a book through which you flip.
TVs and movies would flicker like crazy if each frame were shown only once. The frame rates, about 24 FPS, are too low: our eyes would see each frame as its own blink.
TV pictures are made of horizontal scan lines. They are numbered from top to bottom. (In Hollywood we get personal and talk about each line by name, like "line 21" or "line 8," which aren't even part of the visible picture.)
Lock your fingers together in front of you. That's interlace. If you look at your fingers from top to bottom, you've got a finger from each hand as left, right, left, right, etc.
In conventional analog TV, we fixed flicker back in the first half of the 1900s by splitting each frame into two fields of either the odd or the even lines, and sending fields twice as often as frames. Sixty odd or even fields per second make 30 full frames per second. Each field draws alternating lines, just like your interlocked fingers. Sixty fields per second is fast enough that our eyes don't see any flicker.
TV cameras have fluid motion because they record in fields. You get time resolution of 60 fields per second, which is 2.5 times as smooth as movies at 24 FPS.
Progressive formats scan each line in order from top to bottom. These tend to have slightly better spatial (detail) resolution, while interlaced formats offer better temporal resolution (smoother motion).
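The interlocked-fingers idea above can be sketched in a few lines of Python. The line labels are illustrative only, not the broadcast line names:

```python
# A sketch of interlace: split one frame (a list of scan lines) into the
# odd-numbered field and the even-numbered field.
def split_fields(frame):
    odd_field = frame[0::2]   # lines 1, 3, 5, ...
    even_field = frame[1::2]  # lines 2, 4, 6, ...
    return odd_field, even_field

frame = [f"line {n}" for n in range(1, 9)]
odd, even = split_fields(frame)
print(odd)   # ['line 1', 'line 3', 'line 5', 'line 7']
print(even)  # ['line 2', 'line 4', 'line 6', 'line 8']
```

Draw the odd field, then the even field, and the two interlace on screen into one complete frame.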
You may ignore this as a consumer. Whatever program source you're watching is in only one format. These choices are made by the producers. This isn't a consumer choice, unless you're making your own shows.
SD TV is what we've had since 1941.
"High" definition is relative. The current standard definition systems were described as High Definition when they were developed in the 1930s, compared to the lower definition systems that came before. Those lower-definition 1920s systems were also called HD when they were invented.
Most of what's broadcast, even over HDTV stations, is simply upconverted SD.
In the digital domain (ITU-R BT.601), SD is 720 x 483 non-square pixels, but in the consumer world, laypeople call this "480." In professional broadcast, we call this "525" or NTSC. In the US, the standard signal has 525 total lines, but since you can't see the black lines above and below your screen, there are only 483 active lines with picture on them. Computer people shortened 483 to 480.
In many countries, PAL has been used instead of NTSC. That format is called 625 or PAL in professional use. I'm unsure what phrase is used in the consumer world.
High Definition (HD and HDTV)
The most common program formats are called 720p and 1080i. Engineers have argued over these for years, but they're each as good as the other as far as a consumer is concerned. I'll get to display formats later, which is what you consider when you get an HDTV. Read on for technical details, or feel free to skip ahead if this bores you.
720p is a format whose images have 720 horizontal lines scanned completely (progressively, the "p" in 720p) 59.94 times per second. Each horizontal line has 1,280 pixels to make a 16:9-shaped image 1,280 pixels wide by 720 pixels tall.
1080i has images with 1,080 horizontal lines delivered only 29.97 times per second. Only half the lines are sent in each field, 59.94 fields per second.
Each horizontal line has 1,920 pixels to make a 16:9 shaped image 1,920 pixels wide by 1,080 pixels tall.
720p versus 1080i
If you multiply the pixels by the complete frame (not field) rates, you'll discover each has about the same number of pixels per second.
This is why engineers can argue endlessly about which is best. Engineers will argue forever because they forget first to set the ground rules on which to base their pronouncements of "best."
Here's how to tell which is best technically for your production. 1080i sends a complete image only half as often as 720p, but each of those images have double the pixels.
720p excels when one needs to see details in motion, while 1080i is the ticket for most images, where details matter more when things are holding still. Since our eyes can't see details as well on moving subjects (try reading a moving newspaper), I'll vote for 1080i, as do most productions.
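Here's the pixels-per-second arithmetic from above, worked out in Python:

```python
# Pixels per second = frame size x complete-frame rate.
# 1080i delivers a complete frame only 29.97 times per second.
p720  = 1280 * 720  * 59.94   # 720p
i1080 = 1920 * 1080 * 29.97   # 1080i

print(f"720p:  {p720 / 1e6:.1f} megapixels/s")   # 55.2
print(f"1080i: {i1080 / 1e6:.1f} megapixels/s")  # 62.1
```

About 55 versus 62 megapixels per second: close enough that neither format wins on raw numbers alone, which is why the argument never ends.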
There is no such thing as Full HD, FHD or FHDTV. "Full HD" is just a marketing word.
720p and 1080i are both HD.
Most plasma, LCD and DLP TVs have only had resolutions of 1,280 x 720 pixels up until about 2008. None of them could fully resolve all the detail in 1080i programs, even though they are all "compatible" with 1080i because they can display pictures from those sources.
Instead of admitting the cheat, savvy marketing people turned yesterday's cheat into the opportunity to take more money from you today.
Now that sets with native resolutions of 1,920 x 1,080 pixels are becoming common, these marketing people created the euphemism "Full HD" to describe these native 1,920 x 1,080 sets, trying to make you think the 720p sets they sold you last year need to be junked. This is hilarious: before the 1,920 x 1,080 sets came out, the 1,280 x 720 resolution figures were always carefully hidden.
I cover the differences between 720p and 1080i sets at Picking the Best HDTV.
In Hollywood, we use 1,080/23.98p (1,080 lines, progressive scan, 24 frames per second) to create master transfers from film. Some Blu-ray players can support this output, which is elegant.
Movies, national TV ads and the good national shows are shot on film, and film runs at 24 FPS. (Talk shows, crappy shows, local TV and live events are shot on video).
It just happens that scanning the complete image progressively at 24 FPS allows us to convert these files to any of 1080i, 720p, 525 NTSC or 625 PAL without losing anything. This saves Hollywood from having to do four separate transfers of the same film!
Movie DVDs are created from these 24p transfers. Almost all DVD players have a "progressive scan" feature so that one can get about double the real resolution at home from a movie DVD compared to a DVD from a video source. This is one of the many reasons why even standard DVDs look fantastic on HDTVs.
There are a zillion other HD formats, resolutions and frame rates. Unless you run a post-production house where you need to ingest, convert and deliver programs and elements in every possible format, as a consumer you may ignore all of this.
The formats above subsample the color information. Hollywood often produces in a format with higher chroma bandwidth, called 4:4:4, at the same rates. You won't see these signals outside of Hollywood; they are used in the processes of creating filmed content, for instance, in digital intermediates (DI) used when shooting film, editing electronically, and returning back to film. It moves around on two coaxes called "dual link" or SMPTE 372M.
A program's content and how it was made is far more important to viewing quality than its technical format.
Program versus Display Formats
These are program formats. These choices are made by the producers.
All HDTVs can display any of these formats.
HDTVs vary slightly in resolution (720 or 1,080), but they all can show SD, 720 and 1,080 programming. I'll cover 720 vs. 1,080 displays under How to Pick the Best HDTV.
A telecine is a scanner used to convert the original film to video. All the good stuff you want to watch on TV is actually shot on film. Only crappy reality shows, talk shows, local shows and the news are shot on video. National ads and good weekly shows with potential for future syndication are shot on film.
In Hollywood, a telecine machine costs about $1,000,000, and is designed into its own room, the telecine suite, with color correction and monitoring tools.
It costs about $700 an hour to rent a telecine suite with an operator. That operator, the colorist, tends to get paid about $250,000 a year. He is a skilled artist.
It takes about ten hours of work to transfer one hour of film, thus a movie might cost about $10,000 to transfer.
Telecine is a very painstaking, precise and artistic process that's done frame-by-frame. It's where the magic happens that lets video shot on film look so much better than video shot on video cameras.
It's trivial to up-scale programs in conventional analog or digital SDTV to HDTV.
Many DVD players can do this, and your HDTV does this automatically to get SDTV pictures to fit its screen.
All this is doing is stretching smaller pixels into bigger, softer ones to fill a larger screen. It never makes the picture sharper or adds details that weren't there.
There's nothing wrong with upconversion, although it is not HDTV even though it will light the "1080 HD" light on your set.
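As a toy illustration, here's the crudest possible upconversion, nearest-neighbor stretching of one row of pixels. Real scalers interpolate more cleverly, but the principle is the same: no new detail appears.

```python
# Toy upconversion: nearest-neighbor stretching just repeats existing
# pixels to fill a bigger raster. No new detail is created.
def upscale(row, factor):
    return [pixel for pixel in row for _ in range(factor)]

sd_row = [10, 20, 30]
print(upscale(sd_row, 2))  # [10, 10, 20, 20, 30, 30]
```

Six pixels now, but exactly the same three values of information as before: bigger, softer pixels.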
Because of a lack of HD content, most HDTV stations and HD cable channels are just broadcasting upconversions most of the time.
Worse, for many live sporting events, not all the cameras used to create the show are HD. The SD cameras are each upconverted to HD before they hit the "switcher" which selects among the various cameras. Now when you watch the show, most looks OK, but some shots have an awful cartoon look to them.
When upconverted professionally, images have sharp outlines, but lack real details and textures, just like cartoons. This drives me crazy if a show switches among different cameras.
An upconverted image, either from the TV network, cable channel or your DVD player, is not HD.
Upconversion from non-HD images is cheating, but the real problem is with black borders.
Compression is what causes flocks of multicolored ants jumping around when the picture moves or changes. Compression is what causes all those little squares as a picture fades to black. Compression is what causes all the weird pixelations. It's not your TV, this is the garbage that comes over TV today.
Sadly, what we do in Hollywood stays in Hollywood.
Even if Disney wanted to send you a digibeta telecine master, you wouldn't want to pay for the deck, much less the blank tape (about $50) or the security you'd have to provide.
It costs money to move around a lot of bits. When you transport data, you pay by the bit. When you have to rent time on a satellite to move TV images around, you try to buy as few bits as you can without viewers complaining.
When you're a cable TV company, you try to jam as many channels down the cable as you can.
When you're a satellite TV company, you likewise try to use as few bits as possible to cram as many channels as possible over your digital satellite to sell to people's subscription services.
Broadcast digital TV has plenty of bits, but no one watches off-air, and most of the programming coming to the local TV station has already been compressed getting there.
Compression means throwing away most of the bits. A standard-definition picture, in full SMPTE-259M studio quality, needs 270 megabits per second. An HD picture needs well over 1Gb/s!
A DVD uses very expensive off-line, multiple-pass compression and can get the data rate down to about 5-10 megabits per second. These look great.
DVDs look almost identical to their master tapes. The worst I've seen simply have some noise around titles. Even back in the 1990s, I saw Hollywood demonstrations with a DVD and a D-1 telecine master played side by side on studio monitors, and they looked the same.
So far, so good. SD DVDs look great at 10 Mb/s.
The problem is in TV land, where you can't get 10 Mb/s. First, realtime compression doesn't look as good as offline multipass compression, so the same 10 Mb/s won't look as good as a DVD. Second, few people want to pay for 10 Mb/s lines to move programming from facility to facility anyway.
Your cable and satellite company probably crams each SD program down to only about 1.5 Mb/s. This is why we have noise and motion problems. When I watch cable TV, the picture quality is often about the same as the MOV files from my Casio EX-V8 pocket camera, which just happens to use about the same data rates.
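Here are the rough compression ratios those numbers imply, worked out in Python. The 8 Mb/s DVD figure is an assumed mid-range value from the 5-10 Mb/s range above:

```python
# Rough compression ratios implied by the rates above. Uncompressed
# SMPTE-259M SD is 270 Mb/s; 8 Mb/s is an assumed mid-range DVD rate.
STUDIO_SD = 270.0  # Mb/s

for name, rate in [("DVD", 8.0), ("cable SD", 1.5)]:
    print(f"{name}: about {STUDIO_SD / rate:.0f}:1")
# DVD: about 34:1
# cable SD: about 180:1
```

A DVD throws away about 97% of the bits and still looks great, because the compression is done slowly and carefully. Cable throws away well over 99%, in real time, and it shows.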
Want to know something that fooled all of us in Hollywood when we first started using data compression? Cartoons are the hardest to compress well! It turns out that sharp edges on flat backgrounds both excite a lot of edge noise, and make it obvious against the flat backgrounds.
If you want consistently great pictures, forget watching anything other than DVDs and carefully selected HDTV shows. Just because the show says "HD" doesn't mean it looks good.
Nothing moves in movies and TV. Each is a series of still images (frames) which look like they're moving when seen in rapid sequence.
Motion looks as it does depending on how something is shot, and how it's transmitted to your home.
Motion problems come from the level of MPEG compression in your source and how it got transmitted through a dozen links on its way to your set. Motion issues are almost never related to your choice of set.
Contrary to what people trying to sell you a TV say, your TV has nothing to do with rendering motion. All CRTs, plasma, LCDs and DLPs redraw the entire picture every time; there isn't anything to lag unless you're watching an LCD in an unheated igloo. If you are watching an LCD in an igloo, any lag goes away as soon as the set heats up from the backlight.
Pictures jump and aren't smooth for many reasons. Let's explain them.
Movies, national commercials and serious TV shows likely to have a long syndication life are shot on movie film at 23.98 frames per second. We usually call this 24FPS.
It's supposed to be a little jumpy, or have "judder," as transferred to video. Even in a movie theater (film or digital cinema), look at people's arms as they move around. It's not fluid, smooth motion. Arms become wide swaths.
24 FPS movies and 30 FPS video would flicker like crazy on the movie screen or TV. Therefore movie projectors have a double-bladed shutter that projects each frame twice, and TV is transmitted as two alternating, interlaced fields for each frame. Thus film projects as 48 blinks per second, and TV as 60 fields per second. Each TV field is either the odd or even numbered lines, and when they're drawn on the screen, they interlace, like the fingers of both hands being put together.
24 FPS film is transferred to 60-field video using 3:2 pulldown. This means alternating frames of film are converted to either 3 or 2 TV fields. (Pulldown means the film is pulled down to the next frame after being scanned for either 3 or 2 fields.) Now each 2 film frames, representing 1/12 of a second, create 5 TV fields, also representing 1/12 of a second.
This is great for broadcast TV, but if you stop-frame your VHS machine, this is why half of the time you get a frame where the action is wiggling back and forth. This happens when you're stopped on a TV frame which consists of fields from two different film frames, which is every other TV frame.
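The pulldown cadence above can be sketched in a few lines of Python:

```python
# 3:2 pulldown sketch: alternating film frames become 3 or 2 video fields,
# so 4 film frames (1/6 s at 24 FPS) become 10 fields (1/6 s at 60 Hz).
def pulldown(frames):
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

print(pulldown(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

Pair those ten fields into five TV frames and you get (A,A), (A,B), (B,C), (C,C), (D,D): every other TV frame mixes two different film frames, which is exactly the wiggle you see when you pause.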
This look is one of the reasons people shoot on film! In my more innocent days, I thought everyone wanted fluid, smooth, live motion images. Then I visited Filmlook in Burbank and realized why people pay them a lot of money to take smooth video and replace it with the jumpiness of film (as well as grain and everything else).
Movies and TV are art. They're trying to tell a story. TV, film and video are not life. The more direct and lifelike the images, the less likely our brain is to take us away to whatever fictitious place is being created in the film. A movie is a work of fiction and imagination. Film grain and jumpiness serve to remove the image from reality, after which our brains fill in the cracks.
This mental filling-in is how a movie can take us away from reality and off into its own world. It sounds silly to engineers, but remember, people pay Filmlook a lot of money to add these artifacts back in. Things shot on video look amateur, and people pay Filmlook to make smooth video capture look like it was shot on film.
Movie DVDs won't have fluid motion like your home videos. They aren't supposed to.
Sony's 120 Hz TVs
When watching video created from film, which is how the best programs are created, half the film frames get 2 fields and the other half get 3 fields. This can occasionally produce an effect called judder: slightly jumpy continuous motion.
Judder has been with us since television was invented. I've never had an issue with this effect.
Sony has some TVs touting a 120 Hz refresh rate. For film-based material, Sony claims that these sets are smart enough to re-order things so that each film frame is displayed for an equal period of five 1/120-second refreshes. In this way, each frame gets equal time on-screen, eliminating judder.
I haven't tried this. If I'm to guess, it probably makes very little difference to enjoying a movie.
Television shot on video cameras can have smooth perfect motion, so long as the cadence doesn't get screwed up somewhere in the lengthy and treacherous processes between the camera and your house.
Regardless of how something is shot, the data compression used in many stages between the source and your TV often adds all sorts of motion artifacts.
The reason live sports may have sucky motion or look horrible when there's a fast pan or zoom is because somewhere along the signal chain, someone set an encoder to throw away too much information. Probably several links along the chain aren't using enough data for picky viewers, so the picture gets worse at every stage.
MPEG compression is used for just about every path in the transmission chain. You pay by the bit for transport. Therefore you adjust your compression systems to use the lowest bit rate you think is good enough. Often, you figure no one can see, and set it lower to save money.
I'm not kidding; this data compression versus picture quality issue is a reason I sold PQA testers that cost $50,000 when I was at Tektronix.
Here's another thing not known to consumers: another way TV networks reduce bandwidth to save money is to lower the resolution of the image! This is only one of many settings in MPEG encoders. Instead of transmitting even the standard-definition resolution of 720 x 483, it's easy to drop the resolution and save money.
A picture doesn't go from the camera to your TV. It has to go about thirty different places and get compressed and decompressed many times along the way.
Tektronix' free "A Guide to Video Measurements" is the reference used throughout Hollywood to understand the HDTV signals used professionally. You also can find that information here. These references are used by people in the industry to figure out why signals are or are not plugging and playing in a studio or while designing equipment. These aren't likely of any use to a consumer.
John Watkinson: The Art of Digital Video. Don't let the word "Art" fool you: this is the standard reference book for design engineers. This is a book for people with engineering degrees, not consumers. If you want the best deep explanation of everything, this is the book that sits on my shelf. The fourth edition came out in April 2008. I have the second edition.
Charles Poynton's books are also excellent, but above the level of consumers.
SMPTE is the technical body which governs these technical standards. If you want to meet the guys who do this for a living, attend a local chapter meeting.
This website is how I support my growing family.
If you find this as helpful as a book you might have had to buy, please help me to continue helping everyone.
This page is free to read, but copyrighted. If you haven't helped me yet, and wish to make a print of this page, please help me with a gift of $5.00.
Thanks for reading!