Monitors © 2005 KenRockwell.com
CRT: Traditional Big, Deep, Heavy Monitors
CRTs are the huge, heavy, deep, traditional monitors. Most today have flat fronts. They are reasonably consistent from brand to brand; as long as you have never played with any of the settings, you probably won't need to calibrate one. They come out of the box pretty good but will drift over time.
CRTs, being analog vacuum tubes, vary so much that they have to be individually adjusted as part of the manufacturing process. That's good, because if you leave them alone as they were shipped you're OK. Heaven help you if you or your engineer neighbor twiddle with the RGB settings without a calibrator. There is usually no way to get back to the factory settings, and now you have to buy a calibrator.
Factory-fresh CRTs give more consistent color brand-to-brand than LCDs if you don't have a calibrator. I was always too cheap to buy a Spyder for my desktop's CRT, although working in television I did calibrate it with a $10,000 Philips professional monitor calibrator used throughout Hollywood to calibrate video monitors. I set my monitor to the video standard of 6,500 Kelvin and 30 foot-Lamberts of luminance. Photographers use Kelvin ratings but have no standard for brightness, which is measured in foot-Lamberts. Oddly, this $10,000 monitor tester and the $30,000 professional video monitors used in Hollywood usually only calibrate at three points: black, gray and white. By comparison, the inexpensive Spyder calibrates at hundreds of points for every color and shade of gray.
If you have a CRT and haven't played with any of the settings on it or your computer, you're probably fine for non-critical work without a calibrator. If you can't get a calibrator, then the software in the Mac OS (Apple Menu > System Preferences > Displays > Color > Calibrate) or Windows (Adobe Gamma) is fine.
LCD: Thin Flat Panels and Laptops
The good news is that once calibrated they rarely drift. The bad news is that every maker, and every sample, tends to look a bit different. Each manufacturer tries to make its screens more vivid than the next guy's. This looks great for web browsing, but it's horrible for creating photos since it's unlikely anyone else's screen or your printer will match it.
Therefore I suggest calibrating each and every LCD. The great news is they really don't drift. I have the cheapest laptop Apple made two years ago, a 12" iBook, and when I recalibrate it every other month I see no change.
I used to use a big, heavy standard CRT 22" Viewsonic PF815 I bought in 2001. Today I'm looking to get a big Apple Display to gain more space to work. LCDs are great, but you really ought to get a calibrator for consistency.
A big monitor allows you not only to see your image, but also to have enough room for all the palettes and menus on the sides at the same time. A small 17" monitor is perfect for email, the Internet and word processing, but for photography a bigger screen makes everything much easier, since you do need a lot of extra room on the sides of the screen for all the palettes. Of course when you are done working and blow an image up to full screen it's impressive; just remember the point of the big screen is to allow you to work on an image while still being able to see more than a few inches of it.
When shopping for monitors, remember that CRTs are always screwed up in the store display because everyone messes with the controls. The only way to tell anything is to buy one and bring it home. Monitor models change as often as everything else in computing, and each of Viewsonic, SONY, Barco, etc., has ten models in every size, so what I did, and what I suggest, is to call up each maker, tell them what you're doing and ask them to suggest a model.
LCDs usually look pretty much the same in the store as they will at home. LCDs don't have as many adjustments to screw up!
When you use any of the calibration tools they ask you for weird things like Gamma and color spaces. Which should you choose?
In TV, where I make my real money, we have had standards for resolution, color space and contrast for many decades. It's easy to make images that look pretty much the same on every TV since they all are shooting for the same standard.
Computers are different. There are so many standards for color spaces that effectively there is no standard.
But wait, there is one standard which is most common.
I strongly suggest, unless you have a very, very good reason to do otherwise, that you set your monitor to work in the sRGB color space, which means setting gamma to 2.2 and color temperature (white point) to 6,500K. By doing this you will be adjusting your images to the most common standard, and therefore it's most likely that your images will look just as good when seen on other monitors or when you have prints made.
My monitors are set to this and when I see my website on everyone else's computers the images pretty much look as they did at my house. That's the whole point of standards.
The default for most monitors right out of the box is a bluer color of 9,300K. When you set your monitor to 6,500K it will look a little duller and oranger, but live with it for a few minutes and you'll see that it's actually a lot more neutral than the 9,300K setting, which usually looks blue and cool.
I set my Mac to gamma (lightness or contrast) of 2.2 since it's the standard for the Internet. Macs default to a lighter looking gamma of 1.8 that is standard among graphics and printers, but since my medium is the Internet I stick to 2.2.
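To put some numbers on why gamma 1.8 looks lighter than 2.2, here's a rough sketch in Python using the simple power-law model (the real sRGB curve is a slightly different piecewise formula, but 2.2 is a close approximation):

```python
# Rough sketch: how display gamma maps an 8-bit pixel value to relative
# luminance. This is the simple power-law model, not the exact piecewise
# sRGB transfer curve.

def relative_luminance(pixel, gamma):
    """Convert an 8-bit pixel value (0-255) to relative luminance (0-1)."""
    return (pixel / 255) ** gamma

# The same mid-gray pixel (128) on a gamma 2.2 vs. gamma 1.8 display:
print(relative_luminance(128, 2.2))  # ~0.22: darker, the Internet standard
print(relative_luminance(128, 1.8))  # ~0.29: lighter, the old Mac default
```

The same file displays mid-gray about a third brighter on a gamma 1.8 system, which is exactly why images prepared on an old Mac can look dark and muddy on everyone else's 2.2 screens.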
Monitor color space and working color space are different. If you use a broader working space like Adobe RGB, then your monitor cannot display all the colors you may be able to represent in the file, which makes working in a broader space a bit silly.
Actually, I've tried the wider-gamut working spaces like Adobe RGB and quickly discovered that the range of my Epson printer is still the limiting factor, so after all the work everything still looks like it did before all the color space fooling around. I could explain for hours why theory tells us you can get broader color ranges with other than sRGB; in practice it's not relevant. Unless you are working in the professional publishing and printing press business, or know so much that you aren't even reading this page, stick with sRGB.
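You can see the monitor's limitation with a little arithmetic. The sketch below uses the commonly published D65 conversion matrices for Adobe RGB (1998) and sRGB (treat the exact digits as approximate); it shows that fully saturated Adobe RGB green lands outside what an sRGB monitor can show:

```python
# Sketch: why a wide-gamut color can't be shown on an sRGB monitor.
# Matrices are the commonly published D65 matrices for Adobe RGB (1998)
# to XYZ and XYZ to linear sRGB; the digits are approximate.

ADOBE_TO_XYZ = [
    [0.5767, 0.1856, 0.1882],
    [0.2974, 0.6273, 0.0753],
    [0.0270, 0.0707, 0.9911],
]
XYZ_TO_SRGB = [
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def mat_vec(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

# Fully saturated Adobe RGB green, in linear (pre-gamma) values:
green = [0.0, 1.0, 0.0]
srgb_linear = mat_vec(XYZ_TO_SRGB, mat_vec(ADOBE_TO_XYZ, green))
print(srgb_linear)  # roughly [-0.40, 1.00, -0.04]
```

The red and blue components come out negative, meaning no combination of an sRGB monitor's phosphors can make that green; it just gets clipped to the nearest displayable color, so you never actually saw the "extra" gamut you were editing in.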
I giggle when I read people getting all excited about all this. Regardless of how you set your monitor things pretty much work out so long as you have the brightness and contrast set correctly and nothing's broken.
Also, on Mac I suggest you go into Internet Explorer, go to PREFERENCES > WEB CONTENT > PAGE CONTENT and check USE COLORSYNC. Now your computer will correct all the images you see on the Internet to look the way they should. Sorry, the Windows version of Explorer has no such ability.
DVI and the Apple Display Connector
Few people realize that even today, 99% of the connections we all make between our computers and monitors are still the same old analog VGA connection. Yes, the VGA interface has always been an analog component format.
This is great for conventional CRTs, but a problem for LCDs and DLP monitors and projectors because LCDs and DLPs actually have individual pixels drawn on them, unlike a CRT where a beam just runs around on a tube. On an LCD or DLP one needs to address each and every pixel independently for the best sharpness.
Today's external LCD monitors have silly adjustments like "clock phase" on them to attempt to synchronize a sampling clock internal to the LCD monitor so that the LCD system can digitize and process the analog information. Because of this kludge you will often see slight horizontal blurring or jitter on external LCDs, since the LCD does not always digitize everything perfectly. (I used to be an applications engineer for LCDs, and also for the company that made the D/A and A/D converters used for all this.) It can't digitize perfectly because there is no way to synchronize the sampling clock to the pixel clock in your PC. It usually works OK, but not always, and usually takes some set-up time to adjust well.
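To see the scale of the problem, here's a quick sketch using the standard VESA timing numbers for 1024 x 768 at 60 Hz. Over analog VGA the monitor never receives a pixel clock, only sync pulses, so it has to reconstruct a very fast clock from a much slower one:

```python
# Sketch of the clock-recovery problem over analog VGA, using the
# VESA timing for 1024x768 at 60 Hz. Totals include the blanking
# intervals around the visible picture.

h_active, h_total = 1024, 1344   # visible pixels vs. total pixels per line
v_active, v_total = 768, 806     # visible lines vs. total lines per frame
refresh_hz = 60

pixel_clock = h_total * v_total * refresh_hz   # ~65 MHz
line_rate = v_total * refresh_hz               # ~48.4 kHz horizontal sync

# The monitor only sees the ~48 kHz sync pulses; its PLL must multiply
# that by h_total (1344) to guess the 65 MHz sampling clock. If its guess
# of the total or the clock phase is slightly off, samples land between
# pixels: the horizontal blur and jitter described above.
print(pixel_clock / 1e6, "MHz")
```

That 1344x multiplication is why the "clock" and "phase" controls exist at all, and why a laptop panel, driven digitally with no guessing, never needs them.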
Laptop screens always look good because laptops address each LCD pixel independently. There is no VGA cable needed since the monitor is part of the laptop computer.
Common sense tells us these DVI connections are a good thing since they offer better quality and save the need for today's redundant D/A converter in the computer and A/D converter in the monitor.
Apple Computer realized this years ago, and even my 2000-vintage G4 has a digital monitor output called the Apple Display Connector (ADC). It's a little different from DVI, but with the same intent. It even allows driving analog CRTs without any of the potential for ringing or other problems in a long VGA cable, since the D/A converters are in the Apple CRT monitors and the signal over the cable is digital.
HDMI, potentially used in consumer DVD players, is driven by Hollywood's concern over people duplicating DVDs if the data were sent in the clear to the monitor. Hollywood does not want a repeat of the CD, where so many otherwise honest people unwittingly became criminals since it's trivial to burn copies of music CDs. Burning copies of music CDs for your friends is a violation of federal law (title 17, United States Code) unless it really is just a backup for your own use. As the judge said in the Napster case, even if technology makes committing a crime easy, it's still a crime. Hollywood wants to ensure it's never trivial to duplicate your DVDs.