Fixing Unsharp Images
I was photo editor at a newspaper in the early 1980s.
We got all sorts of bad images, and the photographers never knew why. I became very good at figuring out why a photo wasn't sharp and helping the photographer get better next time.
Those days were simpler, since no computers were involved. Computers add their own issues on top of the photographic ones.
People presume unsharp images are caused by a defective camera or lens. 99% of the time it's photographer error!
Today's high resolution digital cameras are unforgiving of any error by the photographer. It takes a great deal of skill to extract every pixel of sharpness of which today's cameras are capable. Modern (2005 and newer) cameras have zero tolerance for photographer errors if you are looking for the greatest sharpness.
It is very difficult to see an image accurately, and it is equally difficult to make a sharp image.
First I cover getting your monitor or projector adjusted, then I cover camera issues. Getting monitors adjusted is a big potential problem often overlooked.
Look at digital images on a monitor. Prints add other factors which can obscure your ability to see the sharpness of your image itself. For instance, prints and printers will add or subtract sharpness.
ONLY VIEW AT 100%
Look at your images only at 100% on your monitor. Any other magnification obscures the image's own sharpness and misleads you into comparing software algorithms instead of the actual images.
Looking at camera images at 100% is usually much more magnification than printing an image. You'll see subtleties at 100% you'll never see in a print: many, many slight errors and sharpness-robbing effects that will never appear in print. At 100%, anything other than absolutely perfect technique will degrade visible sharpness.
Digital images viewed at 100% on-screen are much less forgiving of photographer errors than film cameras. Film requires a microscope. Digital gives the same level of analysis simply by going to 100% magnification in Photoshop. With film few people had microscopes: even picky photographers stopped at 8x or 20x loupes.
At 100% magnification a typical 2006 digital camera makes an image which would be about 3 feet (1 meter) wide when seen in its entirety. Viewing at 100% is a very critical test of sharpness unprecedented before today.
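Here's the back-of-envelope arithmetic behind that figure. A minimal sketch, assuming an 8 MP camera producing 3,504-pixel-wide images and a typical desktop LCD of about 90 pixels per inch; both numbers are my assumptions, not measurements:

```python
# Rough arithmetic behind the "3 feet (1 meter) wide" claim.
# Assumed values: an 8 MP camera producing 3504-pixel-wide images,
# shown on a desktop LCD of roughly 90 pixels per inch.
image_width_px = 3504
monitor_ppi = 90

width_inches = image_width_px / monitor_ppi
width_feet = width_inches / 12
width_meters = width_inches * 0.0254

print(f"Full image at 100%: {width_inches:.0f} in = "
      f"{width_feet:.1f} ft = {width_meters:.2f} m wide")
```

With these assumptions the full image comes out about 39 inches, a hair under one meter, wide.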
If you look at images reduced to fit your screen you can't see the finest details.
Every software program uses different resampling algorithms to remap the actual pixels to the ones on your screen. Every program will have different apparent sharpness depending on its algorithms. You're comparing your software, not your images, if viewing at less than 100%.
Worse, Photoshop uses different algorithms at different magnifications! Photoshop looks good at even reductions, like 50%, 25% and 12.5%, but jaggy at odd reductions like 67% and 33%.
The same applies to viewing at greater than 100%. At greater than 100% you introduce even more variables from your software.
Every program interpolates differently. Photoshop usually does nearest-neighbor, so you see lots of little square pixels when blown up to 400%. iView resamples depending on how you've chosen your preferences, so you get big square pixels, or big fuzzy blobs.
Any time you view images above 100% magnification, they can't look as sharp as they do at 100%.
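To see why resampling results differ by algorithm, here's a toy sketch (illustrative only; this is not the actual code of any viewer) of two common ways to downscale a row of alternating black and white pixels to half size:

```python
import numpy as np

# Finest possible detail: a row of alternating black/white pixels.
row = np.tile([0, 255], 8).astype(float)   # 16 pixels

# Downscale to half size two different ways:
nearest = row[::2]                 # nearest-neighbor: keep every 2nd pixel
box = row.reshape(-1, 2).mean(1)   # box filter: average adjacent pairs

print(nearest)   # all 0.0: the detail aliases to solid black
print(box)       # all 127.5: the detail averages to flat gray
```

Same image, two completely different on-screen results, and neither shows you what the pixels actually look like. That's the point of viewing at 100%.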
LCD: Always set everything to the LCD's native resolution
LCDs have fixed resolution. Look at the screen very closely and you can see all the pixels. They don't move. Your computer must be set to the monitor's native resolution, which is the actual number of pixels on the screen, otherwise pixels get lost in the translation.
Most computers and monitors allow you to select other resolutions, all of which lose sharpness. The default settings are usually OK, but many people screw with them and lose on-screen sharpness.
Laptops do this correctly by default.
Modern Apple Macs all do this correctly by default.
Modern desktop Windows PCs are often set incorrectly to a lower resolution than native. People buy 1280 x 1024 19" LCDs, but set the monitor settings to 1024 x 768 to make text easier to read and web pages look bigger.
When you select a lower-than-native resolution, your computer interpolates the smaller number of pixels of your chosen resolution to cover the larger number of pixels on the monitor. Computers never resample very well, and always lose sharpness in the process.
Always run at the native resolution of an LCD.
Projectors have native resolutions, just like LCDs. You must set your computer and projector to the native resolution of the projector.
This applies to all DLP and LCD projectors.
Do not use any electronic keystone correction! Electronic keystone correction mushes pixels around to make the projector spit out a trapezoidal image, which looks straight on screen. Guess what? Projector pixels can't move. Electronic correction mushes each pixel across one or more real pixels electronically, and softens everything. You only get perfectly sharp images when every pixel coming out of your computer goes to one, and only one, pixel on the monitor or projector.
The only correct way to fix keystoning is to mount the projector properly. Professional projector makers know this, and use lenses that move mechanically, just like a view camera, to correct without having to mutilate any pixels.
You can see what evil happens with electronic keystone correction if you project the test pattern below.
Synchronize Your Pixel Clocks - VERY IMPORTANT!
Here's a little-known and very important fact: the standard VGA monitor connector from your computer is an analog output!
Your computer converts digital pixels to analog voltages which appear at the VGA connector. They flow through your monitor cable to your LCD monitor or projector. Your LCD monitor or projector then has to redigitize these analog voltages back to digits and pixels in order to display the image! A lot can get lost in these two conversions.
If your monitor or projector is off even by fractions of microseconds, you'll smear pixels horizontally and lose half your sharpness! We inherited this problem from the VGA connectors designed to work with analog CRT monitors when we advanced to LCD monitors. Analog CRTs never had digital pixels.
You leave your computer alone, so long as it's set to the native resolution of the monitor. You probably have to tweak your monitor to get optimum sharpness. Projectors almost always need to be adjusted.
It is critical that the sampler in the monitor samples exactly at the same time as every pixel is sent from the computer. There is no automatic synchronization signal, since traditional CRT (tube) monitors never needed this. Because of this, modern monitors and projectors vary wildly in their ability to lock and synchronize to your computer.
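Here's a toy numerical model of what that timing error does. This is my own sketch, not real monitor circuitry: each mistimed sample is modeled as a linear blend of two neighboring pixels:

```python
import numpy as np

# The worst-case test signal: alternating black/white pixels.
signal = np.tile([0.0, 255.0], 50)

def resample(sig, phase):
    """Sample each pixel 'phase' of a pixel period too late,
    modeled as a linear blend of the two neighboring pixels."""
    return (1 - phase) * sig[:-1] + phase * sig[1:]

perfect = resample(signal, 0.0)   # phase-locked sampling
off = resample(signal, 0.5)       # half-a-pixel timing error

print(perfect.max() - perfect.min())   # 255.0: full contrast
print(off.max() - off.min())           # 0.0: contrast destroyed
```

A half-pixel timing error turns the alternating-pixel test pattern into flat gray, which is exactly what you see on a badly synchronized monitor.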
You need to find the setup menus in your monitor or projector and tweak them for every monitor or projector you connect. Once set, you'll see what a sharp image looks like and will know if you need to readjust them in the future.
Because these are analog signals you may have to readjust them again and again.
Good monitors do a great job of automatically optimizing themselves to your computer. Others require you to go into a setup menu and tweak the pixel clock until you get optimum sharpness. Look at text, or ideally a test pattern with every other pixel at 100% black or white. The image below is perfect: every pixel alternates like a tiny checkerboard. If you look closely you can see them. From a normal distance it will look smooth gray. If your pixel clocks are a little off it will look horrible, with lines or regions of light and dark!
You should be able to see every alternating black and white pixel in this test pattern.
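If you want to make such a pattern yourself, here's a minimal sketch that writes a one-pixel checkerboard as a grayscale PGM file. The 256 x 256 size and the file name are arbitrary choices of mine; only the Python standard library is needed:

```python
# Write a 256 x 256 one-pixel checkerboard as a binary PGM file.
W = H = 256
pixels = bytes((x + y) % 2 * 255 for y in range(H) for x in range(W))

with open("checker.pgm", "wb") as f:
    f.write(b"P5\n%d %d\n255\n" % (W, H))   # binary grayscale PGM header
    f.write(pixels)                          # one byte per pixel
```

Display it at exactly 100%. Resized even slightly, the checkerboard turns into gray mush or moiré, which also makes it a handy test of your image viewer.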
Hopefully your monitor or projector defaults to a good automatic adjustment, which will track and adjust the pixel clock without your intervention. Not all do.
ADC, DVI and Direct Connections
Laptop computers connect their internal monitors directly to the graphics card. They don't use external VGA connectors, so every pixel gets to where it needs to go. No problem!
Apple desktop monitors don't have this synchronization problem either. In about 1999 Apple invented a digital connector called the Apple Display Connector (ADC) to replace the old VGA connector. It sent the digital signal directly to the monitor, eliminating the foolish VGA connector and its problems with LCDs and projectors. All the Apple desktop monitors have had perfect digital clarity direct from the computer to screen ever since.
Windows PCs are adopting digital connectors, called DVI. If you use the DVI connector you also don't have to worry about synchronizing clocks. Apple today also uses the DVI standard.
I suggest anyone buying a projector for photography use only consider one with a DVI input and a laptop with a DVI output. All Apple laptops have DVI outputs. I can't say for Windows.
CRT monitors are easy. They are analog and aren't locked into any resolution, so no worries about synchronization. This is why the old VGA connectors never had any easy provision for synchronizing other kinds of monitors.
There's nothing to synchronize, digitize, or resample. That's the good news.
The bad news is that at typical resolutions they aren't as sharp as a well adjusted LCD.
If you want sharp, get any LCD, set your computer to its native resolution, and synchronize it well.
Keep analog VGA cables short. Higher analog frequencies (sharpness) are attenuated (made crummier) as the cable gets longer.
A six foot (2 m) cable is fine.
A 50 foot cable is asking for trouble. This is a big problem with auditoriums with lecterns and computers at the front and the projector in a booth at the rear. It's much better to put the computer next to the projector and control the computer remotely from the lectern.
Apple Display Connector (ADC) and DVI cables are digital and have no problem with sharpness. If they have a problem you don't lose sharpness: you lose the whole picture, or get weirder problems instead.
Digital camera pixels viewed at 100% aren't as sharp as scanned film or images that have been reduced. An on-screen image from a digital camera at 100% will never be as sharp as what you see from most other sources.
Scanned film, and images reduced to fit the web like those on my Gallery pages, have full red, green and blue resolution for every pixel. They look as sharp at 100% as they do reduced.
Digital cameras all start with fewer pixels than claimed and smear them around with Bayer Interpolation to simulate the claimed resolution.
Bayer interpolation reduces sharpness. All digital cameras do this. Therefore, you'll never see a digital camera image look as sharp at 100% as you see from reduced images like those at my Gallery pages.
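Here's a toy one-dimensional illustration of why. This is my own sketch of the principle, not any camera's actual demosaicing code:

```python
import numpy as np

# Scene with the finest possible detail: alternating dark/bright pixels.
scene = np.tile([0.0, 255.0], 8)     # 16 pixels

# A Bayer sensor records any given color at only every other photosite:
sampled = scene[::2]

# The missing values must be estimated from the recorded neighbors
# (nearest-neighbor fill, the simplest possible interpolation):
reconstructed = np.repeat(sampled, 2)

print(scene[:6])           # [  0. 255.   0. 255.   0. 255.]
print(reconstructed[:6])   # [0. 0. 0. 0. 0. 0.]: the detail is gone
```

Detail at the single-pixel level simply isn't recorded, so no interpolation, however clever, can bring it back.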
Here's an example of the most resolution you'll see with a digital camera image at 100%:
Roll your mouse over it to see how much resolution you could get if cameras used three CCDs and didn't need to use Bayer Interpolation.
Of course you could apply sharpening. That would make it sharper, but not increase the resolution. Here's the Bayer-interpolated image with added sharpening (150% at 0.3 pixels). Compare it to the non-interpolated image.
See my page on Bayer Interpolation for more information. These are as sharp as digital cameras get at 100% at their native resolution.
If you set your camera to a much lower resolution you can get non-Bayer-interpolated performance, with an image with half the number of pixels in each dimension. These look sharper at 100% on your screen, but of course have fewer pixels for printing.
If you've made it this far, you can now see the sharpness in your camera's images instead of seeing computer artifacts or worrying about artifacts inherent in all digital cameras.
Camera images become unsharp because of motion, diffraction at small apertures, focus errors, limited depth of field, autofocus system errors and lens quality. I'll cover each below.
MOTION
If you see sharpness that seems greater or weaker in one direction than another, it's probably caused by motion.
People move. Kids move. Everything moves. Blow up to 100% and even small amounts of motion lead to a lack of sharpness. To check, look at something motionless.
Hands jiggle. Tripods get blown in the wind. This smears the entire image.
VR (Vibration Reduction) and IS (Image Stabilization) attempt to reduce this when hand-held.
Slight blur that will be invisible in prints may be significant when enlarged to 100% on-screen.
SMALL APERTURES and DIFFRACTION
Apertures of f/8 and smaller blur the image due to diffraction. Anytime you look or photograph through small holes you get diffraction. Squint your eyes - that's diffraction! Look or photograph through a screen window or through a small f/stop and you get the equivalent of a soft-focus filter.
I have a page explaining Diffraction with these examples:
The biggest cause of soft images made on tripods is people blindly stopping down beyond f/8. This guarantees a softer image!
The best compromise between a small aperture for depth of field and a large aperture for better resolution becomes a complex compromise beyond the scope of this article.
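You can sketch the rule of thumb with the standard Airy-disk formula, d = 2.44 x wavelength x f-number. The 7.8 micron pixel pitch below is an assumed value, roughly that of a 6 MP DSLR:

```python
# Compare the diffraction blur (Airy disk) to an assumed pixel pitch.
wavelength_um = 0.55    # green light, in microns
pixel_pitch_um = 7.8    # assumed: roughly a 6 MP DSLR sensor

for f_number in (4, 5.6, 8, 11, 16, 22):
    airy_um = 2.44 * wavelength_um * f_number
    note = "diffraction-limited" if airy_um > pixel_pitch_um else "OK"
    print(f"f/{f_number}: Airy disk {airy_um:4.1f} um  {note}")
```

With these assumptions the blur disk outgrows a pixel right around f/8, which is why stopping down further softens the image.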
Focus is critical. I often see very slight errors in focus. Even a slight defocus will deteriorate sharpness. Look around the image: is something else in sharper focus? If so, your focus is off.
Focus only happens in a plane. You cannot have items at different distances simultaneously in perfect focus, unless you use the tilts of a view camera.
I have a page on How Focus Errors Rob Sharpness.
Depth of Field
There is no such thing as depth of field. Depth of field charts presume you'll permit a certain level of softness. If you're seeking the absolute maximum sharpness, there is almost no depth of field. See also Selecting the Optimum Aperture, since you can't stop down recklessly due to the limits of diffraction.
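The standard textbook depth-of-field formulas make the point numerically. This sketch uses illustrative values of my choosing (a 50 mm lens at f/8 focused at 3 m) and shrinks the permitted circle of confusion, i.e. demands more sharpness:

```python
# Depth of field shrinks as you demand more sharpness (smaller
# permitted circle of confusion, c). Illustrative values only.
f_mm, N, subject_mm = 50.0, 8.0, 3000.0

for c_mm in (0.030, 0.010, 0.003):   # lax -> strict sharpness standard
    H = f_mm ** 2 / (N * c_mm) + f_mm              # hyperfocal distance
    near = subject_mm * (H - f_mm) / (H + subject_mm - 2 * f_mm)
    far = subject_mm * (H - f_mm) / (H - subject_mm)
    print(f"c = {c_mm:.3f} mm: depth of field {(far - near) / 1000:.2f} m")
```

Permit a generous 0.030 mm blur circle and you get almost two meters of "depth of field"; demand pixel-level sharpness and it nearly vanishes.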
AF System Errors
At large apertures it's not uncommon to see consistent errors in AF systems. I hand-pick every lens and camera body, and reject any that don't give reasonably correct focus. Some lens and body sample combinations are consistently off. For instance, a large percentage of the first production run of D70s in 2004 were adjusted just enough off to be consistently out of focus at f/4 and larger.
Camera-brand (Nikon, Canon, Pentax, Minolta, Leitz, Zeiss, etc.) lenses are almost always the best.
Off brand (Sigma, Tamron, Promaster, Tokina, Quantaray, etc.) lenses usually work fine for most photography, as you can read at Why Your Camera Doesn't Matter.
Using camera-brand lenses becomes significant if you're looking for the absolute best sharpness. See my comparisons at my Digital Wide Zoom Report.
All lenses can be coaxed to make sharp images. The best lenses do it more consistently, and mediocre lenses often fall short. All the lenses I tried in my wide zoom report cost at least $500.
Heaven help people who think they can pop a $100 no-name lens on a top camera and get all the sharpness for which they paid. Sadly there are way too many of these people.
You could get away with this in the early days of digital photography (before 2005), but not today. High resolution DSLRs are extraordinarily critical of lens performance.
If you've read this far and still not fixed your issue, go try a camera brand lens. Honest, even the $150 Nikon 18 - 55 mm kit lens works great in my tests.
Camera brand lenses say so. Don't believe camera store salesmen who claim that any off-brand is made by a camera maker, or the even funnier lie that an off-brand maker makes the lenses for the camera maker!
Nikons take Nikkors, Minoltas take Minoltas and Rokkors, Canons take Canons, Mamiyas take Mamiyas, Contax and Hasselblad take Zeiss, Pentax takes Pentax, etc.
Off brands are great for taking pictures, but not for testing the limits of modern camera performance.
If you're not happy with the sharpness of your shots, check:
1.) Are you looking at 100% on a correctly adjusted monitor?
2.) Are your expectations reasonable for a Bayer-interpolated digital camera?
3.) Did everything hold perfectly still?
4.) Did you shoot at f/8 or larger? Shoot at f/16 or smaller and you're guaranteed a softer image.
5.) Are you in perfect focus?
6.) Are you using a lens made by the same maker as your camera?
I support my growing family through this website.
If you find this as helpful as a book you might have had to buy or a workshop you may have had to take, feel free to help me continue helping everyone.
The biggest help is to use these links to Adorama, Amazon, B&H, Ritz, J&R and eBay when you get your goodies. It costs you nothing and is a huge help to me. eBay is always a gamble, but all the other places have the best prices and service, which is why I've used them since before this website existed. I recommend them all personally.
Thanks for reading!