Introduction
When in the business of selling (or licensing) photos for use in print
(magazines, posters, catalogs, billboards, packaging, etc.), the very
first technical matter you'll face is deciding what resolution is
necessary to reproduce a quality image. Often, you'll hear the request:
"Send me a high-resolution image."
As you will soon learn, this tells you nothing. It's like saying to the
hair stylist, "Give me a haircut." You can imagine the response: "Okay...
How much do you want cut?" Similarly, perhaps you've heard this expression:
"I need that image at 300dpi."
Again, this is incomplete. That's like going to the grocery store
and saying, "I'd like to buy apples at $.99/pound," as you hold out
an open bag. The guy will just look at you blankly, waiting for
you to specify how many pounds of apples you want.
When it comes to both "resolution" and "DPI," the expressions are meaningless on their own, yet they have become so commonplace that the frustration only surfaces when the people who actually have to use the images (or pay for them) realize they should have been more specific. This raises some important questions: How do you know how to be more specific? What resolution is really necessary for any given printing need? What do you do when you're not sure? What do you do if you can't get or deliver the desired resolution?
Compounding the problem is the growing trend for licensed images to be priced on overall pixel count (or, "the size of the picture"), regardless of the size at which the image may be printed, or even which medium is used. This means that if someone requests an image at 300dpi to produce a 20x30" poster, they could pay twice as much for the image as they needed to if it turns out that their printing device reproduces images perfectly well at 150dpi. In essence, they could have licensed an image at half the size, for half the price, and gotten exactly the same quality result.
Definitions
Getting down to basics, DPI stands for "dots per inch," and it refers to
the number of pixels that represents one American "Inch." (Apologies to
those countries that use the infinitely more rational metric system.)
In the digital imaging world, DPI comes up in two major contexts:
input and output. Using more familiar terms, "input" means getting image
data from a device, like a film scanner, or a digital camera; "output"
usually means sending image data to a device, like a printer. The common
word in both contexts is "device." That is, every device that deals with
digital images must understand the media that it's working with, whether
it's reading from the media, or writing to the media. Translating
between the two so the image retains its integrity is where DPI comes in.
For example, in the first case, you may scan a photo from film, and
then send the resulting image to a printer. The film may be only an
inch wide, but you want to print it on a full-sized sheet of paper; you
need to know what the DPI is on the scanner, so you can translate it to
values that make sense for the printer. This will become clear very soon.
Historical Use of "high res"
In the days when scanning photos to create digital images was new, if someone asked for a film scan, they just got the highest resolution possible, because the cost of scanning any film was so high that getting anything less would be a waste of money. So, you would always ask for a "high res" scan. This would, in turn, be delivered to a client, who would then reduce it to whatever size they needed. Nowadays, digital images have greater value as their resolutions increase, which is changing the way images are priced. So, delivering a "high res image" could end up either costing the client a lot more than they needed to spend, or conversely, causing the supplier to give away more than he needed to.
But first, don't confuse "DPI" with the term "resolution." The two are often used interchangeably in discussion, but there is a huge difference in meaning. "Resolution" just refers to the total number of pixels in an image. That is, it's an absolute value. For example, one can say, "this photo has a resolution of 5000 pixels," which means that there are 5000 pixels in the whole image. You don't know whether it's square or rectangular, because you don't know how many pixels represent the vertical or horizontal dimensions. You just know it's got 5000 pixels. How many represent an inch is unknown, but it's also irrelevant; those are all the pixels you've got, so however you spread them along a canvas is your choice.
To make it very simple to understand, let's say we have an image that's 5000 pixels wide and 3000 pixels high. It has a "resolution" of 5000x3000. This resolution spec makes no mention of how many of those pixels represent an inch. It's just a fixed size. When looking at the image on film, the grains are really, really tiny, so there can be many thousands of "dots" per inch. Yet that same image can be shown with a slide projector onto a wall. The further back you go, the bigger the picture appears. What's going on is intuitive: the light projects the dots onto the wall, spreading them farther apart as the projector backs up. The image isn't changing, but the perceived size is. How many of those "dots" represent an inch changes as the projector moves, even though the original image isn't changing at all.
Let's apply this concept to a real-life scenario: the image below is
represented in two contexts:
Looking at these photos, you can see how the translation occurs. The 35mm frame of film has tiny dots, and the truck is covered with a huge sheet of paper that has big dots (that are also spaced apart). Where "DPI" comes in is when you're talking about how many of those dots represent an inch. On the slide, we assume we're going to scan it at the highest resolution possible. However, it's the output device that prints the paper that goes around the truck that we need to know about. How many "dots" make up an inch? If we don't know, we have no idea what to tell the printer to do. If we just feed it dots without specifying a DPI, we could end up with the image printed on only half the paper because the printer happened to choose an arbitrary value. The goal would be similar to that of setting up a projection screen on one side of the room, and a slide projector on the other. You want to fit the entire image onto the screen, so you push the projector back and forth till it fits. This is a way of adjusting the DPI in real time. But, for the truck, it could be wasteful and expensive to make a bunch of test prints with different DPI values till we have one where the image happens to match the paper size. Instead, we can just apply simple math: divide the number of pixels along one dimension of the image by the number of inches along the same dimension of the paper. That gives you the number of "pixels per inch."
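For readers who like the arithmetic spelled out, here is a minimal Python sketch of that formula, using the 5000-pixel-wide example image and a hypothetical sheet of paper:

    # A minimal sketch: DPI is pixels along one dimension divided by
    # inches along the same dimension of the media.
    def pixels_per_inch(pixels, inches):
        return pixels / inches

    # Example: spreading the 5000-pixel width across a hypothetical
    # 100-inch-wide sheet of paper.
    print(pixels_per_inch(5000, 100))   # 50.0 dots per inch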
While the formula above is technically correct, there is one catch to it: we don't know if the printer itself can actually produce a print at the final DPI value we came up with. To discuss this, let's return to our example image that has a resolution of 5000x3000. If you send that data to a printer that expects images at 300dpi, then the 5000 pixels in the horizontal dimension will produce an image that's about 16.7 inches wide. If the printer is going to make wallpaper, chances are that it only prints 50 pixels per inch of paper, in which case the same 5000 pixels will produce 100 inches of that image. If you're using the image on a billboard, that printing device probably interprets around 5-10 dots of that image per inch, which can yield pictures that span the widths of whole buildings. You can see this if you can find a billboard where you can get close enough (as I did, shown in the photo here). Thus, the question for the image printed on the truck paper is: what are the specifications for that device? Does it print at 10-20 dpi? More? Less? The point is, we need to provide a digital image that can have its pixels spread out that thinly, or, like the slide projector that gets too far from the wall, the image will just dissipate.
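To make those numbers concrete, here's a short Python sketch that spreads the same 5000-pixel width across devices with different DPI expectations (the device names and DPI values are just the illustrative figures from the paragraph above):

    # How wide the same 5000-pixel image prints at different device DPIs.
    image_width_px = 5000

    for device, dpi in [("magazine press", 300),
                        ("wallpaper printer", 50),
                        ("billboard printer", 10)]:
        width_in = image_width_px / dpi
        print(f"{device}: {width_in:.0f} inches ({width_in / 12:.1f} feet)")

    # magazine press: 17 inches (1.4 feet)
    # wallpaper printer: 100 inches (8.3 feet)
    # billboard printer: 500 inches (41.7 feet)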
The whole reason for specifying a DPI value is to adjust your image to match the specifications of the output/printing device. People who know this stuff don't just ask for a high-res image or a 300dpi image; they ask for an image either at a specific resolution, or they indicate the final output size along with the desired DPI. For example, a company that makes postcards might request images at 4x6" at 330dpi because they know that most postcard printers print images optimally at that resolution. If a European company asks for an image at size A4 at 200dpi, you can determine the pixel size by converting from metric to inches. (Although Americans have to first figure out what the heck "A4" means.) This translation is easily done in Photoshop's "New Image" dialog.
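If you'd rather skip Photoshop, the conversion is simple enough to sketch in a few lines of Python (the A4 dimensions of 210x297mm are standard; the DPI figures are the ones from the examples above):

    # Turn a requested print size plus DPI into required pixel dimensions.
    MM_PER_INCH = 25.4

    def required_pixels(width_in, height_in, dpi):
        return round(width_in * dpi), round(height_in * dpi)

    # Postcard: 4x6 inches at 330dpi
    print(required_pixels(4, 6, 330))                                   # (1320, 1980)

    # A4: 210x297mm at 200dpi (convert millimeters to inches first)
    print(required_pixels(210 / MM_PER_INCH, 297 / MM_PER_INCH, 200))   # (1654, 2339)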
The person who ordered the image for the truck didn't know, so I had to call the paper manufacturer to determine what DPI his device printed at. Turned out, it was about 24dpi. So, to calculate the size of the image I needed for the sheet of laminate that fits on the side of the truck, I multiply 168 (the truck's long dimension of 14 feet, expressed in inches) by 24 (the DPI of the printer) to get 4032. That's the number of pixels that my image needs to have in the long dimension to print properly on this truck. Assuming the image is 4032 pixels in the long dimension, and assuming I'm using an image shot with a standard 35mm camera (film or digital), then the aspect ratio is 3:2. That is, for every three pixels in the long dimension, there are two pixels in the short dimension. This means that the total dimensions of this image must be 4032x2688, which is roughly an 11-megapixel image.
The alert reader may notice that a 14x6 foot truck isn't the same aspect
ratio as a 3:2 photo, so some of the picture is cropped at the top and
bottom in order to fit onto the paper.
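Here's the truck arithmetic again as a small Python sketch, so you can see each step (the 14x6-foot panel, the 24dpi printer, and the 3:2 frame are the figures from the example above):

    # The truck calculation: feet -> inches -> pixels, then the full 3:2 frame.
    panel_long_ft, panel_short_ft = 14, 6
    printer_dpi = 24

    long_px = panel_long_ft * 12 * printer_dpi    # 168 inches * 24 dpi = 4032
    short_px = round(long_px * 2 / 3)             # 3:2 aspect ratio -> 2688
    print(long_px, short_px)                      # 4032 2688 (~10.8 megapixels)

    # The paper is only 6 feet tall, so the frame gets cropped top and bottom:
    print(panel_short_ft * 12 * printer_dpi)      # 1728 pixels of height actually used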
Scanned Film and DPI
Now, let's apply what we've learned to a scanned filmstrip. In this case,
DPI refers to the input device. In this case, that's a scanner. Consider
what you'd get if you scanned a frame of 35mm film at 300dpi, and
then printed that image on your home printer. Here, when you scan
film at a given resolution, you're picking up data from the medium
at that resolution. Doing the math, a 35mm frame of film is 0.9x1.6",
so a 300dpi scan yields a 270x340 pixel image. If you print that final
image on a home printer and don't change the DPI value, you would get a
picture that's .9" by 1.6", which happens to be exactly the same size
as the film. This experiment always surprises people, who expected
an enlargement. Given what we've learned, you know why this happened:
the input DPI was identical to the output DPI, which means that there
was no magnification done at all. It's a pixel-for-pixel reproduction.
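A quick Python sketch of why the print comes out film-sized (the film dimensions are approximate):

    # Input DPI equals output DPI, so there is no magnification.
    film_w_in, film_h_in = 1.4, 0.9     # a 35mm frame, roughly 36x24mm
    scan_dpi = 300
    print_dpi = 300

    pixels_w = round(film_w_in * scan_dpi)    # ~420
    pixels_h = round(film_h_in * scan_dpi)    # ~270

    print(pixels_w / print_dpi, pixels_h / print_dpi)   # 1.4 0.9 -- same size as the film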
"PPI" versus "DPI"
A related issue you may run into if you scan film is a similarly defined term, "PPI," which means "pixels per inch." There are some extremely obscure and obsolete historical references to how DPI and PPI are different, but even taking those into account, there is no real difference. Suffice it to say, the colloquial intentions of both are identical. However, you may see references to PPI when using scanners, which is where the term originated. If anything, the two terms can be helpful for differentiating between the resolution at which you want an image scanned versus the resolution at which you want it delivered (to a printer, for example). While rare, you may hear someone say, "scan the film at 4000ppi, and deliver it to me at 300dpi." Here, you can easily see there are two different measurements going on, whereas if "dpi" were used in both cases, novices might get confused.
One thing you could do is change the DPI value on the scanner from 300dpi to, say, 3000dpi. Doing the math, 3000 is ten times 300, so you'll end up with ten times as many pixels in each dimension (a hundred times the total pixel count of the first scan). Accordingly, it seems like the print would be ten times bigger when you put it on paper. Will it be?
It depends on two things: first, whether you tell your printer to
ignore whatever DPI value is embedded in the image and print at the
value you tell it to. Second, whether you changed the DPI value on
the image itself, before you sent it to the printer. You can do the
first option if you use software that supports the feature, such as
Adobe Photoshop (see the "Print Preview" screen). Most people overlook
this (or don't have it available in their software), so they are subject
to the often unpredictable results from the second condition: what the
embedded DPI value is in the image. Because you scanned at 3000dpi,
the image will still have that "3000dpi" value embedded in its data
header. Hence, the generic printer will comply, "Ok, I'll interpret
the image so that every 3000 pixels represents one inch," which will
give you exactly the same result: a small print.
So, the thing to do is change the DPI value after the scan to the value you want the printer to use. Choosing 300dpi, you'll get an image that's roughly 9x14 inches and looks pretty darn good. But it will also likely exceed the size of your paper. To reduce the image to a size that fits on a page, you can do one of several things (the math for the first two options is sketched after the list):
- Change the DPI value to a new value where the printer will fit it onto an 8½x11 sheet of paper.
- Resize the image (in Photoshop, for example) to reduce the total pixel count ("resolution") to a size where 300dpi fits onto the paper.
- Rescan the image at a resolution that produces exactly the number of pixels you need.
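For the first two options, the arithmetic looks something like this (a sketch assuming the roughly 2700x4200-pixel scan from above and a standard 8½x11 sheet):

    # Fitting a ~2700x4200-pixel scan onto an 8.5x11-inch page.
    scan_long, scan_short = 4200, 2700     # pixels
    paper_long, paper_short = 11.0, 8.5    # inches

    # Option 1: leave the pixels alone, raise the DPI until the image fits.
    fit_dpi = max(scan_long / paper_long, scan_short / paper_short)
    print(round(fit_dpi))                  # ~382 dpi

    # Option 2: keep 300dpi, shrink the pixel count until it fits.
    target_dpi = 300
    scale = min(paper_long * target_dpi / scan_long,
                paper_short * target_dpi / scan_short)
    print(round(scan_long * scale), round(scan_short * scale))   # ~3300 x 2121 pixels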
While you can do any of these, let's get back to the practicality of running a photo business. The first option of changing the DPI value is not desirable because you're just throwing more data at the printer than it needs, which it will duly throw away, and you don't want the dumb printer to decide which pixels it should keep and which it shouldn't. What's more, you don't want to give more image data to a client than they paid for; it's like giving someone five quarters when they ask for change for a dollar. You're just giving away money. As for the third option of rescanning the photo to fit the desired size, that can work, but then you have some practical considerations: as an image service provider, you don't want to have to rescan film whenever someone asks for it in a different size. This not only wastes time and effort, but you also have to re-calibrate that digital image to be visually identical to the picture the client saw. This is not only difficult, it's often infeasible. You always want to scan pictures at the highest resolution possible, save a master copy of the image to disk, and then make a copy with the size adjusted as needed for any given client.
If you think this is confusing, this is exactly the kind of headache
you're going to give someone else if you deliver them the wrong sized
image. And that will happen if you don't get the full and appropriate
specifications from them in the first place.
Tricky and Sticky
Where things get tricky and sticky is determining the real DPI values for any given device. There is no universal DPI that's consistent across all printing platforms, just as there is no one universal hair length for "give me a haircut." Nor is there a preset number of apples you get if you ask for them at $.99/lb. Although DPI values vary dramatically, most people who request images have no idea what their device actually needs, but "300dpi" is the de facto standard for historical reasons. Since most printing systems produce consistently adequate results from 300dpi files, there's not a huge incentive to fight the system. So, when a graphic designer insists that you provide an image at 300dpi, even though they don't really need it, it's sometimes best to let them have it. (You still need to know the final output dimensions in inches, though!)
The exceptions to this come in when costs are involved (the client wants
to spend less money), or it's physically impossible to meet the specs.
We'll illustrate these scenarios in the next section.
Q&A Exercises
Let's test your knowledge of the information we've covered:
You've sold a photo to a magazine that wants to use it in an article. They ask you for a digital image that's 300dpi. What do you give them?
There is no correct answer to this until you know what their final
desired output size is. If they say they're going to print a quarter-page,
you still don't know. You need a specific size in some measurement,
preferably inches, so you can apply the "300dpi" part. (If you get mm
or cm, you have to convert to inches first.) Assuming they're requesting
the image to be 9" high, you now know its vertical resolution:
9 X 300 = 2700.
Do they really need an image at 300dpi? How do you know it's not 200?
As discussed earlier, the claim that they need an image at 300dpi is based on the possibly-false assumption that their output device actually needs image data at 300dpi. Chances are, this isn't the case. As image printing has evolved, the ability to print quality images from lower-resolution files has improved considerably. If your client is balking at the license fee, offer the image at a reduced resolution for a reduced fee. If you can convince them that a 200dpi file is likely to print as well as a 300dpi file but costs a third less, you may both come out ahead.
A toy company wants to use an image for the cover of a game box that's 30x30". They say they want an image at 300dpi. What do you give them?
Doing quick calculations: that's a total resolution of 9000x9000. You're in a pickle, because you can't deliver a digital image that big unless you originally shot it with a large-format camera. Neither 35mm film nor (current) digital cameras can produce a digital image that big. You might think you can use a very high-resolution drum scanner to read your 35mm film and get a high-res file, but that will only result in a lot of noise because the film's grain isn't that small. It'll read between the grains, yielding a digital image that's nearly useless. (Most 35mm film can't be scanned at higher than 4000dpi before yielding more noise than actual image data.)
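A quick Python sketch of why the numbers don't work for a 35mm original (the 4000dpi ceiling is the figure mentioned above; the frame size is approximate):

    # What the box needs versus what a 35mm scan can usefully deliver.
    needed_px = 30 * 300                     # 9000 pixels in each dimension
    film_w_in, film_h_in = 1.4, 0.9          # a 35mm frame, roughly
    max_useful_scan_dpi = 4000               # beyond this, mostly grain noise

    best_scan = (round(film_w_in * max_useful_scan_dpi),
                 round(film_h_in * max_useful_scan_dpi))
    print(needed_px, best_scan)              # 9000 vs. (5600, 3600) -- far short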
The "correct" thing to do in this case is to explain that the highest
resolution file you have is likely to be sufficient, and that they should
run a test print to check. If you can't get them to do that, or if the
test print still yields a coarse image due to pixilation, an alternative
is use software (like Photoshop or Genuine Fractals) to "interpolate
up" the image. That is, you can add pixels based on imaging algorithms
that can approximate what will keep the integrity of the picture while
increasing its overall resolution. There are limits to how far this
technique will work, but the good news is that if you're dealing with a
client where those exceptions would apply, they'll already know, and
know how to correct the problem on their own.
On the other hand, maybe not. I had one client who was so worried that the 300dpi file I gave her would be insufficient for a print job that she insisted it would only work with a 600dpi file. I tried to tell her that the image would work just fine given the final output size she needed, but I was unable to convince her. I finally just broke down and said I'd send her a new file. But because it was literally impossible to make such a high-res image, I merely brought the same image into Photoshop, changed the DPI value from 300 to 600 while leaving the image data itself alone, and sent it back to her. The image came out as expected, and she was perfectly happy with the results.
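For what it's worth, that "change the number, leave the pixels alone" trick is a one-liner in most imaging tools. Here's a minimal sketch using the Pillow library in Python (the file names are hypothetical):

    # Rewrite only the embedded DPI tag; the pixel dimensions are untouched.
    from PIL import Image

    img = Image.open("client_photo_300dpi.tif")          # hypothetical file name
    img.save("client_photo_600dpi.tif", dpi=(600, 600))  # same pixels, new DPI tag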
Summary
Obviously, you need to deliver an image that's right for the use. Giving too much data or too little only makes things harder for everyone. What's more, if the price of the license is based on image resolution (which is the convention today), it's in everyone's best interest to determine what the real DPI needs are so the final resolution can be determined. In practical terms, you're rarely going to convince a photo editor that they don't really need a 300dpi file, but you must at least establish what the final output size is before you can deliver anything.
Digital cameras have similar considerations because they also capture images at multiple resolutions, but because they have upper-bound limits, you may have limits on what you can license. Digital cameras don't measure images in terms of "dots per inch"; they just capture a fixed number of pixels for any given picture. While you can set your resolution to various sizes, you should treat your camera the way you'd scan a photo: capture the highest resolution possible to optimize your image quality. An eight-megapixel camera has a total resolution of about 3520x2344; that's not as much resolution as 35mm film scanned at its maximum setting, but it's sufficient for many commercial needs. It can also produce fine-art prints up to 14x20 inches, given sufficiently artful image-editing skills.
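To put that in the terms used throughout this article, here's the DPI you'd actually get from that camera at a 14x20-inch print size (a small sketch using the figures above):

    # Effective DPI of an 8-megapixel image printed at 14x20 inches.
    cam_long, cam_short = 3520, 2344
    print_long, print_short = 20, 14      # inches

    print(round(cam_long / print_long), round(cam_short / print_short))   # 176 167

That's well below 300dpi, yet more than enough for many printing devices, which is the whole point of asking what the device actually needs.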
As digital cameras evolve, resolutions will get bigger, making most of
these issues less of a concern. (The issue will never go away, but the
anxiety about it will diminish.)