There are multiple ways to interpret a number as a speed: “65” in miles/hour is highway cruising speed, but “65” in knots on the highway is a speeding ticket, while “65” in kilometers/hour is barely 40 miles per hour. “65” in meters/second is a category-4 hurricane, and “65” in Mach is as fast as a meteor.
Clearly, when discussing speed, it's important to know not only the raw number, but also the scale in which to interpret it.
When it comes to representing color, the digital-image version of this kind of scale is its color space. A digital-image file is made up of raw numeric data — data that is not “color,” but numbers representing color. There are many different ways to represent color with numbers in an image file; if an application processing an image file doesn't know which scale was used in creating the image's raw numeric color data (and if it isn't able to guess correctly), it doesn't know how to properly recreate the color when printing or displaying the image. The result is an image with wrong colors, like a TV whose “tint” setting is way out of whack.
Color Data Properly Interpreted
Color Data Misinterpreted
So, a color space is a set of parameters that describe how to convert between numbers and real-world colors. Different color spaces mean different conversions: different real-world colors from the same number, or different numbers from the same real-world color. The technical details are discussed later in the article; for the moment, just consider them abstractly.
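One of those parameters is the transfer curve, which maps the encoded numbers to actual light levels. As a rough illustration of “different real-world colors from the same number,” the sketch below (function names are mine, not from any standard library) decodes the same raw byte value under two different curves: the piecewise sRGB curve, and the simple gamma-2.2 curve used, approximately, by some other color spaces. Note that the transfer curve is only one of a color space's parameters — the primaries and white point matter too, and usually matter more.

```python
def srgb_to_linear(c):
    """Decode an sRGB-encoded channel value (0.0-1.0) to linear light,
    using the piecewise sRGB transfer function."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(c):
    """Decode the same number under a plain gamma-2.2 curve instead."""
    return c ** 2.2

encoded = 128 / 255           # the same raw byte value from the file...
print(srgb_to_linear(encoded))    # ...decodes to ~0.2159 of full light
print(gamma22_to_linear(encoded)) # ...or to ~0.2195, depending on the scale
```

The difference here is small, but applied across millions of pixels and combined with different primaries, mismatched decoding produces visibly wrong color.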
To be clear, the concept of a color space is different from a file format. Many image file formats (such as JPEG, TIFF, DNG, etc.) can use different color spaces — different ways to encode the image color — for their raw numeric data. You can think of a file format as a standard for how to arrange image data (“such-and-such metadata is allowed, the image is composed of lines of pixels arranged this-and-that way, using such-and-such compression and interleaving....”), while a color space describes how to interpret the per-pixel data into light.
Using music files as an analogy, the “sound space” would be a set of parameters such as the minimum and maximum frequency that can be encoded, the range of amplitudes (volume) that can be encoded, and the mathematical parameters for how the smooth range of encodable real-world frequencies and amplitudes are converted to discrete, raw numeric data.
The most commonly-used color space is called “sRGB,” which has one overwhelmingly important characteristic: it's the most commonly-used color space. Just about every scanner and digital camera can produce images with sRGB-encoded color. Just about every image-handling device (like a printer) and color-aware application (like a photo editor) can handle images with sRGB-encoded color. In fact, it's the de facto default color space for image input and output of most devices and applications (at least those that understand color spaces). It's the official default color space for the World Wide Web.
The ubiquity of the sRGB color space is really quite convenient.
Because things that produce images (digital cameras, scanners, image-editing software, ....) are on the same color-space wavelength, so to speak, as many things that process or display images (printers, image-viewing software, ...), colors are generally encoded and decoded reasonably well. What you see is not only what you get, but what you're supposed to get.
So, in the face of this overwhelming ubiquity, why would anyone even bother using a different color space for a digital image? (That is, a different method to encode colors with numbers within the raw image data?)
For technical reasons discussed later in this article, the design of a color space necessarily involves tradeoffs among various aspects of image quality. For example, it might be surprising to learn that it's impossible for a color space to represent all the colors that a human eye can discern, and so some subtle shades must necessarily be omitted (if an image's true color is one of the omitted shades, the encoded color becomes one that's close — usually very, very close — to the true color). So, which shades are included and excluded is one aspect that makes a color space more or less appealing to certain users or artistic tastes.
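When an image's true color falls outside what a color space can encode, something nearby must be substituted. The sketch below (my own naive illustration, not how real color engines do it) shows the simplest possible policy: clamping each channel to the encodable range.

```python
def clip_to_gamut(rgb):
    """Naive per-channel gamut mapping: any component outside the
    encodable 0.0-1.0 range is clamped to the nearest in-gamut value.
    (Real color-managed workflows use smarter "rendering intents"
    that try to preserve the overall look, not just each channel.)"""
    return tuple(min(1.0, max(0.0, c)) for c in rgb)

# A highly saturated color, expressed relative to a small gamut, may
# need components outside [0, 1]; the encoded color is merely the
# closest one actually available in that color space.
print(clip_to_gamut((1.12, 0.50, -0.03)))  # -> (1.0, 0.5, 0.0)
```

This is why the choice of which shades a color space includes is a genuine tradeoff rather than a flaw: every space must draw the line somewhere.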
The next page of this article offers a few more technical details about color spaces and their relative merits, but the main point of this article is merely to introduce the concept of color spaces — ways to convert colors to and from raw numeric data — and why it's important for the photographer to know about them.
When a different color space is used to encode the raw image data, the resulting data is still just a bunch of raw numbers, so how does a printer or application know that it should consider it in the light of something other than sRGB (or whatever its default color space is)? The answer is usually found in the form of an embedded color profile.
A color profile is the aforementioned set of parameters that describe a color space, arranged in a standardized way so that they can be communicated along with an image. It can be embedded within a digital-image file as metadata, along the same lines as how the date and time are included with the image. This conveniently allows a printer, image-display software, or other color-aware device/application that receives the image file to know the particulars of the color space so that it can properly decode the colors. (If an image doesn't have an embedded color profile, most devices go ahead and decode the colors using the parameters of the sRGB color space.)
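To make “embedded as metadata” concrete: in a JPEG file, an ICC color profile travels inside one or more APP2 segments whose payload begins with the signature `ICC_PROFILE`. The sketch below is a simplified scanner I wrote for illustration — it assumes the whole profile fits in a single APP2 segment, which large profiles do not.

```python
def extract_icc_profile(jpeg_bytes):
    """Scan a JPEG's segments for an embedded ICC color profile.
    Simplified: assumes the profile fits in one APP2 segment."""
    sig = b"ICC_PROFILE\x00"
    i = 2  # skip the SOI marker (FF D8)
    while i + 4 <= len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            break  # not a marker; stop scanning
        marker = jpeg_bytes[i + 1]
        # segment length field counts itself (2 bytes) plus the payload
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        payload = jpeg_bytes[i + 4:i + 2 + length]
        if marker == 0xE2 and payload.startswith(sig):
            # the two bytes after the signature are chunk number/count
            return payload[len(sig) + 2:]
        i += 2 + length
    return None  # no profile found: most software will assume sRGB
```

A viewer that finds the profile can decode the colors correctly; one that ignores it (or a file where the profile was stripped) falls back to its default assumption, usually sRGB.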
Without the proper color profile associated with the color space used to create the image data, applications don't know how to decode the color data. This results in mixed-up or “off” colors, like the “color data misinterpreted” image shown above. That example, by the way, is not a simulation, but an image that looks perfectly normal when interpreted using the proper color space. To ensure that you'd see the wrong colors, I manually removed the embedded color profile from the image file, knowing that your browser would then misinterpret the color data.
You might be surprised to find out what happens when I do go ahead and include the proper color profile along with the image, as I do with the middle image here:
Correct or Wrong?
(For your reference, here is the color profile for the color space used for the center and right-side images.)
Does the middle image look okay or wrong? It depends on how colorimetrically enlightened your browser is:
Middle Image Looks Okay: your browser recognizes and respects an image's embedded color profile. Congratulations!
Middle Image Looks Wrong: your browser does not recognize or respect an image's embedded color profile. How unsociable!
And herein lies the problem: many browsers are not “color managed” applications, meaning that they do not know how to recognize or understand color profiles.
We'll talk much more about color management later, but first let's look at just how wrong the colors can be in a real-world situation. When creating the sample images shown above, I purposefully used a color space wildly different from sRGB — one that I made up just for this purpose — so that the “wrongness” would be exaggerated and obviously apparent at first glance.
The next page of this article shows a variety of photos encoded with real-world color spaces (color spaces named AdobeRGB, ColorMatch, ProPhoto, WideRGB, and AppleRGB) but without an embedded profile, allowing you to see real-world effects of misinterpreted color.
Continued on the Next Page
This article continues on Page 2: Test Images.