by jgabany » Tue Aug 14, 2007 12:37 am
ldecola wrote:I see lots of images in full color but wonder what the 'true' colors of the objects might be? I.e. what would they look like to me say if I were visiting them?
Hello ldecola:
This is a great question!
The human eye's retina contains two types of photoreceptors, called rods and cones. There are about 120 million rods compared to approximately 7 million cones. Rods are more sensitive to light, but only cones detect color. This is why we can make out the objects around us in dimly lit situations but cannot discern their hue. Visible light can be described in terms of three primary colors: red, green, and blue. Of these, the cones in our eyes are most sensitive to green, which makes some evolutionary sense if your ancestors' survival depended upon discerning plants.
Astronomical telescopes serve two essential purposes: 1) to resolve distant but closely spaced objects and 2) to collect a lot of light. Yet the amount of light collected by even the world's largest telescopes is still insufficient for the cones in our eyes to detect any color other than green in faint nebulae and galaxies. Therefore, the full color of distant astronomical places, other than stars and planets, still eludes direct observation. It should be noted, however, that there have been rare claims of seeing other colors by a few observers, who may simply have eyes with greater color sensitivity.
But film and digital cameras do not have this type of color bias. Film emulsion contains crystals sensitive to each of the three primary colors of light, and color digital cameras place microscopic red, green, or blue filters on top of their pixels. Manufacturers use various schemes to arrange these filters, but the point is that only a portion of the pixels in any color digital camera is dedicated to a given color. Even so, this enables cameras to detect color much more efficiently than human eyes. Digital astronomical cameras go one step further: they use every pixel for each color.
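To make that difference concrete, here is a minimal Python sketch (my own illustration, not any manufacturer's actual scheme) assuming a standard RGGB Bayer-style filter layout: a one-shot color sensor keeps only one color channel at each pixel, while a monochrome sensor behind a single filter records that channel at every pixel.
[code]
import numpy as np

def bayer_sample(scene_rgb):
    """Keep only the one color each pixel 'sees' through an RGGB filter mosaic."""
    h, w, _ = scene_rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = scene_rgb[0::2, 0::2, 0]   # red-filtered pixels
    mosaic[0::2, 1::2] = scene_rgb[0::2, 1::2, 1]   # green-filtered pixels
    mosaic[1::2, 0::2] = scene_rgb[1::2, 0::2, 1]   # green-filtered pixels
    mosaic[1::2, 1::2] = scene_rgb[1::2, 1::2, 2]   # blue-filtered pixels
    return mosaic

scene = np.random.default_rng(1).random((4, 4, 3))  # a tiny toy "scene"
print(bayer_sample(scene))   # one-shot color: each pixel records only one channel
print(scene[:, :, 0])        # mono camera + red filter: red recorded at every pixel
[/code]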
Cameras specifically designed for taking deep-space images are unsurpassed at detecting very faint light, but they only produce results in black and white. To capture color, astronomers, both professional and amateur, place a red, green, or blue filter in front of the camera so that every pixel is limited to detecting one specific color reflecting or shining from the astro-subject. This, by the way, is a very time-consuming process. To create a full-color picture, the astronomer then digitally combines the separate red, green, and blue images using commercially available software like Photoshop. Thus, the colors seen in deep-space images taken with a camera are very real and, unless mishandled during processing, they are also accurate.
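As a rough illustration of that final combining step, here is a short Python sketch. The random arrays below stand in for real calibrated, aligned exposures taken through each filter, and the simple linear stretch is purely an assumption; real processing in Photoshop or similar software involves much more careful calibration, alignment, and stretching.
[code]
import numpy as np

rng = np.random.default_rng(0)
red   = rng.random((512, 512))   # stand-in for the red-filtered exposure
green = rng.random((512, 512))   # stand-in for the green-filtered exposure
blue  = rng.random((512, 512))   # stand-in for the blue-filtered exposure

# Stack the three monochrome channels into a single height x width x 3 color frame.
rgb = np.dstack([red, green, blue])

# A crude linear stretch to the displayable 0..255 range.
rgb8 = np.clip(255 * rgb / rgb.max(), 0, 255).astype(np.uint8)
print(rgb8.shape, rgb8.dtype)    # (512, 512, 3) uint8
[/code]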
The picture on APOD today was produced using a total of over 8 hours of exposure, which allowed the camera's pixels to accumulate that much light. However, if you were to travel close to this, or to most astronomical places, the scene would look different. Most likely, all you would see are stars and perhaps hints of the nebulosity that has a teal hue in this picture. The rest would be essentially invisible, particularly the red-colored areas, because their material is very thin and our eyes are poorly sensitive to red light.
The image comparison that Case included with his post does a good job of approximating how this particular scene would appear to our eyes at close distance. The only difference, I would propose, is that any hint of the red color still visible in his right panel would actually appear much less prominent, and what could be seen would look greenish-gray.
Jay
[quote="ldecola"]I see lots of images in full color but wonder what the 'true' colors of the objects might be? I.e. what would they look like to me say if I were visiting them?[/quote]
Hello Idecola:
This is a great question!
The human eye's retina contains two types of photoreceptors called rods and cones. There are about 120 million rods compared to approximately 7 million cones. Rods are more sensitive to light but only cones detect color. This is why we can make out objects that surround us, in dimly lit situations, but we cannot discern their hue. Light is comprised of three primary colors, red, blue and green. Of these, the cones in our eyes are most sensitive to the later, which makes some evolutionary sense if your ancestor's survival was dependant upon discerning plants.
Astronomical telescopes are essentially used for two purposes: 1) to help separate distant but closely spaced objects and 2) to collect a lot of light. The amount of light collected by even the world's largest telescopes is still insufficient for the cones in our eyes to detect color in faint nebula and galaxies other than green. Therefore, the full color of distant astronomical places, other than stars and planets, is something that still eludes direct observation. It should be noted, however, that there have been some rare claims of seeing other colors by a few observers who may simply have eyes with more color sensitivity.
But film and digital cameras do not have this type of color bias. Film emulsion contains crystals that are sensitive to each of the three primary colors of light and color digital cameras place microscopic red, green or blue filters on top of their pixels. Manufacturers use various schemes to place these filters, it should be noted, but here's the point: only a portion of the pixels in any color digital camera are dedicated to one color. Regardless, this enables cameras to detect color much more efficiently than human eyes. Digital astronomical cameras go one-step further- they use every pixel for each color.
Cameras specifically designed for taking deep space images are unsurpassed for detecting very faint light but they only produce results in black and white. To create a full color picture, astronomers, both professional and amateur, place a red, green or blue filter in front of the camera so that every pixel is limited to detecting one specific color reflecting or shining from the astro-subject. This, by the way, is a very time consuming process. To create a full color picture, the astronomer digitally combines separate red, green and blues images using commercially available software like Photoshop. Thus, the colors seen in deep space objects taken through a camera are very real and, unless mis-handled during processing, they are also accurate.
The picture that is on APOD today was produced using a total of over 8 hours of exposure. This amount of time enabled the light to be accumulated by the camera's pixels. However, if you were to travel close to this, or most astronomical places, the scene would look different. Most likely all you would see are stars and maybe hints of the nebulosity that has a teal hue in this picture. The rest would be essentially invisible- particularly the red colored areas because their material is very thin and our eyes are not red-sensitive.
The image comparison that Case included with his post does a good job of approximating how this particular scene would appear to our eyes at close distance. The only difference, I would propose, is that all hint of the red color still visible in his right panel would actually appear much less prominent and that which could be seen would be colored greenish-gray.
Jay