by jgabany » Thu Jul 12, 2007 6:42 am
[quote="bystander"]It is my understanding that due to the long exposure times required, b&w imaging provides the best contrast and relative exposure values. See an explanation of the process:
[url]http://www.noao.edu/outreach/aop/glossary/lrgb.html[/url][/quote]
Hello bystander:
When you gaze toward a clear, moonless night sky, the stars appear as points of light - most are colorless. There are a few exceptions, however: Mars, Aldebaran and the star at the heart of the constellation Scorpius, Antares, can be seen to have a very slight reddish hue. Through a small telescope, star and planetary colors become more apparent, but galaxies and nebulae remain unpigmented and monochromatic. These objects begin to take on a greenish tinge when viewed through very large telescopes but rarely show the rainbow of hues seen in many deep space pictures.
This raises a question often asked of astrophotographers: are those the real colors or did you make them up?
The human eye's retina contains two types of [url=http://www.hhmi.org/senses/b110.html][b]photoreceptors[/b][/url] called rods and cones. There are about 120 million rods compared to approximately 7 million cones. Rods are more sensitive to light, but only cones detect color. This is why we can make out the objects that surround us in dimly lit situations but cannot discern their hue. Light is composed of [url=http://www.fas.harvard.edu/~scdiroff/lds/LightOptics/ColorMixing/ColorMixing06.jpg][b]three primary colors[/b][/url]: red, blue and green. Of these, the cones in our eyes are most sensitive to the latter, which makes some evolutionary sense if your ancestors' survival depended upon discerning plants (plus, our Sun's light peaks in green wavelengths).
Astronomical telescopes are essentially used for two purposes: 1) to help separate distant but closely spaced objects and 2) to collect a lot of light. The amount of light collected by even the world's largest telescopes is still insufficient for the cones in our eyes to detect any color other than green in faint nebulae and galaxies. Therefore, the full color of distant astronomical places, other than stars and planets, is something that still eludes direct observation. It should be noted, however, that there have been some rare claims of seeing other colors by a few observers who may simply have eyes with greater color sensitivity.
But film and digital cameras do not have this type of color bias. Film emulsion contains crystals that are sensitive to each of the three primary colors of light. Color digital cameras place microscopic red, green or blue [url=http://www.shortcourses.com/choosing/how/03.htm#From%20black%20and%20white%20to%20color][b]filters[/b][/url] on top of their grayscale-sensing pixels, then colorize the result inside the camera. Manufacturers use various schemes to arrange these filters, but here's the point: in any color digital camera, only a portion of the pixels is dedicated to each color. Regardless, this enables cameras to detect color much more efficiently than human eyes.
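To make the filter-arrangement point concrete, here is a small sketch of one common (but not universal) layout, the RGGB Bayer pattern - the 4x4 grid and the pattern itself are illustrative assumptions, not any particular manufacturer's scheme:

```python
import numpy as np

# Illustrative RGGB Bayer mosaic: each pixel sits behind exactly one
# color filter, so only a fraction of the pixels sample each primary;
# the camera interpolates the missing colors for every pixel.
h, w = 4, 4
pattern = np.empty((h, w), dtype='<U1')
pattern[0::2, 0::2] = 'R'   # red on even rows, even columns
pattern[0::2, 1::2] = 'G'   # green shares rows with red...
pattern[1::2, 0::2] = 'G'   # ...and rows with blue
pattern[1::2, 1::2] = 'B'   # blue on odd rows, odd columns

for color in 'RGB':
    print(color, np.mean(pattern == color))   # R 0.25, G 0.5, B 0.25
```

The doubled share of green pixels in this layout echoes the eye's own bias toward green described above.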
Digital [i]astronomical[/i] cameras go one step further: they use every pixel for each color.
Cameras specifically designed for taking deep space images are unsurpassed at detecting very faint light, but they only produce results in black and white. To create a full color picture, astronomers, both professional and amateur, place a red, green or blue filter in front of the camera so that every pixel is limited to detecting one specific color reflecting or shining from the astro-subject. This, by the way, is a very time-consuming process. The astronomer then digitally [url=http://hubblesite.org/sci.d.tech/behind_the_pictures/meaning_of_color/rgb.shtml][b]combines[/b][/url] the separate red, green and blue images using commercially available software like Photoshop. Thus, the colors seen in deep space objects taken through a camera are very real and, unless mishandled during processing, they are also accurate.
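The combine step above can be sketched in a few lines - the arrays here are made-up stand-ins for calibrated grayscale frames (real workflows also align and stretch the data before combining):

```python
import numpy as np

# Hypothetical grayscale frames, one per filter, values in 0..1.
red   = np.array([[0.9, 0.1], [0.2, 0.3]])
green = np.array([[0.1, 0.8], [0.2, 0.3]])
blue  = np.array([[0.1, 0.1], [0.9, 0.3]])

# Stack the three exposures along a last axis to get the
# (rows, cols, 3) layout that image software expects for RGB.
rgb = np.stack([red, green, blue], axis=-1)
print(rgb.shape)    # (2, 2, 3)
print(rgb[0, 0])    # strong in the red channel -> a reddish pixel
```

Each output pixel's hue simply reflects how brightly the subject shone through each filter, which is why the result counts as "real" color.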
Interestingly, false color images are also created by placing a filter in front of the astronomical camera. However, these filters are not broadly tuned to a primary color - they pass only the glow emitted by specific ionized atoms, usually wavelengths associated with a particular element such as hydrogen, sulfur or oxygen (these are just examples; there are many others). Astronomers assign a color to each of the filtered images produced with this technique, so that color can be used as a method of visualizing not only how the astronomical subject physically appears but [i]what[/i] it is chemically made of. For example, hydrogen atoms are often tinted green, sulfur is colored red and the hue for oxygen is blue. Of course, there is nothing to prevent different color assignments.
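That color-assignment step can be sketched the same way - the arrays are invented stand-ins for narrowband exposures, and the sulfur-red / hydrogen-green / oxygen-blue mapping is just the example palette described above:

```python
import numpy as np

# Made-up narrowband frames, one per emission line.
sulfur   = np.array([[0.2, 0.7]])
hydrogen = np.array([[0.9, 0.1]])
oxygen   = np.array([[0.1, 0.6]])

# The palette is an arbitrary choice: swapping the dictionary values
# re-colors the image without changing the underlying data.
palette = {'R': sulfur, 'G': hydrogen, 'B': oxygen}
false_color = np.stack([palette['R'], palette['G'], palette['B']], axis=-1)
print(false_color.shape)   # (1, 2, 3)
```

Here a hydrogen-bright region comes out green only because of the chosen mapping, which is exactly what makes the result "false" rather than true color.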
I apologize for the length of this post but it seems that there is a lot of curiosity about how astro images are produced and I thought this might help...
Jay