by Chris Peterson » Sun Feb 11, 2024 6:05 pm
johnnydeep wrote: ↑Sun Feb 11, 2024 5:38 pm
Chris Peterson wrote: ↑Sun Feb 11, 2024 3:50 pm
johnnydeep wrote: ↑Mon Feb 05, 2024 7:41 pm
I believe HOS is Hydrogen, Oxygen, and Sulfur (filters?), and Foraxx is apparently a "palette" available in PixInsight. See this for example:
More generally, all images are output as a ratio of red, green, and blue because those are the native pixel colors on our displays. "True color" images are also collected through broad red, green, and blue filters, which approximately match our displays. So those are often called RGB images.
Science images (and also ones with a specific aesthetic) are often collected through narrowband (just a few nanometers wide) filters that are centered on the emission lines of specific elements. By far the most common filters (and largely the only ones used by amateurs) are H-alpha (the 656 nm hydrogen line that dominates the glow of ionized H II regions), [O III] (doubly ionized oxygen, 501 nm), and [S II] (singly ionized sulfur, 672 nm). The tricky bit here is that both the H and S filters pass deep red light, so there are different ways of deciding which colors they will be assigned to for display.
Common nomenclature is to list the filters in RGB order, so
HOO has H assigned to red, O assigned to green, O assigned to blue (which actually means O is displayed as cyan, G+B).
SHO has S assigned to red, H assigned to green, O assigned to blue. This is usually called the "Hubble palette".
HSO has H assigned to red, S assigned to green, O assigned to blue. Slightly more natural as hydrogen (red) is usually the dominant emission.
HOS has H assigned to red, O assigned to green, S assigned to blue. That's what is used with this image, and likely leads to the closest "true color" representation possible with these three filters.
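To make the mapping concrete, here's a minimal sketch (my own illustration, not anything from PixInsight) of how a palette is just an assignment of the narrowband frames to the R, G, B display channels, assuming h, o, and s are already calibrated, aligned, stretched grayscale frames:
[code]
# A palette is just an ordering of the narrowband frames into R, G, B.
import numpy as np

# Stand-ins for calibrated, aligned, stretched narrowband frames:
h = np.random.rand(100, 100)  # H-alpha (656 nm)
o = np.random.rand(100, 100)  # [O III] (501 nm)
s = np.random.rand(100, 100)  # [S II]  (672 nm)

# The palette name lists the filters in R, G, B order:
sho = np.dstack([s, h, o])  # "Hubble palette"
hso = np.dstack([h, s, o])
hos = np.dstack([h, o, s])  # closest to true color with these filters
hoo = np.dstack([h, o, o])  # O feeds both G and B, so it renders as cyan
[/code]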
Beyond this, there are different ways of balancing the weights of the RGB channels to get different visual appearances, and the Foraxx reference is to one of these techniques (but the raw data is still HOS).
And there are even more complex ways of doing things, where individual input channels are mixed across multiple output channels (as with the HOO case above). And sometimes broad and narrow filter data are combined, as well. Very often we'll see the RGB data used to create natural colored stars in an image that is otherwise narrowband (which usually produces weird star colors, like the magenta ones commonly seen in Hubble images).
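As a rough sketch of that cross-channel mixing (the weights here are arbitrary illustrations, not the actual Foraxx recipe, which adapts its weights to the data):
[code]
# Each display channel can be a weighted sum of the narrowband inputs.
import numpy as np

h = np.random.rand(100, 100)  # stand-in H-alpha frame
o = np.random.rand(100, 100)  # stand-in [O III] frame
s = np.random.rand(100, 100)  # stand-in [S II] frame

r = np.clip(0.9 * h + 0.1 * s, 0.0, 1.0)  # mostly H, a touch of S
g = np.clip(0.3 * h + 0.7 * o, 0.0, 1.0)  # H and O blended
b = o                                     # O alone
mixed = np.dstack([r, g, b])              # (100, 100, 3) RGB image
[/code]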
Thanks! There's a lot of info to unpack in those paragraphs. Much appreciated.
Now, a more basic question: I realized that I don't really know how a filter works. So, light is collected by some lens and mirror combination, and eventually the light/photons hit a CCD detector array. At what point does a filter come into play? I think I've seen amateur telescopes with filters in front of the objective lens of a refractor. But that clearly can't be how filters work on Hubble, since that would require massive filters and some large mechanical means of putting them in place. So, do the filters get interposed somehow right before the light hits the CCD array?
And finally, are filters just some sort of tinted film on the surface of (or embedded within) a thin piece of glass, or can they be changed electronically? (I'm thinking of those electrically controlled window-shading glass panels in high-end houses and office buildings.)
The only filters that people place at the telescope aperture are solar filters, because you have to attenuate the light before it gets focused or you'll melt stuff inside the telescope! And those are just aluminized glass or plastic that passes all wavelengths. Pretty much the same material that eclipse glasses are made of.
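A back-of-the-envelope calculation (my numbers, just to illustrate the point) shows why the attenuation has to happen up front:
[code]
# Sunlight delivers roughly 1000 W/m^2 at the ground. An unfiltered
# telescope focuses everything it collects into a tiny spot.
import math

aperture_m = 0.20  # a modest 20 cm telescope
collected_w = 1000 * math.pi * (aperture_m / 2) ** 2
print(f"power at focus: {collected_w:.0f} W")  # ~31 W into a pinpoint
# A typical ND 5 front filter passes only 10**-5 of the light,
# so the same telescope delivers ~0.3 mW instead.
[/code]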
Otherwise, filters are always in the optical path in front of the camera, usually as the last optics before the sensor (other than windows in the camera body itself and on the sensor chip). They tend to be quite expensive, so you want to use the smallest ones possible... normally just a little larger than the sensor diagonal.
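For example (typical numbers, not from the post above), sizing a filter for an APS-C sensor:
[code]
# The filter only needs to cover the sensor diagonal, plus a small
# margin for the converging light cone.
import math

width_mm, height_mm = 23.5, 15.7  # typical APS-C sensor dimensions
diagonal_mm = math.hypot(width_mm, height_mm)
print(f"sensor diagonal: {diagonal_mm:.1f} mm")  # ~28.3 mm
# A common 31 mm unmounted filter comfortably covers it; a 2" (48 mm)
# filter would work too, at a considerably higher price.
[/code]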
Broadband filters (for amateurs, that means RGB) have widths around 100 nm and are usually dye-based: basically, colored glass. They are relatively inexpensive. Narrowband filters are usually only 3-6 nm wide, and you can't do that with a colored dye. Those are interference filters, made by depositing many thin layers of optical material with different indices of refraction, spaced so that only one wavelength interferes constructively and the rest destructively. This is combined with a broader colored filter to isolate just the wavelength of interest and not multiples of it. You can imagine that the process involved in making such filters is tricky... and the prices reflect that. (A good one can easily set you back $300.)
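To give a feel for the scales involved (approximate material indices, my own illustration):
[code]
# In a quarter-wave dielectric stack, each layer's physical thickness is
# the target wavelength divided by 4n, so reflections from every
# interface add in phase at that one wavelength.
wavelength_nm = 656.3          # H-alpha
n_high, n_low = 2.35, 1.46     # approximate TiO2 / SiO2 indices

for name, n in [("high-index (TiO2)", n_high), ("low-index (SiO2)", n_low)]:
    thickness = wavelength_nm / (4 * n)
    print(f"{name} layer: {thickness:.1f} nm")
# high-index (TiO2) layer: 69.8 nm
# low-index (SiO2) layer: 112.4 nm
[/code]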
There are no electrically switchable filters like you describe, although some narrowband filters (in particular, those designed to look at solar emission) can be electrically tuned over a very narrow range. Those filters have bandwidths under 1 nm and cost thousands of dollars.