by Ann » Tue Apr 04, 2023 6:58 am
zendae wrote: ↑Mon Apr 03, 2023 5:14 am
Thank you Ann. It almost sounds like a giant centrifuge of sorts - not working exactly the same way, but having similar results.
Well, we have a limited number of wavelength colors, though many shades. Does each element correspond to a color shade? Can we have 98 shades of color providing the elements are glowing? Do some (or many) never glow? Can we detect non-glowing ones and still ascribe a correct color to them or is that impossible?
For instance, in my a/v field, we cannot have a "brown" light gel expecting to throw "brown" light onto a stage, because isn't "brown" merely very low amplitude yellow? Shine a bright white light thru a brown-colored gel you kinda get yellow. (I am way more audio than visual so I know I'm guessing a bit here). But I am curious if we could possibly see a really colorful nebula if the time were taken to put the thing through lotsa frequencies. I hope I'm making sense here...
Zendae, you wrote:
Well, we have a limited number of wavelength colors, though many shades.
I'd say we have many many (visible) wavelengths, but our eyes, our cameras, our filters and our processing techniques limit what we can see.
There are huge numbers of wavelengths in the Sun's visible spectrum. It would have been possible, certainly theoretically possible, to create an even more detailed solar spectrum showing even more (visible) wavelengths of the Sun than are seen in the picture above.
In the spectrum, you can see that there are huge numbers of little dark lines showing up everywhere. These are absorption lines, corresponding to wavelengths that have been absorbed (blackened out) by elements in the outer solar atmosphere. However, it is possible (under certain circumstances) to make the elements that "stole" these wavelengths "give them back" again:
So you could, theoretically, have colored light glowing from all the positions of the black lines in the solar spectrum.
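For hydrogen, the positions of those absorption and emission lines can actually be computed from the Rydberg formula. A minimal sketch of hydrogen's visible (Balmer) lines — the same Hα line that makes emission nebulas glow red:

```python
# Wavelengths of hydrogen's visible (Balmer) lines from the Rydberg
# formula: 1/lambda = R_H * (1/2^2 - 1/n^2). These are the wavelengths
# that show up dark in the solar spectrum (absorption) and bright when
# the gas is made to glow (emission).

R_H = 1.0967758e7  # Rydberg constant for hydrogen, in 1/m

def balmer_wavelength_nm(n):
    """Wavelength of the n -> 2 transition, in nanometres."""
    inv_lambda = R_H * (1 / 2**2 - 1 / n**2)
    return 1e9 / inv_lambda

for n, name in [(3, "H-alpha"), (4, "H-beta"), (5, "H-gamma")]:
    print(f"{name}: {balmer_wavelength_nm(n):.1f} nm")
# H-alpha comes out near 656 nm (red), H-beta near 486 nm (blue-green),
# H-gamma near 434 nm (violet-blue)
```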
Let's talk about photography. When it comes to photography, there are so many factors that limit what can be seen. Since I am a non-photographer, you shouldn't ask me too many questions about it.
Nevertheless, I should be able to say something about the photographic film or filters used to detect color. Take a look at two pictures that I took yesterday with my cheap little mobile phone:
Left: the wood anemone (top) looks blue. How weird.
Right: the true-color white wood anemone.
The picture at left shows two flowers, a wood anemone and a yellow star of Bethlehem. The wood anemone looks blue. The picture at right shows what the wood anemone really looks like: It is white. But the camera in my mobile phone decided to make the wood anemone look blue in the first picture (where the flower is in the shade), and I had no say in the matter.
My mobile phone camera uses no filters, and the picture is taken "all at once". But you can also take filter images. That means that you photograph the same object through different filters that are sensitive to different wavelengths. Take a look at the Hubble Heritage "filter images" of galaxy NGC 4650A:
From these "filter images" you create a composite color picture. Here you can see some suggested images and the one that was actually chosen:
The filters used for the portrait of NGC 4650A were all wideband, i.e., all three filters were sensitive to a wide range of wavelengths centered on 814 nm, 606 nm and 450 nm, respectively. The final image depended on the combination of the sensitivity of the three filters at different wavelengths (and on the people working with the filter images and their abilities and aesthetic choices).
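The combining step itself is simple: each filter frame becomes one channel of an RGB image, with per-channel scale factors standing in for the aesthetic choices just described. A toy sketch with made-up arrays in place of the 814/606/450 nm frames:

```python
import numpy as np

# Toy sketch of a wideband RGB composite. The arrays are synthetic
# stand-ins for the three filter exposures; the scale factors are the
# kind of aesthetic choice the image processors make.

rng = np.random.default_rng(0)
ir814 = rng.random((4, 4))    # near-infrared frame -> red channel
vis606 = rng.random((4, 4))   # yellow-orange frame -> green channel
blue450 = rng.random((4, 4))  # blue frame          -> blue channel

def compose_rgb(r, g, b, scales=(1.0, 1.0, 1.0)):
    """Stack three filter frames into an H x W x 3 image with values in 0..1."""
    rgb = np.stack([r * scales[0], g * scales[1], b * scales[2]], axis=-1)
    return np.clip(rgb, 0.0, 1.0)

img = compose_rgb(ir814, vis606, blue450, scales=(1.0, 0.9, 1.2))
print(img.shape)  # (4, 4, 3)
```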
But then there are narrowband images:
The iconic Pillars of Creation image from 1995 is Hubble's first narrowband image. The filters used were 502 nm ([O III]), for doubly ionized oxygen; 657 nm (Hα + [N II]), for ionized hydrogen and nitrogen; and 673 nm ([S II]), for ionized sulfur.
As you can see, ionized hydrogen, ionized nitrogen and ionized sulfur look identical to the human eye, and it is impossible to show the difference between these three ionized elements in either the "all at once" imagery of my mobile phone or in wideband filter images.
NASA solved the problem by assigning different (I would say false) colors to the narrowband filter images. The cyan-green color of 502 nm was mapped as blue. The red color of 657 nm was mapped as green. Voilà! Problem solved!
But the remaining and (to me) very serious problem is that with narrowband photography, you can choose whatever narrowband wavelength you want and map it to any color of your choice. There need be no correspondence between the wavelength detected by your camera and filter and the color shown in your final picture.
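To make that arbitrariness concrete, here is a small sketch (with made-up arrays and a hypothetical `map_narrowband` helper): the same three narrowband frames produce different pictures depending only on which filter is sent to which channel.

```python
import numpy as np

def map_narrowband(frames, assignment):
    """Build an H x W x 3 image from narrowband frames.
    frames: dict of filter name -> 2-D array.
    assignment: dict of channel 'R'/'G'/'B' -> filter name."""
    return np.stack([frames[assignment[c]] for c in "RGB"], axis=-1)

rng = np.random.default_rng(1)
frames = {"SII": rng.random((2, 2)),   # 673 nm, ionized sulfur
          "Ha": rng.random((2, 2)),    # 657 nm, hydrogen + nitrogen
          "OIII": rng.random((2, 2))}  # 502 nm, doubly ionized oxygen

# The classic "Hubble palette" assignment...
hubble = {"R": "SII", "G": "Ha", "B": "OIII"}
# ...and an equally "valid" reshuffling of the same data.
swapped = {"R": "Ha", "G": "OIII", "B": "SII"}

img1 = map_narrowband(frames, hubble)
img2 = map_narrowband(frames, swapped)
print(np.array_equal(img1, img2))  # False: same data, different colors
```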
To see the result of this, consider two versions of Hubble's narrowband portrait of planetary nebula NGC 6826:
So there is one yellow-green version of NGC 6826 and one blue one. But what does NGC 6826 really look like? What color is it really?
Maybe it is the cyan-green of 502 nm, the wavelength of doubly ionized oxygen. Many planetary nebulas are indeed dominated by doubly ionized oxygen, which is cyan-green in color. Not yellow-green. Not blue. Actually, narrowband photography often drives me crazy.
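You can even estimate the perceived color of a single wavelength directly. This is a crude piecewise approximation (after Dan Bruton's well-known fit, simplified here; not colorimetrically exact), and it does put 502 nm in cyan-green territory:

```python
# Rough wavelength (nm) -> (R, G, B) approximation, values 0..1.
# A simplified sketch of Dan Bruton's classic piecewise fit; no gamma
# or intensity falloff near the spectrum's ends is applied.

def wavelength_to_rgb(nm):
    if 380 <= nm < 440:
        return ((440 - nm) / 60, 0.0, 1.0)   # violet
    elif 440 <= nm < 490:
        return (0.0, (nm - 440) / 50, 1.0)   # blue to cyan
    elif 490 <= nm < 510:
        return (0.0, 1.0, (510 - nm) / 20)   # cyan-green
    elif 510 <= nm < 580:
        return ((nm - 510) / 70, 1.0, 0.0)   # green to yellow
    elif 580 <= nm < 645:
        return (1.0, (645 - nm) / 65, 0.0)   # orange
    elif 645 <= nm <= 780:
        return (1.0, 0.0, 0.0)               # red
    return (0.0, 0.0, 0.0)                   # outside the visible range

print(wavelength_to_rgb(502))  # full green plus some blue: cyan-green
print(wavelength_to_rgb(657))  # pure red, as for H-alpha
```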
By the way, what colors are the Pillars of Creation really in RGB (red, green, blue) filters? Maybe like this:
Finally, zendae, you asked about dust. Yes, the intrinsic color of cosmic dust is indeed dark reddish or dark brown, which in itself is a version of orange.
Okay, one more. Take a look at one of the absolutely most colorful regions of the sky, the Antares/Rho Ophiuchi region:
The picture shows bright red supergiant Antares, which is really yellow-orange in color. Antares is surrounded by a large yellow reflection nebula. At top is Rho Ophiuchi, a multiple star of spectral class B2, surrounded by a large blue reflection nebula. At right is Sigma Scorpii, a binary star of spectral class B1, surrounded by a reddish-pink hydrogen emission nebula, which is most prominent on the star's right (west) side. In the lower left corner, B0V-type star Tau Scorpii is just peeking out, surrounded by a large but dim red emission nebula. Dust lanes and dust bunnies permeate the image. The thicker the dust is, the darker it is. In places the dust is light brown, almost orange.
Oh, and by the way: Adam Block's picture of the Antares/Rho Ophiuchi region is certainly a "true-color" wideband RGB image! Possibly with the addition of a bit of Hα, to pick up more of the ionized red hydrogen emission.
Ann