The Milky Way Over Ontario (APOD 29 Jul 2008)
- emc
- Equine Locutionist
- Posts: 1307
- Joined: Tue Jul 17, 2007 12:15 pm
- AKA: Bear
- Location: Ed’s World
- Contact:
This beautiful capture would be an awesome setting for a bountiful backyard birthday barbeque… pass the meatballs please!
- Pete
- Science Officer
- Posts: 145
- Joined: Sun Jan 01, 2006 8:46 pm
- AKA: Long John LeBone
- Location: Toronto, ON
Amazing photo.
Ontario's a big place; on her website, the photographer specifies the location as Binbrook (southern ON, near Hamilton).
I need to get away from city skies...
- Chris Peterson
- Abominable Snowman
- Posts: 18595
- Joined: Wed Jan 31, 2007 11:13 pm
- Location: Guffey, Colorado, USA
- Contact:
BMAONE23 wrote: Anyone able to try this? Telescope with tracking mount - eyepiece connected to CCD imager (w/remote) - hardwired to PC monitor or laptop. Sit back and watch the show on the big screen. :D

Sure, lots of people do it. You can use an ordinary astronomical CCD camera, or an integrating video camera. Either way, it only takes a few seconds of integration time to bring out much more than the eye can see, with or without optics. That's nearly real time. Of course, if you collect light even longer, the detail becomes incredible.
Chris
*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com
*****************************************
Which camera?
Chris Peterson wrote: You can use an ordinary astronomical CCD camera, or an integrating video camera. Either way, it only takes a few seconds integration time to bring out much more than the eye can see, with or without optics.

Could you give me some more details, please? Which brand of astronomical CCD camera? I tried a 15-second exposure (the maximum) with a 6-megapixel Canon PowerShot: not particularly many stars, just about as much as the eye can see. Not yet connected: a Philips PCVC 750K webcam with a removable lens. In your opinion (tick as appropriate):
- Junk?
- Nice for a church during holidays?
- Good stuff?
Regards,
Henk
21 cm: the universal wavelength of hydrogen
- Chris Peterson
- Abominable Snowman
- Posts: 18595
- Joined: Wed Jan 31, 2007 11:13 pm
- Location: Guffey, Colorado, USA
- Contact:
Re: Which camera?
henk21cm wrote: Could you give me some more details, please? Which brand of astronomical CCD camera? I tried a 15-second exposure (the maximum) with a 6-megapixel Canon PowerShot: not particularly many stars, just about as much as the eye can see. Not yet connected: a Philips PCVC 750K webcam with a removable lens. In your opinion (tick as appropriate):
- Junk?
- Nice for a church during holidays?
- Good stuff?

With a 15-second exposure using the PowerShot, you should see quite a few more stars than you can with your eye. Digicams can have problems focusing on the sky, however, and it only takes a tiny focus error to significantly reduce sensitivity.
The webcam is potentially very capable, depending on the software you use. For deep sky objects, you want to collect and stack many frames, and you want each frame to be exposed as long as the webcam will allow.
Integrating video cameras, such as the StellaCam or MallinCam, can expose internally for several seconds, while still maintaining a standard video output suitable for connecting to a video monitor. These provide a very simple, and reasonably cost effective way to do near real-time video imaging.
Conventional astronomical CCD cameras, like those made by SBIG, can do something similar, by continuously shooting and stacking exposures that are several seconds long (or longer). These cameras are significantly more sensitive than any of the video options, and require a computer, but provide by far the best results.
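For readers who want to see the stacking arithmetic in action, here is a minimal Python sketch. The frame count, signal level, and noise level are invented for illustration; it only demonstrates the statistics Chris describes, not any particular camera.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate a faint source: 5 counts of signal per frame, buried in
# Gaussian noise of sigma = 10 counts per frame.
signal, sigma, n_frames = 5.0, 10.0, 100

frames = signal + rng.normal(0.0, sigma, size=n_frames)

single_snr = signal / sigma                          # one frame: S/N = 0.5
stacked = frames.mean()                              # averaging N frames...
stacked_snr = signal / (sigma / np.sqrt(n_frames))   # ...cuts noise by sqrt(N)

print(f"single-frame S/N : {single_snr:.2f}")
print(f"stacked S/N      : {stacked_snr:.2f}")
```

Averaging 100 frames reduces the noise by a factor of 10, so a source invisible in any single frame becomes a clear detection in the stack.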
Chris
*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com
*****************************************
Re: Which camera?
G'day Chris,
you wrote: The webcam is potentially very capable, depending on the software you use. For deep sky objects, you want to collect and stack many frames, and you want each frame to be exposed as long as the webcam will allow.

Philips packs some additional software with the cam, but it is not very versatile. I'm still looking for an API for that camera, since then I can write software of my own. The same goes for the Canon camera.

You triggered another question: stacking of frames. As far as I understand, stacking frames reduces noise while the subject is enhanced: noise increases with the square root of the number of frames, the subject with the number of frames. Popular software like Registax uses this method successfully. From what you write I get the idea (possibly my error) that the sensitivity increases as well. Is that my misconception, or is it true?

Chris wrote: Conventional astronomical CCD cameras, like those made by SBIG, can do something similar, by continuously shooting and stacking exposures that are several seconds long (or longer). These cameras are significantly more sensitive than any of the video options, and require a computer, but provide by far the best results.

Is their threshold (the level below which light is not noticed) lower? If you need a computer, an Ethernet connection would be fine, since working in the morning dew is not a suitable task for any PC.

Do you prefer to read the raw (not yet de-Bayered) images, or do you just read the JPEGs? Is there a trick, instead of the Bayer algorithm, to enhance the image by stacking the greens, reds, and blues into one grayscale 'intensity' pixel?
Regards,
Henk
21 cm: the universal wavelength of hydrogen
Re: Which camera?
henk21cm wrote: stacking of frames.

One could stack frames in one of several ways. AFAIK, one could average the frames (mainly noise reduction, e.g. in planetary photography), add up the energy values (to get a 'deeper' image of combined exposure time, for faint objects such as nebulae), or use a combination of both.
Something like this:
- Chris Peterson
- Abominable Snowman
- Posts: 18595
- Joined: Wed Jan 31, 2007 11:13 pm
- Location: Guffey, Colorado, USA
- Contact:
Re: Which camera?
henk21cm wrote: As far as I understand, stacking frames reduces noise while the subject is enhanced: noise increases with the square root of the number of frames, the subject with the number of frames. From what you write I get the idea (possibly my error) that the sensitivity increases as well. Is that my misconception, or is it true?

Usually, the most meaningful way to define sensitivity is in terms of signal-to-noise. A low noise camera might collect half as much signal as a high noise camera (that is, the low noise camera is less sensitive to light), but it will still show more detail. While cameras are often rated in terms of quantum efficiency (QE), the percentage of incident photons they record, S/N is far more important. With video cameras, nearly all have about the same QE, but some are much more sensitive, simply because they control noise better.

When you stack, you boost the S/N, and therefore you boost the sensitivity. In the stacked image, you will be able to detect fainter features.

henk21cm wrote: Is their threshold (the level below which light is not noticed) lower?

Generally, electronic image sensors have no lower limit. They simply record a certain percentage of incident photons, typically 30-50% for consumer type devices. The longer you collect, the more signal you have. The minimum signal that you can measure is determined by the noise. It is common to consider a S/N of 3 to mark the threshold of detection.

henk21cm wrote: Do you prefer to read the raw (not yet de-Bayered) images, or do you just read the JPEGs? Is there a trick to enhance the image by stacking the greens, reds, and blues into one grayscale 'intensity' pixel?

I prefer to avoid color in general. The only color camera I use is a webcam, and I save its individual frames already de-Bayered, but without compression (or losslessly compressed). In some cases I do convert to grayscale by weighting and summing the individual color pixels. This provides a boost in S/N at the expense of the color information. There are some advantages to working with raw frames from the camera, but this is often not possible, or may require specially modified hardware or software.
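The "weighting and summing" of color pixels that Chris mentions can be sketched in a few lines of Python. The BT.601 luminance weights below are an assumption for illustration; any set of weights summing to one works.

```python
import numpy as np

# Hypothetical 2x2 RGB frame (already de-Bayered), values 0-255.
rgb = np.array([[[200, 100,  50], [ 10,  20,  30]],
                [[  0, 255,   0], [255, 255, 255]]], dtype=np.float64)

# Standard ITU-R BT.601 luminance weights (illustrative choice).
weights = np.array([0.299, 0.587, 0.114])

# Weighted sum over the last (color) axis: one intensity per pixel.
gray = rgb @ weights

print(gray)
```

Collapsing three noisy color samples into one intensity value is what buys the S/N improvement, at the cost of the color information.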
Chris
*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com
*****************************************
- Chris Peterson
- Abominable Snowman
- Posts: 18595
- Joined: Wed Jan 31, 2007 11:13 pm
- Location: Guffey, Colorado, USA
- Contact:
Re: Which camera?
Case wrote: AFAIK, one could average the frames (mainly noise reduction in e.g. planetary photography), one could add up the energy values (to get a 'deeper' image of combined exposure time, for faint objects e.g. nebula) or a combination of both.

In practice, there is seldom a difference between the two, unless you restrict the depth of the workspace, that is, unless you convert to something like 16-bit integers after the operation. If you don't limit your data depth, the S/N is exactly the same after averaging or summing. The actual values of individual pixels will be different, but since you normally stretch the data for display, either will look exactly the same.
Chris
*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com
*****************************************
Re: Which camera?
Case wrote: One could stack frames in one of several ways. Something like this:

Nice visualization, Case. I'll focus on the last row of your image. White shades characteristically have a value of 192 and higher; let's assume 192. When you add up 10 images, the whites add up to 1920 or higher, while the image itself can only cope with 0-255. Instead of unsigned 8-bit integers you will have to switch to unsigned 16-bit integers. Then, when saving the image, you will have to renormalize it to the range 0-255. As a result you will have an image similar to the first row (adding and then dividing by n, hoc loco 10).

Apart from stretching the image (shift the lowest pixel value to 0, calculate the difference between the highest and lowest pixel values, and multiply by a factor until the highest shifted pixel value is again 255), there is no difference between the two methods.
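Henk's renormalization argument is easy to check numerically. A small sketch (the frame contents are random stand-ins; only the sum-versus-average comparison matters):

```python
import numpy as np

rng = np.random.default_rng(0)

# Ten simulated 8-bit frames of the same scene.
frames = rng.integers(0, 256, size=(10, 4, 4))

summed   = frames.sum(axis=0)    # values up to 2550: needs >8-bit storage
averaged = frames.mean(axis=0)   # stays within 0-255

def stretch(img):
    """Shift the minimum to 0 and rescale so the maximum becomes 255."""
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    return (img - lo) * 255.0 / (hi - lo)

# Summing is just averaging scaled by n, and stretch() removes any overall
# scale and offset, so the two stretched images come out identical.
print(np.allclose(stretch(summed), stretch(averaged)))
```

This is the same point Chris makes: as long as the workspace is deep enough to hold the sum, averaging and summing differ only by a constant factor that the display stretch removes.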
Regards,
Henk
21 cm: the universal wavelength of hydrogen
Re: Which camera?
Chris Peterson wrote: Usually, the most meaningful way to define sensitivity is in terms of signal-to-noise. A low noise camera might collect half as much signal as a high noise camera (that is, the low noise camera is less sensitive to light), but it will still show more detail.

Earlier this season, when the sky was clear (2007-11-17), I did a 15-second exposure on the Pleiades. The image is rather noisy, so I agree: noise is definitely an important factor.

Chris Peterson wrote: When you stack, you boost the S/N, and therefore you boost the sensitivity. In the stacked image, you will be able to detect fainter features.

With that in mind, the next time I see the Pleiades I'll take a lot of images and stack them.

Chris Peterson wrote: Generally, electronic image sensors have no lower limit. They simply record a certain percentage of incident photons, typically 30-50% for consumer type devices. The longer you collect, the more signal you have. The minimum signal that you can measure is determined by the noise. It is common to consider a S/N of 3 to mark the threshold of detection.

A CCD is a charge-coupled device. Incident photons rip off electrons, and as a result one plate of the capacitor gets a tiny electric charge. Since the capacitor is etched on a semiconductor surface, it is not an ideal capacitor: it leaks. If the exponential decay time of the RC network is short compared to the exposure time, the charge dissipates. A large charge dissipates as fast as a small one, but since an exponential decay never reaches zero, the remainder of a large charge might still be just detectable. That was my 'picture' so far. From what you write, I get the idea that thermal excitation is far more important than leakage current and leakage time.

And about throwing away the color you wrote: This provides a boost in S/N at the expense of the color information. There are some advantages to working with raw frames from the camera, but this is often not possible, or may require specially modified hardware or software.

There is an article on the CodeProject website on a wrapper class for the Canon API. I'll dig into that article and see whether I can get raw images working. The webcam is next.
Regards,
Henk
21 cm: the universal wavelength of hydrogen
- Chris Peterson
- Abominable Snowman
- Posts: 18595
- Joined: Wed Jan 31, 2007 11:13 pm
- Location: Guffey, Colorado, USA
- Contact:
Re: Which camera?
henk21cm wrote: A CCD is a charge coupled device. Incident photons rip off electrons and as a result one plate of the capacitor gets a tiny electric charge. Since the capacitor is etched on a semiconductor surface, it is not an ideal capacitor: it leaks...

Some CMOS detectors store charge on capacitors. CCDs do not; they store charge in potential wells established by precise voltages on carefully placed electrode structures. Charge dissipation in CCDs is extremely low; I've worked with CCD cameras utilizing single exposures of several months, with no loss of stored charge. The problem is that thermal electrons can be created, and these are indistinguishable from signal electrons. Thermal electron generation is reduced by cooling the sensor. For extremely long exposures, such as I mentioned above, cryogenic cooling to LN2 temperatures is common; for shorter exposures of a few hours or less, thermoelectric cooling is typically used, with typical temperatures around -20°C. This cooling may reduce thermal noise to near zero, leaving only readout noise (minimized by using fewer individual exposures) and the inevitable statistical noise on the signal itself.
Chris
*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com
*****************************************
Re: Which camera?
Chris Peterson wrote: In practice, there is seldom a difference between the two [...] either will look exactly the same.

henk21cm wrote: [...] there is no difference between the two methods.

Thanks for the clarification, guys. That makes perfect sense.
Re: Which camera?
Chris Peterson wrote: Some CMOS detectors store charge on capacitors. CCDs do not, they store charge in potential wells established by precise voltages on carefully placed electrode structures.

It was the late seventies or early eighties when one of my educators told me about CCDs. In those days it was 'capacitors'; that is why I brought in the notion of capacitors. The Wikipedia article on CCDs must date from that time as well:

"Not all image sensors use CCD technology; for example, CMOS chips are also commercially available."

which confirms your statement. You can update the rest of the article in the wiki:

"An image is projected by a lens on the capacitor array (the photoactive region), causing each capacitor to accumulate an electric charge proportional to the light intensity at that location."

where the 30-year-old capacitors pop up again.

you fortunately wrote: Charge dissipation in CCDs is extremely low; I've worked with CCD cameras utilizing single exposures of several months, with no loss of stored charge. The problem is that thermal electrons can be created, and these are indistinguishable from signal electrons.

OK, losing charge is no problem within 15 s if you did not lose any charge after several months; thermal noise, as usual, is the culprit. With long exposures on ordinary photographic film, the Schwarzschild effect degraded the film's effective sensitivity. Is something similar the case with CCDs? I have never read anything about that for CCDs. Did you?
Next wacky idea that came up:
- Take 36 images of 15 seconds.
- Take 108 images of 5 seconds.
Both have the same cumulative exposure time. When the frames are added, the resulting two images must be the same; the S/N ratio of the latter must even be slightly better, by √3. Is that correct and congruent with your experience?
Regards,
Henk
21 cm: the universal wavelength of hydrogen
- Chris Peterson
- Abominable Snowman
- Posts: 18595
- Joined: Wed Jan 31, 2007 11:13 pm
- Location: Guffey, Colorado, USA
- Contact:
Re: Which camera?
henk21cm wrote: It was the late seventies or early eighties when one of my educators told me about CCDs. In those days it was 'capacitors'; that is why I brought in the notion of capacitors.

It's true that the pixel structure contains something like a capacitor for storing the initial charge. In most cases, this is still quite different in operation from a conventional capacitor, however. For instance, many CCDs allow you to apply a bias to the pixel that actually prevents photoelectrons from being created, basically an electronic shutter. I built a camera in the 1970s, using a first generation CCD. The devices have improved quite a lot since then!

henk21cm wrote: When doing long exposure photographs on ordinary photographic film, the Schwarzschild effect was degrading the film's effective sensitivity. Is something similar the case with CCDs?

There is no analog in CCDs to reciprocity failure in film.

henk21cm wrote: Next wacky idea came up:
- Take 36 images of 15 seconds.
- Take 108 images of 5 seconds.
Both have the same cumulative exposure time. When the frames are added, the resulting two images must be the same. Is that correct and congruent with your experiences?

As a rule, the example with fewer images will have better S/N. They both have the same signal, and they both have the same dark current signal (and therefore, dark current noise). But each time you read the camera, you inject readout noise. If you have R noise in a single exposure, you'll have 6R in 36 exposures, and about 10R in 108 exposures. That's why you always want to try for fewer, longer exposures. It's also why no video camera approaches a long exposure camera for imaging dim astronomical objects. For short exposures, readout noise simply swamps the signal.
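The 6R and 10R figures follow from adding independent readout noise in quadrature: N readouts contribute R·√N in total. A quick check (R is set to 1 arbitrary unit):

```python
import math

# One readout injects noise R; independent noise adds in quadrature,
# so N readouts contribute R * sqrt(N) in total.
R = 1.0  # readout noise of a single frame, arbitrary units

noise_36  = R * math.sqrt(36)    #  36 x 15 s  ->  exactly 6.0 R
noise_108 = R * math.sqrt(108)   # 108 x  5 s  ->  ~10.4 R (rounded to 10R above)

print(noise_36, noise_108)
```

Same total exposure, but three times as many readouts costs a factor of √3 in readout noise, which is exactly why fewer, longer exposures win.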
Chris
*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com
*****************************************
Re: Which camera?
Chris Peterson wrote: As a rule, the example with fewer images will have better S/N. But each time you read the camera, you inject readout noise.

Ah, the readout noise (thanks for teaching me a new concept) breaks the symmetry. Although the readout noise limits the applicability of my camera, it leaves the logic intact. The outcome of my wacky idea would have been anti-logical: "To do difficult things, like capturing dim objects, requires the most elaborate way to get there."

So there are four contributions:
S: the wanted signal
D: the dark current noise
R: the readout noise
T: the thermal noise.
About D you said that it is small, maybe negligible compared to T. The ratio between R and T is unknown; I suspect R/T is smaller than 1. Since R and T are independent contributions, you can add their respective contributions quadratically. I suppose the T contribution (thermal noise energy) is proportional to the exposure time, just like the wanted signal. Since R is independent of time, the fewer the frames, the smaller the R contribution. That leaves me no other choice than the maximum exposure time: 15 seconds.
Regards,
Henk
21 cm: the universal wavelength of hydrogen
- Chris Peterson
- Abominable Snowman
- Posts: 18595
- Joined: Wed Jan 31, 2007 11:13 pm
- Location: Guffey, Colorado, USA
- Contact:
Re: Which camera?
henk21cm wrote: So there are four contributions:
S: the wanted signal
D: the dark current noise
R: the readout noise
T: the thermal noise.

D and T are the same thing. The dark current shows as a steady increase in charge, related to temperature. This can be subtracted from the image, which is the main purpose of a dark frame. What can't be subtracted is the noise component of the dark current, nominally the square root of the dark current signal.
There is also noise associated with the signal itself. The statistical noise is just the square root of the signal, which is why you want as many photons as possible. When you make astronomical images, there is also signal from the sky background. This also has a noise component that can't be removed.
The noise sources aren't all characterized by Poisson or Gaussian behavior, but to a reasonable approximation you can simply add them quadratically to evaluate their total contribution.
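Putting that noise budget together in code: independent sources add quadratically, and the signal carries its own statistical (shot) noise equal to its square root. All the numbers below are invented for illustration, not measurements from any real camera:

```python
import math

# Hypothetical single-exposure noise budget, in electrons (e-):
signal     = 400.0                  # photoelectrons from the object
shot       = math.sqrt(signal)      # statistical noise on the signal: 20 e-
sky_noise  = 12.0                   # noise on the sky background
dark_noise = 5.0                    # square root of the dark current signal
read_noise = 9.0                    # injected once per readout

# Independent sources add in quadrature:
total_noise = math.sqrt(shot**2 + sky_noise**2 + dark_noise**2 + read_noise**2)
snr = signal / total_noise

print(f"total noise = {total_noise:.1f} e-, S/N = {snr:.1f}")
```

With these numbers the source clears the S/N ≥ 3 detection convention comfortably; scale the signal down and you can watch it sink below the threshold.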
Chris
*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com
*****************************************
Re: Which camera?
Chris Peterson wrote: The dark current shows as a steady increase in charge, related to temperature. This can be subtracted from the image, which is the main purpose of a dark frame.

Chris, back in town, I found your reply. I had read the words "dark frame" earlier and hadn't a clue what they meant. From your reply I understand there is an increase in pixel brightness, which I suppose is proportional to time (a steady increase). So if you take a 5-minute image, the next thing you do is put the cap on the lens, take a 5-minute image of complete darkness, without any subject, and subtract this dark image from your first 5-minute image. That reduces the 'background light' due to dark current. The dark current is caused by thermal excitations across the band gap (ΔE/k) in semiconductors.

Chris Peterson wrote: What can't be subtracted is the noise component of the dark current, nominally the square root of the dark current signal.

So there must be a systematic, evenly distributed part and a random part.

Chris Peterson wrote: When you make astronomical images, there is also signal from the sky background. This also has a noise component that can't be removed.

Rayleigh scattering, like the sun's light scattered by the atmosphere: red is scattered less than blue, so the sky is blue.
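The dark-frame procedure described above can be sketched with simulated data. The dark rate, noise level, and scene are all invented; the point is only that the systematic dark signal cancels while the random noise does not:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated scene: one faint star on an empty 4x4 field.
scene = np.zeros((4, 4))
scene[2, 2] = 50.0

dark_rate = 0.2     # dark current, e-/s per pixel (invented)
t = 300.0           # 5-minute exposure

# Light frame: scene + dark current + random noise.
light_frame = scene + dark_rate * t + rng.normal(0, 2, (4, 4))
# Dark frame: cap on the lens, same exposure time, same dark current.
dark_frame = dark_rate * t + rng.normal(0, 2, (4, 4))

# The systematic dark signal cancels exactly; the random noise of
# BOTH frames remains, and adds in quadrature.
calibrated = light_frame - dark_frame
print(calibrated.round(1))
```

Note the cost: subtracting the dark frame doubles the variance of the random component, which is why long-exposure imagers often average many dark frames into a low-noise master dark instead of using a single one.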
Regards,
Henk
21 cm: the universal wavelength of hydrogen
what about natural viewing?
All,
Are there places you can go to see deep details without light-gathering? How far out from the city do you have to go? What locales or countries (that are safe) are good for star-gazing, in case I'm ever on vacation somewhere cool?
I'm talking naked-eye stuff.
Lewis
- Chris Peterson
- Abominable Snowman
- Posts: 18595
- Joined: Wed Jan 31, 2007 11:13 pm
- Location: Guffey, Colorado, USA
- Contact:
Re: what about natural viewing?
Animation wrote: Are there places you can go to see deep details without light-gathering? How far out from the city do you have to go? What locales or countries (that are safe) are good for star-gazing, in case I'm ever on vacation somewhere cool?

My basic answer would be no. That's not to say you can't go to a good dark-sky site and be overwhelmed by the Milky Way. But you won't see color, and even with a telescope you won't approach the detail that even a cheap camera can manage with just a few seconds exposure. While bad skies can hide everything, the fundamental limitation is the sensitivity of the eye, not the sky.
If you want dark skies, you need to be at least 20 or 30 miles from any cities. Also, a surprising amount of light pollution is local. Just a few neighbors with bad outside lighting can ruin your sky as much as a giant city 10 miles away. A small town just a few miles away can be worse than a city 20 miles away.
There are many places with dark skies. In the U.S., every state has good sites. Vast areas west of the Mississippi are dark. And of course, many undeveloped countries have good dark sites. There isn't a big difference in darkness between the best sites in the world, and the best sites within 100 miles of wherever you live.
Chris
*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com
*****************************************
Re: what about natural viewing?
Chris Peterson wrote: If you want dark skies, you need to be at least 20 or 30 miles from any cities. Also, a surprising amount of light pollution is local. Just a few neighbors with bad outside lighting can ruin your sky as much as a giant city 10 miles away. A small town just a few miles away can be worse than a city 20 miles away.

Going to a higher elevation, such as a mountaintop far from the light pollution, also helps. I did that a few times and noticed that I could see more stars than I could in the lowland.
Gary
Fight ignorance!
- Chris Peterson
- Abominable Snowman
- Posts: 18595
- Joined: Wed Jan 31, 2007 11:13 pm
- Location: Guffey, Colorado, USA
- Contact:
Re: what about natural viewing?
starnut wrote: Going to a higher elevation, such as a mountaintop, far from the light pollution also helps. I did that a few times and noticed that I could see more stars than I could in the lowland.

Very true. I live at over 9000 feet elevation, and the skies here are somewhat darker than the models estimate. That's because there is less water vapor and fewer particulates to scatter light from the nearest city (40 miles away). I see a low light dome in that direction, but it doesn't spread to the surrounding sky.
Chris
*****************************************
Chris L Peterson
Cloudbait Observatory
https://www.cloudbait.com
*****************************************