We've snuck a new beta feature (beta means it still has some rough edges - you have been warned...) into both the Pro and LTI versions of TheSkyX with last week's posted Daily Build. Live Stacking!
What is live stacking, and why should you care? Well, if you're doing science imaging, surveys, automated data collection, and long exposure astrophotography, you can just move along, nothing to see here.
If you have ever had anyone watch you image, though... or tried imaging at an outreach event or public venue, then this is for you. And if you've never tried it, you should. A couple of reasons:
1. I do visual outreach all the time. Some people just cannot see an object in the eyepiece no matter what you do. A real bummer. Anyone can see a computer screen, TV monitor, or iPad display. This is great for events where people might have disabilities as well; people who use wheelchairs, for example, may be unable to reach the eyepiece. (I have yet to figure out a way to accommodate the vision-impaired, but if someone smarter than me comes up with something, PLEASE let me know!)
2. You can see SO MUCH MORE with a camera than with your eye. Please, for the love of all that is holy and celestial, stop showing the uninitiated public M31 in the parking lot of a Starbucks... maybe 1 in 50 people are impressed. To everyone other than you and a few hard-core passersby, it looks like snot on the eyepiece. Now... a few seconds with a camera... and MY OH MY how things change! Hey look, M51 looks like an actual galaxy! No... TWO galaxies! What about light pollution? Bin, with a light pollution or Ha filter; it works wonders... Remember, the goals of live stacking and outreach are vastly different from the typical dark sky adventure.
I'm a reasonably accomplished imager, and I can tell you ahead of time that a live stacked image is not going to meet my standards for a "pretty picture" in my gallery. But they ARE impressive, especially to the novice. A first image of a few seconds actually looks like the galaxy or nebula you are shooting. Only a few seconds later, another image is automatically aligned and combined, and it gets clearer. No one (in the public) wants to wait for a five-minute exposure, but if the image changes every 20 or 30 seconds, you can quickly draw a crowd. I tried this publicly for the first time at the Grand Canyon Star Party and it was a huge hit.
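For the curious, the "automatically aligned and combined" loop at the heart of live stacking can be sketched in a few lines. This is NOT TheSkyX's implementation — just a minimal illustration of the idea, assuming each incoming frame is already registered (a real stacker would first align frames by star matching or cross-correlation) and using a running mean so random noise averages down as frames accumulate:

```python
import numpy as np

class LiveStacker:
    """Minimal live-stacking sketch: incrementally average incoming frames.

    Assumes frames arrive already aligned. Random noise in the stack
    falls off roughly as 1/sqrt(N) with N frames combined.
    """

    def __init__(self):
        self.count = 0
        self.stack = None  # running mean, kept in float64

    def add_frame(self, frame):
        """Fold one new frame into the running mean and return the stack."""
        frame = np.asarray(frame, dtype=np.float64)
        self.count += 1
        if self.stack is None:
            self.stack = frame.copy()
        else:
            # Incremental mean: stack += (frame - stack) / n
            # avoids keeping every frame in memory.
            self.stack += (frame - self.stack) / self.count
        return self.stack
```

Each call to `add_frame` updates the on-screen image, which is why the view "gets clearer" every few seconds instead of making the crowd wait for one long exposure.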
Live Stacking does not save the individual FITS files (I will likely change this for the next update), but you can save a .SER archive file. SER files are the FITS of video: basically a stream of images in one container. More SER tools will likely be coming down the road, and LTI can already do "lucky imaging," creating 16-bit SER files with any supported camera plug-in. I've been working with ZWO, for starters, to get the frame rate up to be quite competitive for this purpose. I'm already getting better frame rates on my Mac than I used to get with another program I had to use on Windows for this.
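As background on why SER is so simple to work with: per the published SER v3 specification, a file is a fixed 178-byte header followed by raw frames back to back. Here is a hedged sketch of reading that header; the field layout is my reading of the spec, not TheSkyX or LTI code:

```python
import struct

def read_ser_header(data):
    """Parse the fixed 178-byte header of a SER video file.

    SER v3 layout: a 14-byte FileID ("LUCAM-RECORDER"), seven
    little-endian int32s (LuID, ColorID, LittleEndian flag, width,
    height, pixel depth per plane, frame count), three 40-byte ASCII
    fields (Observer, Instrument, Telescope), and two int64 timestamps.
    """
    if len(data) < 178:
        raise ValueError("SER header must be at least 178 bytes")
    file_id = data[0:14].decode("latin-1").rstrip("\x00 ")
    (lu_id, color_id, little_endian,
     width, height, depth, frames) = struct.unpack_from("<7i", data, 14)
    observer   = data[42:82].decode("latin-1").rstrip("\x00 ")
    instrument = data[82:122].decode("latin-1").rstrip("\x00 ")
    telescope  = data[122:162].decode("latin-1").rstrip("\x00 ")
    date_time, date_time_utc = struct.unpack_from("<2q", data, 162)
    return {
        "file_id": file_id, "color_id": color_id,
        "width": width, "height": height,
        "pixel_depth": depth, "frame_count": frames,
        "observer": observer, "instrument": instrument,
        "telescope": telescope,
    }
```

Because the frames after the header are just raw pixel data at a fixed size, tools can seek straight to any frame — which is what makes SER a good fit for lucky imaging and post-processing.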
So... BETA... I repeat ;-). Give it a go, though, and tell us what you think. There is more to come...
11-05-2018 8:10 AM