New to Startools - can you do better with this 53 hour O3 master light?

Questions and answers about processing in StarTools and how to accomplish certain tasks.
Post by admin »

Hi Ram,
ramdom wrote: Wed Sep 09, 2020 5:39 am since banding is the issue you're referring to...
I'm admittedly very confused now.
This is your ou4_O3.v0 file, correct?
[attachment: ou4_O3.v0.jpg]
You then say you remove "banding" from this, and end up with this?

http://ram.org/images/space/downloads/ou4_O3.v0.1.fit

I don't see any banding in any of the stacked datasets you have shared so far.

I'm sure you've done vastly more research into this particular issue than I have, since it has been plaguing you for a while. However, I'm not convinced that banding is something you would find in a final dataset that looks this way. If banding were present, it would either have been calibrated out by your darks/bias frames (in the case of a fixed pattern), or stacked out (in the case of a random pattern per frame, or of the same fixed pattern in all frames combined with dithering). It should not produce the localized uneven lighting observed in your datasets.
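To illustrate with a minimal numpy sketch (made-up numbers, not your data) of why both cases should disappear:

[code]
import numpy as np

rng = np.random.default_rng(0)
shape = (100, 100)
sky = np.full(shape, 50.0)  # "true" celestial signal, in ADU

# Case 1: a fixed pattern appears identically in lights and darks,
# so dark subtraction removes it, leaving only shot noise.
fixed = 5.0 * np.sin(np.arange(shape[0]) / 3.0)[:, None] * np.ones(shape)
light = sky + fixed + rng.normal(0.0, np.sqrt(sky))
print(np.std((light - fixed) - sky))  # ~ sqrt(50): shot noise only

# Case 2: a random per-frame pattern averages out across N stacked frames.
frames = [sky + rng.normal(0.0, 5.0, shape) for _ in range(64)]
print(np.std(np.mean(frames, axis=0) - sky))  # ~ 5 / sqrt(64)
[/code]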

Are we really talking about the same thing here? (i.e. banding = perfectly horizontal or vertical stripes/bars in your individual frames?)
You can also try it out: just run CBR with the defaults on the v0 image after it is rotated 180° and you'll see a remarkable change when you redo the STF. I'd say 80-90% of the banding goes away after CBR, and the remaining 10-20% is removed by DBE.
I don't own PixInsight; however, no tool of this kind will ever 100% repair bad calibration or lost signal. Looking at your source dataset and then at your DBE result - I'm going to be 100% honest here - I do not (cannot) trust that the faint background detail in your images is real after DBE. DBE (or ST's Wipe equivalent) is not a substitute for calibration when it comes to removing signal that undulates this fast; that signal was - I am 99% sure - never of celestial origin.
In the end, I'm not sure where this leaves us. Time is precious and it's easier to just work around issues.
On the contrary; constantly working around issues is vastly more time-consuming than solving the fundamental problem once. It is always better to address a cause than to repeatedly address symptoms. This problem is currently keeping you from achieving better results; you have hit a plateau. More data is not going to make your image better, as it is IMHO not calibrated well enough - the aberrant signal overwhelms any faint signal you might be adding.
However, I have to deal with the gear I have and the problems it generates and I really would've liked to see a difficult use case since what I've observed looking at other people's data is also that generating clean data is difficult (and this is probably my most difficult data set ever).

Achieving a "clean" (as in suitable for ST) dataset is not particularly hard. What is meant by "clean" in the context of StarTools, is simply well-calibrated and containing only Poissonian (shot) noise. The signal doesn't have to be noise free, just calibrated to be free of any other non-celestial influences (uneven lighting, dust, dead pixels, hot pixels, pattern noise), to the best of your abilities.
I did however take a look at the thread you pointed to on SGL, and the images done using ST are amazing - it is good to know that can be accomplished. But the conditions for that set are amazing too - they spent 100 hours on data collection and used 40 hours of it, throwing out 60 hours' worth! That's not a realistic proposition, right? We have to be able to do this without that effort - what would've happened had all 100 hours been used? But I do have other data sets of varying difficulty and cleanliness, and I can compare and contrast.
You can use virtually the same workflow/defaults on this short-exposure, imperfect DSLR dataset, which was also not stacked according to ST best practices (it's from this old video). You should be able to achieve something like this with the same basic workflow, using mostly defaults (not processed to taste, just mostly defaults where possible - 1.7 now even cleans up the walking noise in this dataset, caused by not dithering).

I can't quite remember exactly, but I believe this was 40 minutes of exposure time, acquired under light-polluted skies (a CLS filter was used) and shot with an old Canon 450D. Some sort of calibration was performed, but there are still dust donuts in the upper-left corner and some defective sensor columns. No dithering was performed either.

StarTools certainly does not require deep data to deliver its benefits. It just requires "honest" data (i.e. free of signal-introducing defects), to the best of one's abilities.
But as a favour, since you did put in the work: can you please send me the final version you described with "that's as far as I'm willing to push it" as a 32-bit FITS? I can then examine it properly. Thanks a lot!
I only saved the JPEG, but I'll redo it and upload the 16-bit TIFF here (ST is a post-processing application only - it does not export data, only images!). I will let you know when it's up.

Clear skies!
Ivo Jager
StarTools creator and astronomy enthusiast
Post by ramdom »

Hi Ivo, once again I appreciate you taking the time to talk through these issues with me. Yes, that's my v0 file, i.e., the corresponding FITS - this one, to be precise (http://ram.org/images/space/downloads/ou4_O3.v0.fit). And the v0.1 is indeed the file with both CBR (CanonBandingReduction) and DBE applied. If you take the v0 and apply CBR, you'll see the large scale banding disappear (which I believe is what you're referring to as the "calibration problem").

We're indeed talking about the horizontal stripes/bands, but normally these are short and close together, as in the first example below - many of my frames have them here and there. After stacking, however, I am left with basically three huge stripes/bands: the first, light band takes up roughly the top ~35% of the master, the second, dark band roughly the middle ~30%, and the third, light band the bottom ~35%. It's not small stripes/bands alternating back and forth; in the final stacked master there are only three big ones. Now, in individual frames the level of banding varies, and there they presumably look more like what you expect. But they are not consistent. Here's an example frame with a lot of small scale banding:

JPEG: http://ram.org/images/space/downloads/a ... _00003.jpg
Corresponding FITS: http://ram.org/images/space/downloads/a ... 00003.fits

Presumably this is what you are thinking of? For terminology's sake, let's call this "small scale banding" and the one I'm referring to "large scale banding". The problem is that the small scale banding varies from frame to frame. Most of my frames do not have it - they are clean. Frames that do have it show anywhere from 0% to 100% coverage (50% meaning half the frame is banded and the other half clean). But the large scale banding is present in ALL my frames. Here's the previous frame, which I consider reasonably clear of small scale banding but which still has the large scale banding (the large scale banding I'm referring to is more obvious here):

JPEG: http://ram.org/images/space/downloads/a ... _00002.jpg
Corresponding FITS: http://ram.org/images/space/downloads/a ... 00003.fits

Now, whether we call the large scale banding "banding" or something else, it is happening, and CBR fixes it. CBR seems to work on both small scale and large scale banding. This is what we're getting hung up on. If you step back from the screen and look at the 2.jpg file, you'll see the three stripes/bands: the top part is light, the middle part is dark, and the bottom part is light again.

All that said, that's the problem. I have applied CBR directly after calibration, before registration and integration, and that does work partially. But it still needs to be re-applied afterwards to fix the large scale banding completely. So perhaps there's an optimal setting that would fix it for all frames prior to integration, but finding it when dealing with 632 frames, each with a different variation on the banding, is not worth it IMO. The only calibration files I use are darks and flats at most, and most of my recent images - ones I'm satisfied with to a 95% level - don't use flats. Since I do collect flats for every setup, I have them and can apply them, but I don't see the need: I'm able to get equivalent outcomes without them. I can provide my darks and flats if you'd like to take a look. I've seen flowcharts of proper calibration, and while I have my own ideas about the need for flats, the minimal calibration workflow is basically some combination of darks and flats. I have calibrated once previously with dark flats, but it made no difference.

BTW, here's Jon Rista's take on banding as a form of noise: https://jonrista.com/the-astrophotograp ... on-part-2/ - he advocates the use of CBR to remove it.

--

:) It's easy to say "solve the problem at the source", but you can only beat your head against the wall so much; it may just be the $1000 camera, and there's always a trade-off between time spent and gain achieved (the workarounds I use are very fast - my initial post-processing only takes a few minutes). Some friends who are more experienced imagers than me have tried to help, and it has gotten nowhere. So workarounds are all I have ATM. It's also not clear how much better the best calibration would make things. There may be incremental improvements, but I've compared my images to those by others that I consider the absolute best, and I'm surprised how close I'm able to get. To me my images are far from perfect, and I can still see a lot of issues relative to the absolute best ones, but given my workarounds, calibration isn't where I see the problem. I agree that DBE may sacrifice some signal, but everything I'm getting in the end is consistent with what the best people are doing.

I am curious what you mean when you say "I do not (cannot) trust that the faint background detail in your images is real after DBE." Which detail are you specifically referring to (or are you speaking generally)? The way I ended up processing my image was to look at the best images I've seen and try to come close, so that "background detail" is actually masked out. But IMO, in some cases at least, some of the faint background detail is real. I've worked like this (masking out everything I consider unimportant) and then found OTHER images that showed the faint background detail, so I was wrong to mask it out. Now, I take your point - this is not the way to process images - but given that we're all repeatedly imaging the same targets, I don't see the issue.

--

Finally, I realise that by asking you to give me the FITS file I'm indirectly working around the ST demo version restriction (i.e., you're saving the file for me), so I'm happy to purchase a license. God knows I have so many software licenses that never get used, so ethically at least it'd be the right thing to do. Buying the license isn't the issue, as I'm sure it isn't for you either. I really would like to produce equal or better images with less effort.

I think I also have to try other data sets (I wish I had known about the SGL competition ahead of time; that would've motivated me), or some of my older, cleaner ones, and see what I can do with ST. Even the image you produced was not bad. Let's see what I can do with it in PI, and then perhaps we can see whether the same can be done in ST. We have to use workarounds at least for these data sets, since I have this problem in all my frames. And I've seen others complain about this sort of thing too, for all cameras with this chip, even from ZWO.

Thanks!

--Ram
Tubes: C925, SV70T, UC18, Tak FC100DF, FS128. Mounts: AVX; Paramount MyT. Glass: SV 0.8x, Tak 0.7x, 0.66x FR/FF. Cameras: QHY163M; QHY247C. Filters: Astrodon 5nm Ha, 3nm O3, S2. http://ram.org/ramblings/cosmos/equipment.html for a full list.
Post by admin »

ramdom wrote: Fri Sep 11, 2020 12:39 am ...If you take the v0 and apply CBR you'll see the large scale banding disappear (which I believe is what you're referring to as the "calibration problem").
Thank you Ram. The calibration problem I'm seeing in all your datasets so far is solely a problem with - presumably - your flats.
there are only three big ones.
These are not related to the banding; they are caused by something else. I can see the uneven illumination in this single frame in exactly the same spot as in your final stack. If it doesn't move, your flats (or bias frames, if that's where it originates) should be able to calibrate it out.

As a side note, the banding definitely looks like interference or a ground loop hum. Have you tried some test frames, powering everything from a battery for example?
The large scale banding is present in ALL my frames.
Perfect - if it is always in the same place (or was it coincidence that it was in the same place in your final stack and the single frame?), then you can calibrate it out! :)
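For reference, calibration is just per-pixel arithmetic, so anything that repeats identically in every frame ends up in the masters and subtracts or divides away. A minimal sketch (hypothetical filenames, and assuming the master flat was itself dark/bias-corrected):

[code]
import numpy as np
from astropy.io import fits

light = fits.getdata("light_00002.fits").astype(np.float64)
master_dark = fits.getdata("master_dark.fits").astype(np.float64)
master_flat = fits.getdata("master_flat.fits").astype(np.float64)

# Additive, repeatable structure (banding, amp glow, hot pixels) is
# removed by the dark; multiplicative structure (vignetting, dust)
# is removed by the normalized flat.
calibrated = (light - master_dark) / (master_flat / np.mean(master_flat))
[/code]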
CBR seems to work on both small scale and large scale banding. This is what we're getting hung up on. If you step back from the screen and take a look at the 2.jpg file, you'll see there are three stripes/bands. The top part is light, middle part is dark, bottom part is light again.
But I'm seeing the exact same thing in the 3.jpg image as well? (minus the small scale banding)
I don't see CBR doing anything to mitigate that (and indeed, it is not related to banding, nor would it fix something like that, going by how I understand the script to work).

I can, however, see CBR leaving small unevenness because, as people observe:
one thing i ran into with integrated images is that on multi-night images, the rotation angle could be slightly different for each night. this means the banding noise from each night is superposed at slight angles to the other nights, and it becomes impossible to remove completely
This of course introduces artefacts (and, besides the lack of flats, further makes the dataset less trustworthy when pushed hard).
mask out everything I consider unimportant
:( That's no longer signal processing, though? Ideally, you'd let your data speak for itself. Arbitrary manual manipulation is outside the scope of documentary photography. How do you accomplish that in PI, by the way? I know the author is very much against this sort of arbitrary manipulation (I'm personally not a fan of it either) and there are not many tools in PI that allow it... ST is more lenient in this regard, and I don't feel it is my place to tell people what they should do - whatever makes them happy. However, when I am presented with an image, I definitely would want to know whether it purposefully contains anything that isn't real, or omits something that should be there.
Finally, I realise that by asking you to give me the FITS file I'm indirectly working around the ST demo version restriction (i.e., you're saving the file for me) so I'm happy to purchase a license. God knows I have so many software licenses that never get used so ethically at least it'd be the right thing to do. Buying the license isn't the issue as I'm sure it isn't for you as well. I really would like to produce equal or better images with less effort.
:) Not needed at all Ram.

You can find the full TIFF here now:
https://download.startools.org/Tutorial ... ou4O3.tiff

As said, I don't trust the background at all and don't think any of the faint detail here is real. But at least you know that ST is able to preserve such detail when you do produce it.

Clear skies!
Post by ramdom »

Hi Ivo, that's what the people who tried to help me before guessed: that the problem was the flats (there was some thought it could be the darks too). But they couldn't help me calibrate it out. Either my flats are being done incorrectly or there's something else going on. My flats do remove the dust bunnies, but not that banding light gradient - and I checked: my flats don't appear to contain the gradient. Perhaps the exposure needs to be a minimum length for these gradients/bands to manifest. My darks do have it, though it is not as pronounced. But you're right that it is always in the same place, and it is present in ALL frames. Both 2 and 3 have it, and both 2 and 3 are uncalibrated FITS.

Here's 2, calibrated ONLY with CBR (default settings) and soft-stretched, nothing else - no dark subtraction, no DBE, etc. CBR works like a charm here:

http://ram.org/images/space/downloads/a ... r_only.jpg

The large scale banding is gone - the only thing left is the amp glow. So no, I don't agree that CBR doesn't help; in fact, I think it would solve the problem entirely with the right settings, but I've not bothered to find them since the combination of default CBR and DBE does the trick. My view is: if this is a form of pattern noise (per Jon Rista) and CBR removes it, why not? You're right that CBR can be destructive to signal, as Jon Rista notes, when doing repeated passes, but I find the first pass really does nothing except remove the large scale banding in my frames.

BTW, I could omit DBE from my processing and I'd have a near-identical result. If my flats and darks removed this large scale banding, no one would be happier than me. But my thinking is that even if I spent significant amounts of time and solved this problem, the end result wouldn't be much different from what I'm getting now. For instance, my darks are less than a year old, but they have been in use for several months, so perhaps I should redo them - but I've done that before and there was no observable change (so I tend to do darks once a year/season). When I subtract the darks, I don't check the "calibrate" option - my darks are identically matched. Yet after dark subtraction, I'm still left with some amp glow and banding.

As far as using a mask to highlight things, I don't see it as different from an ROI, except that my ROI is an ellipse. There's a script in PI called GAME that lets you create masks arbitrarily; I then look at OTHER images and use those as a guide. In this particular image, you can actually see a small O3 signal to the left of the main Squid. That is real. Yet when I did DBE, I had one of the DBE points right on it, not realising it existed. Fortunately in this case DBE didn't remove it (it is not that aggressive) or diminish it in any way that I can tell. As for the other faint O3 signal, some of it may be real, but it is so faint that it'd require a better scope/camera combination to suss it out. So, based on other people's images, only the Squid and the small signal on the left are real signal in the frame. In the end, everything I produce is consistent with what other people have observed/imaged.

Thanks a lot for the TIFF - I will play with it and see if my methods produce a better output. But I think going back to a simpler/cleaner data set will help more.

--Ram
Post by admin »

ramdom wrote: Sun Sep 13, 2020 12:29 am My darks do have it but it is not as pronounced.
Then that's where I would start.
The large scale banding is gone
I am so incredibly confused... Calling (or concluding) something banding just because an unrelated remedy called Canon Banding Reduction (somewhat) hides the issue is fallacious logic. By the same logic, you could now call your camera a Canon.

Regardless of what you want to call the issue, it is present in all your datasets and images. All CBR does is renormalize every horizontal line; it's like a limited, one-dimensional DBE. It doesn't solve the problem, it just hides it (a little) better.
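To make that concrete, the concept boils down to something like this (a sketch of the idea, not the actual script's code):

[code]
import numpy as np

def banding_reduction(img):
    # Shift every row so its median matches the global median. This
    # flattens horizontal stripes, but cannot repair 2-D gradients,
    # amp glow, or signal lost to miscalibration.
    row_medians = np.median(img, axis=1, keepdims=True)
    return img - row_medians + np.median(img)
[/code]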

Your dataset is still compromised by banding and an - unrelated - calibration problem, and stretching it shows that.
my thinking is that if I spent significant amounts and I solved this problem, the end result won't be that much different than what I am getting now.
Thinking that would be a grave mistake. :(
I don't want to use words that impart subjective judgement on your datasets, but they are not normal, nor indicative of what I see daily from comparable (or identical) gear.
As far as using a mask to highlight stuff, I don't see it as being different from a ROI except that my ROI is an ellipse.
If you mean the RoI in AutoDev (or Wipe), please don't confuse using a mask as a sampling area (input for processing the entire image) with arbitrary selective manipulation of pixels. Documentary photography ends where manipulations are performed selectively, based on outside information; at that point you are no longer documenting reality as observed, you are inventing or augmenting it.
There's a script in PI called GAME that lets you create masks arbitrarily and then I look at OTHER images and then use that as a guide. In this particular image, you can actually see there's a small O3 signal to the left of the main squid. That is real. Yet when I did DBE, I had it one of the DBE points right on it not realising it existed. Fortunately in this case, DBE didn't remove it (it is not that aggressive) or diminish it in any other way that I can tell. In terms of the other faint O3 signal, some of it may be real but it is so faint that it'd require a better scope/combination to suss it out. So only the Squid and the left side small signal is what is real signal in the frame based on other people's images. In the end, everything I produce is consistent with what other people have observed/imaged.
If processed in such a way, yet presented as "real", then I and most of your peers would want to know about it (and would respectfully disagree with you). I, for one, having seen the dataset(s), don't think aspects of your image are real - not even the image I produced for you (the background is too compromised). For example, as alluded to before, depicting the "squid" as a uniformly lit volume of gas is not commensurate with reality.

By all means - I am a firm believer in doing whatever makes you happy - but if documentary photography (astrophotography being a sub-branch of it) is of interest, then you may wish to consult further with your peers on 1. how to acquire a good dataset you can build on (e.g. getting to the bottom of the banding and calibration issues), and 2. what they would find acceptable in terms of selective image manipulation.

Clear skies,
Post by ramdom »

I'm calling it (large scale) banding because there are three large bands I can see - not just because CBR removes it (though that is corroborative). I could call it "striping", "odd lighting", whatever. I think it's purely a semantic point: we have to call it something, and since there are (roughly) three large bands and CBR removes them, we might as well call it banding. It doesn't really matter what we call it. So I'm not sure why there's confusion - we've identified the problem you describe, a light region, then a dark region, then a light region, as abnormal. Instead of using all those words, "banding" is easier, because I can see three bands - you should be able to as well if you step back a bit from the screen. There are posts/tutorials that discuss this kind of large scale banding using the same terminology. I understand you don't want to call it banding, but saying it's confusing is what I find confusing, since I clearly defined my terms.

Unfortunately, saying "start with the darks" doesn't help, because I've tried every possible way I could think of to get darks that I and others thought would help. I've even moved from SharpCap to EZCAP. Same result. The problem you've identified is something I'd love to see fixed, but having banged my head against it for hours and hours and gotten nowhere, unless someone has a promising idea I've not tried, I'm reduced to workarounds (which I've found is common in so many aspects of this hobby). From what I see, CBR fixes it and saves my $1000 camera from producing abnormal gradients. This particular problem/banding isn't present in my QHY247C with a pretty similar workflow (that's an OSC, but other than that); I'd say the only remaining solution, after spending countless hours on this, would be a new QHY163M (which I've even considered, but there's always other stuff to buy).

I used to say that if I could get to 90-95% of what I consider the best astro images, I'd be happy. I got into AP to combine data-driven and artistic approaches, with neither taking precedence (some of my images have, I think, quite unique colour choices). From my own perspective, I've achieved that - one of my images (one where I did go crazy on the colours) won an imaging contest, so IMO there's some independent corroboration (and the company it was in was very good).

But of course we're not doing AP of the same targets over and over for anyone but ourselves, and my approach is to do my best to aesthetically match the images I admire and then go beyond them by doing something creative/novel. I'm definitely not into making pretty images purely to depict reality as we know it; it's about taking that reality and adding some creative manipulation to it, and this aspect - and my desire to pursue it - is never hidden. My processing workflow and methods are usually detailed alongside all my images/web pages/posts/etc. I'm not hiding anything, from a complete perspective (i.e., on some sites I only upload the images, without even my name on them; on others I'm able to provide a lot of detail). There's nothing I'm doing beyond standard PI-based processing for the most part - almost all the images I've seen of the Squid *that I like/admire* have used masking.

As far as what is real, this gets back to the art aspect of astrophotography (something I stated early on, even in this thread: "without regard to things like SNR"). You then said AP is 50% science, 50% art. It is definitely my choice to push up the brightness in the Squid and then use LHE to increase contrast in some regions, but that's my choice of how to depict the Squid. Someone else did the same (again on a dataset of about 50 hours) and produced an image I like a lot, so I'm partly following their look and feel, and partly trying to differentiate my image, which is mostly the creative part. Prior to blowing up the brightness using masks, I have an intermediate image that perhaps looks like what you might see in others' images, but it is not what I like.

So if we go with doing art and astrophotography together, with the desire to introduce some creativity beyond what is "real", then we end up here. I definitely am not hiding this - from my very first posts I've been talking about doing art with AP. Think of all my final images as my view of what the object looks like, not as the product of a purely data-driven approach, which I think would be boring.

Interestingly, this gets at the subjective nature of this hobby: I don't consider my depiction of the Squid "uniformly lit" at all - I think there is a lot of variation if you look closely: more light, less light, etc. You can see the lines and bands seen in other, less bright images, and even my own initial image looks like what you produced, but without the background gradients. I'm doing this because I know the O3 will sit against the background of the Ha and S2 signal, and when people have done this their Squid gets "lost", which I don't find appealing. So I've chosen this approach to ensure it remains bright throughout, no matter what I do with its background. Nonetheless, this is one of my favourite depictions of the Squid. I also use my 12-year-old daughter's opinion to compare differently processed versions, and I usually pick what we both like.

Finally, when I first post a "final" image on the various fora, I do talk about the object being depicted and my processing approach, and if I went overboard with any of my choices, I make a point of mentioning it.

I enjoy this discussion, but it has strayed from the main point IMO, and I doubt we'll agree - I believe the 50% science/50% art split varies from image to image (and there's no 100% science or 100% art here). There's no easy fix for the large scale banding - or whatever we want to call it - that I can see, short of doing CBR and DBE, which seems to be accepted by others; I'm just following Jon Rista's tutorial for NR in PI. So I end up in this situation, and after that the processing I do involves a lot of art and so on, but that's beside the point.

--Ram
Post by ramdom »

Also you wrote:

"....banding and an - unrelated - calibration problem, and stretching it shows that..."

What exactly is the "- unrelated - calibration problem"? I mean what are you referring to exactly in the final image?

--Ram
Post by admin »

ramdom wrote: Mon Sep 14, 2020 1:16 am So I'm not sure why there's confusion
My confusion stems from the fact that you can replicate the uneven lighting in all your light frames; it is a constant, and can therefore be calibrated out. Yet you treat it as an unknowable quantity, like the true banding in your image (which is indeed unknowable, as it varies per frame). That is what I mean by an "unrelated calibration problem": it has nothing to do with banding (i.e. horizontal stripes that appear in random places at random intensities). They are two separate issues, with two separate causes, requiring two separate solutions.

You even say "my darks do have it but it is not as pronounced"; this means you have already isolated it and, thus, that it can be calibrated out. How do your dark flats look? It should show up in those too. How does your master flat look? It should show up in that too.
There's nothing I'm doing that's beyond standard PI based processing for the most part - almost all the images I've seen of the Squid *that I like/admire* have done the masking.
"Masking" just like "stretching" is obviously a very wide term and can mean many things; a mask is used in conjunction with some other operation.
Maybe I'm misunderstanding you, but as I understood your previous post, it sounded like you are manually creating a mask where you think signal is (still acceptable) and then selectively processing that area to make it stand out (not acceptable in documentary photography).

All the images of the Squid that I have seen show a non-uniform gas/emission distribution (an outline) commensurate with the typical astrophysical processes and history of such an object. It is not (and should not be) more luminous or "dense" in the middle; it appears you are projecting what you think the object should look like, rather than depicting what was actually recorded (and is representative of real life). If that is indeed the case, it is not documentary photography.
You then said AP is 50% science, 50% art.
data driven and artistic approaches
The art builds on the science - it does not replace it. They are not two separate approaches; one complements the other - that is, if you wish to maintain the underlying scientific integrity at all. If you do not, then that is certainly your prerogative, but I would feel quite cheated, even angry, if your image were chosen over mine in an astrophotography competition and it were later revealed that you introduced new signal at will, rather than enhancing what you actually recorded.
I enjoy this discussion but it has strayed from the main point IMO
The main point of our discussions is seeing how StarTools' approach to signal fidelity and retention is different (and beneficial) compared to other software you may have used in the past. The premise, however, is that your datasets are clean and well calibrated, and that signal - as recorded - is respected (per the do's and don'ts).
The main point is not to compare how well StarTools can work around issues in your dataset, nor how well it can be (ab)used to invent signal for you. StarTools can certainly do all those things, but that is not the purpose of the application, nor where it sets itself apart.
Post by ramdom »

Here is my master dark, in which the large scale banding is present. Until we both agree on a different term for the "unrelated calibration problem", this is what I'm calling it. There are indeed three large bands. The small scale banding I've also defined. I don't see a problem once I've defined these two terms. We could call it "XXXYYY" and it would be fine as long as we agree, but I don't understand the resistance to "large scale banding": the problem is three large bands of different illumination, so it's an accurate description, not to mention that CBR fixes it.

http://ram.org/images/space/downloads/m ... 7g_32o.fit

It is not present in my master flat which is here: http://ram.org/images/space/downloads/2 ... 14_56.fits

I've subtracted the master dark from each of my frames, but I'm still left with the large scale banding (as well as the amp glow). I've tried many things and it hasn't worked - who wouldn't want clean, perfect integrations? But I've tried a whole bunch of approaches to no avail, at least as far as this setup is concerned. Actually, this discussion has illuminated something for me: this problem is in my darks, so it is something very basic! It is not in my bias frames either (I did take those once). It only shows up with longer exposures (of darks or lights). I think I once did a test showing it gets worse the longer you expose (and it has also gotten worse over time), but again, subtracting one from the other at the individual-frame level isn't enough to remove it.

I don't think I've introduced new signal in the sense of actually painting something onto the image, and any funny stuff I do I'll fully explain when I post the final image. I've masked out the background and just used normal PI processing operations to brighten the Squid; it's really LocalHistogramEqualisation that increases illumination all over, with parameters I like (so within the Squid, which is where your concern is, I've not done specific masking, etc.). The mask is just an ellipse that covers the Squid (like an oval), and you can tell other imagers do this all the time - the GAME script was written with that in mind.

So yes, your understanding is correct: I'm using masks in the way you describe, as I know many APers do using the GAME script in PI, but my masking is very limited and not relevant here. I can get a very similar result without masks at all, and I can get similar results using range selection to create the masks. Many tutorials describe using masks in this manner. In the end, this masking is another irrelevant issue, since the people who've imaged the Squid, whose images I like, insert one image into the other; so what I did to the background wouldn't have mattered in this case, since I'm going to take out the stars, keep just the nebula and the left-side signal, and insert the RGB star background, as others have done, including the first example below. There's a nice tutorial on a forum on how to do this, and someone did it with one of my images - took a tricolour HP image and mixed it with a bicolour image to illustrate how it works (which I don't think would be possible without masks, not to mention cutting and pasting - which is still AP as far as I'm concerned).

I'm surprised you've not seen images of the Squid highly/evenly illuminated, but here are a few (they're easy to find) whose aesthetic I like and have been using as a *rough* guide (the "*rough*" part is important - I'm trying to create my own art here and distinguish it from theirs too). The one by Jim Lindelien isn't highly illuminated in the Squid, but it is part of the mix of images I like, mainly for the background areas:

https://www.astrobin.com/7wcspr/ - this is one of my favourites so far - another 50+ hour integration
https://telescopius.com/pictures/view/1 ... ul_c_swift
https://telescopius.com/pictures/view/6 ... by-venzibo
https://telescopius.com/pictures/view/1 ... rry_wilson
https://telescopius.com/pictures/view/6 ... -lindelien

It is indeed a full-on artistic choice to make the image as bright as I did, so that's on me, but you should be able to discern the lines and dark patches and such if you look closely enough, which does correspond to what others have produced. And if you look at the other examples I point to, there's blue all over the Squid in the first image - I don't think what I have is that different from what he has. I have my reasons and motivations for doing this, as I've explained, and I'm satisfied with my choices in this regard.

I am not saying I'm doing documentary photography, BTW - that is NOT my goal at all. When I collect data, I'm doing AP observations; that data can be used for whatever purpose anyone wants. Once I start processing, including preprocessing, it is to render an aesthetically pleasing image with maximal signal and minimal noise *within my workflow*, as judged by me against dozens of other AP images of the same target.

As far as the WikiScience competition national finalist goes, it wasn't this image, and there's nothing I hid. It wasn't an AP competition per se, but a contest for "imagery about the sciences", and my image won in the general category against other AP images, some of which were really good. Given my goals of art plus science, I think that's not a bad achievement. I don't see what you getting angry or upset would achieve had you taken part, since this contest at least recognised that making these images involves subjective choices (I make really beautiful images of scientific objects known as protein structures, and it's the same there: at the pretty-picture stage anything goes that isn't intentionally devious, since the data is there for the true science to be observed). I don't see any signal left over in the background once I do CBR and DBE except what I have, so these are accepted operations in AP. I also freely make my data available (upon request), so you can examine it yourself to see what I did and didn't do in terms of art.

As far as calibration in general goes, your website's do's and don'ts say "as possible" a couple of times, and "best of your abilities". That's where we are: this is the best I've been able to do so far in terms of calibration.

I'll still say that even if I/we solved this large scale banding/calibration problem through some miracle, without buying a new camera, the resulting image wouldn't be that different from the CBR/DBE one, and any improvements would be marginal. But I would be happy to be proven wrong.

As for your last paragraph: then I'm sorry - when I posted this thread, I wanted to see if an expert ST user could make an image that *I subjectively* considered better, and/or better according to the metrics I agreed to (I understand the limitations of both the "I" and the "subjectively", but others in this community have done this to convince me to try something new/different, and I have done the same for others, i.e., done my best to pay it forward). If that is not possible with a dataset calibrated to the best of my ability, then I'm sorry to have wasted your time. Thanks, though - I appreciate it, and the chance to have another go at this large scale banding/calibration problem.

--Ram
Post by admin »

ramdom wrote: Mon Sep 14, 2020 3:56 am I've subtracted the master dark from each of my frames but I'm still left with the large scale banding (as well as amp glow)
The "banding" in the master dark matches, for example aug20_ou4_00002, extremely well;
[attachment: StarTools_208.jpg]
I cannot see residual "banding" after subtraction;
[attachment: aug20_ou4_00002_bin25_mul625_substretch.jpg]
The rest of the unevenness looks like typical issues that will be corrected by the flats.

Looking at the master dark, it appears the "banding" is simply thermal noise, caused by the electronics surrounding the sensor.
It is also a near perfect match for the residual "banding" in the final stack.

I'm wondering if it is maybe an (analog) gain issue; the actual residual celestial signal in the individual frames seems extremely low for the exposure times quoted. I have to multiply the signal 625x to even start showing the background (you may remember my initial question about "adding" signal, as the signal is bunched up in the very lowest part of the dynamic range in the stack). Again, this is quite unexpected and something you will want to investigate if you cannot readily explain it by some exotic stacking setting. E.g. as it stands, going by your individual frames and stacks, you could comfortably increase your exposure times (or gain) by a factor of 100x.

My theory right now is that your gain is set extremely low, so any celestial signal coming in is throttled, while thermal noise and amp glow (which bypass gain) accumulate much faster in comparison, making up much of the final signal. Whether this scenario is plausible depends very much on the hardware and sensor - I know too little about the specifics.
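If you want to sanity-check the gain/exposure theory, you can measure where the background sits in the dynamic range (a sketch, assuming a 16-bit ADU full scale - adjust for your camera):

[code]
import numpy as np
from astropy.io import fits

data = fits.getdata("ou4_O3.v0.fit").astype(np.float64)
background = np.median(data)
full_scale = 65535.0  # assumed 16-bit range

print(f"background at {100.0 * background / full_scale:.3f}% of full scale")
print(f"99.9th percentile: {np.percentile(data, 99.9):.0f} ADU")
# A background sitting in the bottom fraction of a percent of the range
# suggests exposure/gain headroom of roughly two orders of magnitude.
[/code]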
I did a test once where I showed it gets worse the longer you go (it has also gotten worse with time) but again subtracting one from the other on an individual level isn't enough to remove it.
That's generally how darks work; they are dependent on exposure time and temperature.
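(As a rule of thumb, accumulated dark signal is roughly dark current rate x exposure time, and the rate itself roughly doubles for every ~6 °C rise in sensor temperature - hence darks must match the lights in both exposure and temperature.)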
I'm surprised you've not seen the images of Squid highly/evenly illuminated
None of these show the center of the lobes brighter than the edges, though? (Perhaps the Venzibo one comes closest.) I hope you understand the astrophysical processes that make it so. In general, seeing selectively processed/enhanced images makes me (and - I know - especially Juan Conejero) a little sad, and I'm still hoping I'm just misunderstanding - let's keep it at that.
...I'm sorry to have wasted your time.
Not at all!
Thanks though - I appreciate it and having another chance to beat on this large scale banding/calibration problem.
Do let me know if the above (gain?) yields any clues!

Clear skies!