For example, say a bright star is slightly blue in colour. Without any sky glow, it would register as RGB 0.8, 0.8, 0.95. Sky glow adds a further 0.3 in blue, which should give 0.8, 0.8, 1.25 — but since the maximum value for a pixel is 1.0, the star is recorded on the sensor as RGB 0.8, 0.8, 1.0.
When the sky glow is removed during background extraction, 0.3 is subtracted from the blue channel, and we end up with star RGB values of 0.8, 0.8, 0.7. The star core has suddenly turned yellow. Away from the core, however, the star is still blue, since those areas were not overexposed. Hence the need to correct star colours.
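The arithmetic above can be sketched in a few lines. The numbers are the hypothetical ones from the text; the point is that clipping destroys the blue information, so subtracting the full glow estimate afterwards overshoots:

```python
# Toy illustration with the numbers from the text: a slightly blue star,
# sky glow added to the blue channel, sensor clipping at 1.0, then
# background extraction subtracting the glow estimate again.
star = [0.8, 0.8, 0.95]   # true star colour (R, G, B)
glow = [0.0, 0.0, 0.3]    # sky glow, here only in blue

# The sensor clips each channel at 1.0
recorded = [min(s + g, 1.0) for s, g in zip(star, glow)]
print(recorded)           # [0.8, 0.8, 1.0] -- the blue excess is lost

# Background extraction subtracts the full glow estimate
extracted = [r - g for r, g in zip(recorded, glow)]
print(extracted)          # approximately [0.8, 0.8, 0.7] -- core now yellow
```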
During normal stretching of an image, most of the bright stars will saturate and end up with a final colour of RGB 1.0, 1.0, 1.0 — pure white, which hides the problem. However, if a masked stretch is used, the stars will not saturate, and the yellowed cores will look odd.
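To see why a normal stretch hides the yellowed core, here is a sketch using a midtones transfer function, one common form of histogram stretch (an assumption for illustration — not necessarily the tool the reader is using). An aggressive stretch pushes all three channels of a bright pixel towards 1.0, compressing the colour difference almost to nothing:

```python
def mtf(x, m):
    """Midtones transfer function: maps the midtone level m to 0.5
    while leaving 0.0 and 1.0 fixed."""
    if x in (0.0, 1.0):
        return x
    return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)

# The yellowed star core from the example, after an aggressive stretch
core = [0.8, 0.8, 0.7]
stretched = [mtf(c, 0.05) for c in core]
print([round(c, 3) for c in stretched])  # all channels close to 1.0
```

The original red-minus-blue difference of 0.1 shrinks to under 0.01 after the stretch, so the core simply reads as white.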
Here's an example of an unstretched star, with (left) and without (right) colour repair.
When should star repair be implemented in the workflow?
Of course, the colour must be corrected before any stretch is applied. However, there is one earlier point in processing where star colour matters, and that is colour calibration.
Colour calibration tries to set a white point by looking at all the stars in an image and applying a calibration determined by the average star colour. When the star colours are wrong due to overexposure and background extraction, the white balance after colour calibration will be off. There is therefore an argument for applying HSV repair before colour calibration. The workflow then becomes as follows:
- cropping of edges
- background extraction (ABE or DBE)
- background neutralisation
- HSV repair
- colour calibration
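The idea behind the HSV repair step can be sketched as follows. This is a toy illustration of the principle only, not the actual repair script: keep the clipped core's brightness (V), but borrow hue and saturation from an unclipped halo pixel of the same star, where the true colour survived. The halo RGB values here are assumed for the example:

```python
import colorsys

# Yellowed core after clipping plus background extraction (from the text),
# and a hypothetical unclipped halo pixel that kept the star's true blue.
core_rgb = (0.8, 0.8, 0.7)
halo_rgb = (0.4, 0.4, 0.475)

# Keep the core's brightness, take hue and saturation from the halo
_, _, v_core = colorsys.rgb_to_hsv(*core_rgb)
h_halo, s_halo, _ = colorsys.rgb_to_hsv(*halo_rgb)
repaired = colorsys.hsv_to_rgb(h_halo, s_halo, v_core)

print(tuple(round(c, 3) for c in repaired))  # blue dominates again
```

A real implementation would do this for every clipped pixel, interpolating hue and saturation from the surrounding unclipped region, but the per-pixel operation is the same.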