Balthazar

joined 1 year ago
[–] [email protected] 4 points 1 day ago

Right in the dead center of town!

[–] [email protected] 3 points 1 day ago

Oh, a dual narrowband filter in front of an RGB sensor. Oh, that makes sense.

[–] [email protected] 1 point 1 day ago (2 children)

Is the [OIII] filter really doing anything for you? I think it's really only showing continuum, in which case you'd be better served with a wider bandpass.

[–] [email protected] 128 points 2 days ago (2 children)

Also well known for foiling evil plots while wearing a fedora.

[–] [email protected] 19 points 2 days ago

Didn't pay for the DLC.

[–] [email protected] 59 points 3 days ago

I don't think it's lack of spine. I think he's deliberately sold his soul for power and influence and money.

[–] [email protected] 2 points 4 days ago

No! He can't release his healthcare plan until after his taxes, and those are being audited.

[–] [email protected] 1 point 4 days ago

"Slammed". Everyone take a shot.

[–] [email protected] 2 points 1 week ago

Is this Jeremy Kubica, noted asteroid finder?!

[–] [email protected] 23 points 1 week ago

The king is dead. 😥

[–] [email protected] 7 points 1 week ago

And Incredaboy!

 

I love seeing the astro images posted here, but may I share an algorithm for making them even more beautiful?

Most astro images are created from separate red, green and blue images taken with electronic detectors (whether using classic BVR filters in an attempt to replicate what the eye might see, or some other combination in a "false color" image). There are two big problems that are common with the images created in this way (even by professionals).

The first is in the choice of stretch: how brightness on the detector maps to brightness on the displayed image. Most choose a linear or a logarithmic stretch. A linear stretch brings out fine detail at the faint end, but can leave the viewer ignorant of details at the bright end. A logarithmic stretch allows you to bring out details at the bright end, but not the faint end. Instead of these, choose an asinh (inverse hyperbolic sine) stretch, which is able to bring out both the faint and bright features. It scales linearly at the faint end and logarithmically at the bright end, giving you the best of both worlds.
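As a minimal sketch of that stretch (the softening parameter `Q` and the normalization to the [0, 1] range are my assumptions, not part of the post):

```python
import numpy as np

def asinh_stretch(data, Q=8.0):
    """Map normalized detector counts in [0, 1] to display brightness.

    arcsinh(Q*x) is ~linear for Q*x << 1 (preserving faint detail)
    and ~logarithmic for Q*x >> 1 (compressing the bright end);
    dividing by arcsinh(Q) keeps the output in [0, 1].
    """
    return np.arcsinh(Q * data) / np.arcsinh(Q)
```

Larger `Q` pushes the transition to the logarithmic regime toward fainter pixels, so it behaves like a contrast knob for the faint end.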

The second is in the handling of saturation: how to display pixels that are too bright for the chosen stretch. Most apply the stretch separately to the red, green and blue channels. This makes the cores of bright objects appear white in the color image, surrounded by a halo in the object's actual color. The color of a pixel should instead be set by considering all of the channels together. That way, bright objects keep a uniform color regardless of whether the stretch has saturated any individual channel.
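A sketch of the joint-channel approach (in the spirit of the Lupton et al. scheme; the function name and `Q` parameter are my own): stretch the summed intensity once, scale all three channels by the same factor, and when a pixel overflows, shrink all three channels equally so the hue survives.

```python
import numpy as np

def rgb_joint_stretch(r, g, b, Q=8.0):
    """Build an RGB image where saturated cores keep their color.

    The asinh stretch is applied to the mean intensity, not per
    channel, and the resulting scale factor is shared by r, g, b.
    """
    i = (r + g + b) / 3.0
    i = np.where(i == 0, 1e-12, i)          # avoid divide-by-zero
    f = np.arcsinh(Q * i) / (np.arcsinh(Q) * i)  # shared scale factor
    channels = np.stack([r * f, g * f, b * f])
    # If any channel exceeds 1, divide *all three* by the same peak:
    # the pixel clips in brightness but keeps its color ratios.
    peak = np.maximum(channels.max(axis=0), 1.0)
    return channels / peak
```

If you'd rather not roll your own, I believe astropy ships a ready-made version of this idea as `astropy.visualization.make_lupton_rgb`.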

See here for a direct comparison between the classic approach and this (not really) new algorithm on the old Hubble Deep Field.

If you would like to adopt this algorithm for your own work, there is a python implementation that you might find useful.

 

... researchers noted the similarities between the game and the real-world pandemics. Both had an immediate impact on dense urban areas, which limited the effectiveness of containment procedures in stopping the spread of disease, while air travel, like fast travel, allowed infections to spread across large parts of the world with ease. Lofgren compared the in-game "first responders", many of whom contracted Corrupted Blood when they attempted to heal others, to healthcare workers who were overrun with COVID-19 patients and became infected themselves. While no direct analogue was drawn to griefers [players who engage in bad-faith multiplayer game tactics], Lofgren acknowledged individuals who contracted the COVID-19 virus but chose not to quarantine, thus infecting others through negligence.
