How to do good bloom for HDR rendering

One thing that makes HDR rendering impressive is the bloom effect. In fact, even though bloom is in a sense decoupled from HDR rendering, the two are often confused. There are 3D engines out there that advertise "bloom for HDR rendering", which is nonsense: you can have HDR rendering without bloom, and you can have bloom without HDR rendering. What makes the sweet sweeter is the combination of the two. Indeed, if you show the average gamer HDR rendering without bloom, she will have a hard time telling the difference between LDR and HDR rendering…

This means that you are going to need good blooming in your engine to really get that "WOW" coming out of people's mouths. Before writing the Sylphis3D HDR rendering implementation I had read some articles about blooming, but the results were never satisfactory. I'm going to present here the method I used, which produces what I consider a practically perfect bloom while also being faster and using less memory.

The usual algorithm for bloom is this:

  • Take the current render and downscale it to a reasonable size that is preferably a power of 2
  • Apply a bright-pass filter on the image to keep only high luminance values
  • Apply a Gaussian blur filter of small kernel size several times to get a good blur
  • Additively blend the resulting texture on the screen
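
The bright-pass step of the list above can be sketched in a few lines. This is a minimal CPU-side illustration in NumPy; the luminance threshold of 1.0 and the Rec. 709 luminance weights are my own choices for the sketch, not something the algorithm prescribes:

```python
import numpy as np

def bright_pass(hdr, threshold=1.0):
    # Luminance from Rec. 709 weights; keep only pixels brighter than the threshold
    luma = hdr @ np.array([0.2126, 0.7152, 0.0722])
    return np.where(luma[..., None] > threshold, hdr, 0.0)

# One bright HDR pixel (luminance 2.0) and one dim pixel (luminance 0.3)
img = np.array([[[2.0, 2.0, 2.0], [0.3, 0.3, 0.3]]])
out = bright_pass(img)   # the dim pixel is zeroed, the bright one survives
```

In a real engine this runs as a fragment shader while downscaling, but the math is the same.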

The above algorithm is the general recipe for bloom, but it has a flaw, and the flaw is in the blurring step. The blur is a Gaussian filter decomposed into two passes, one horizontal and one vertical; this way the number of samples the filter requires drops from k² to 2k for a k-tap kernel. I'm not going to get into the details, since separable filtering is standard practice.
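
To make the separability concrete, here is a quick NumPy sketch: one 1-D kernel applied first along the rows and then along the columns reproduces the full 2-D Gaussian. A shader would do the same two passes on the GPU; the kernel size and sigma below are illustrative:

```python
import numpy as np

def gaussian_1d(size, sigma):
    x = np.arange(size) - (size - 1) / 2
    k = np.exp(-x * x / (2 * sigma * sigma))
    return k / k.sum()

def separable_gaussian_blur(img, size=5, sigma=1.0):
    # Horizontal pass then vertical pass: 2*size taps per pixel instead of size^2
    k = gaussian_1d(size, sigma)
    out = np.apply_along_axis(lambda row: np.convolve(row, k, mode='same'), 1, img)
    return np.apply_along_axis(lambda col: np.convolve(col, k, mode='same'), 0, out)

# Blurring a centered impulse recovers the full 2-D kernel
impulse = np.zeros((9, 9))
impulse[4, 4] = 1.0
blurred = separable_gaussian_blur(impulse)
```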

The problem is the size of the filter kernel. Since we are doing this in real time on the GPU, we can't take as many samples as we would like. This results in small kernels that produce a very narrow blur. The common workaround is to repeat the blur step several times in order to widen it.
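
Repeating the blur helps less than you might hope: variances add under convolution, so n passes of a sigma-1 kernel act like a single pass with sigma √n, not sigma n. A quick numerical check, using a 5-tap binomial kernel as my stand-in for a small Gaussian:

```python
import numpy as np

k = np.array([1, 4, 6, 4, 1]) / 16.0   # 5-tap binomial kernel, variance exactly 1

# Applying the blur n times is equivalent to blurring once with the kernel
# convolved with itself n-1 more times
repeated = k.copy()
for _ in range(4):                      # 5 passes total
    repeated = np.convolve(repeated, k)

def sigma_of(kernel):
    x = np.arange(len(kernel)) - (len(kernel) - 1) / 2
    return np.sqrt((kernel * x * x).sum())

# Five passes only widen the blur by sqrt(5) ~ 2.24x, not 5x --
# which is exactly why repeated small blurs stay narrow
```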

brightpass.jpg
The unblurred
bloom texture
brightpass_5g.jpg
After 5 Gaussian
blurs of 5×5 kernel

In the above images we can see, on the left, the texture that results from the bright-pass filter, and on the right its blurred version after applying a Gaussian blur with a 5×5 kernel 5 times. This bloom texture is not what I consider a good result, for two reasons:

  • The image lost its intensity: the repeated blurs faded the luminance.
  • The blur is very narrow: the resulting bloom is constrained to just a few pixels around the bright spots, whereas a very bright spot should produce a wide glare and bloom.

The first solution that comes to mind is to increase the kernel size.

brightpass_bigkernel.jpg
After one Gaussian
blur of 21×21 kernel
In the image to the left we can see the result of applying a Gaussian blur with a 21×21 kernel. The glare now has a bigger, acceptable size, but the intensity has faded again. What we need is to keep some of the sharpness of the small kernels while also getting the wide spread of the big ones. We will do this by adding together several blurred images produced with different kernel sizes.

multiblur.jpg
Adding multiple Gaussian
filters

In the image above we can see the effect of four Gaussian blur filters with kernel sizes 5×5, 11×11, 21×21 and 41×41, along with the result we get by adding them up. The result looks as expected… very good. This is exactly what we need.
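
The claim — a sharp core from the small kernels plus the long tails of the big one — is easy to verify numerically on 1-D cross-sections. The sigma values below are my own guesses (roughly size/4), not values taken from the engine:

```python
import numpy as np

def gauss1d(size, sigma):
    x = np.arange(size) - (size - 1) / 2
    k = np.exp(-x * x / (2 * sigma * sigma))
    return k / k.sum()

# Embed each kernel, centered, in a common 41-tap support
kernels = [(5, 1.25), (11, 2.75), (21, 5.25), (41, 10.25)]
profiles = []
for size, sigma in kernels:
    p = np.zeros(41)
    pad = (41 - size) // 2
    p[pad:pad + size] = gauss1d(size, sigma)
    profiles.append(p)

combined = sum(profiles)      # the "add them all up" bloom profile
widest = profiles[-1]         # the 41-tap kernel alone
```

The combined profile has a far brighter center than the wide blur by itself, while keeping the same reach at the edges.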

At this point you are probably starting to wonder: how are we going to apply a 41×41 kernel in real time on today's hardware?! The answer comes next…

Implementation

The solution is, as always, approximation. Let me invert the problem. Think of it like this: I need to double the radius of the Gaussian blur kernel… wouldn't that be roughly the same as applying the original Gaussian filter to a half-sized texture and then doubling its size with bilinear filtering? Yes… it turns out to be a good approximation! Take a look at the results.

approx.jpg

The images on the left compare the result of a real 41×41 kernel with the approximation. The approximation was produced by downscaling the original 128×128 texture to 64×64, then to 32×32 and finally to 16×16; that tiny texture was blurred with a 5×5 kernel and then rescaled back to 128×128 using bilinear filtering. The two results are very close.

So we get the effect of applying a 41×41 kernel to a 128×128 texture (which is not doable in real time on today's hardware) by applying a 5×5 kernel to a 16×16 texture (that is, to just 256 pixels!)… very fast. Similarly, the 21×21 filter is approximated by applying the 5×5 kernel to the 32×32 texture, the 11×11 filter by applying it to the 64×64 texture, and the 5×5 filter by applying it to the full 128×128 texture. This gives us the four blurred textures that we add to the screen.
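
The whole scheme can be sketched CPU-side in NumPy: build the mip chain, blur every level with the same small kernel, upscale each result back to full size with bilinear filtering, and sum. The 5-tap binomial kernel and the hand-rolled resizing below are stand-ins for what the GPU does with shaders and texture filtering:

```python
import numpy as np

def blur5(img):
    """Separable 5-tap binomial blur, a cheap stand-in for a 5x5 Gaussian."""
    k = np.array([1, 4, 6, 4, 1]) / 16.0
    out = np.apply_along_axis(lambda row: np.convolve(row, k, mode='same'), 1, img)
    return np.apply_along_axis(lambda col: np.convolve(col, k, mode='same'), 0, out)

def downscale2(img):
    """Halve each dimension by averaging 2x2 blocks."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upscale_bilinear(img, size):
    """Resize a square image up to size x size with bilinear filtering."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, size)
    xs = np.linspace(0, w - 1, size)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

def pyramid_bloom(bright, levels=4):
    """Blur every mip level with the same small kernel, upscale, and sum.
    Each level down approximates a kernel roughly twice as wide."""
    size = bright.shape[0]
    bloom = np.zeros_like(bright)
    level = bright
    for _ in range(levels):
        bloom += upscale_bilinear(blur5(level), size)
        level = downscale2(level)
    return bloom

# A single bright pixel: the pyramid spreads it far wider than one 5x5 blur could
img = np.zeros((32, 32))
img[16, 16] = 1.0
b = pyramid_bloom(img)
```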

Let's look at how this works on a real scene…

Example

Let's see how the two methods perform on a real scene. Below is a tone-mapped frame rendered with the Sylphis3D game engine:

screen.jpg

The bright-pass filter applied to that frame results in the next image:

screen_bp.jpg

The above image consists of the parts of the frame that will glow. Applying the conventional blur filter 4 times results in the image below:

screen_bp_4g.jpg

Finally, we additively blend the blurred image onto the original rendered frame and get this result:

screen_bp_4g_final.jpg

Even though the above image gives a nice bloom effect, the glare is minimal and the effect is not powerful. So let's see how the proposed blurring method works on the same image. Starting from the same bright-pass image, we blur it with the approximated Gaussian filters described earlier in the post. We get these images for virtual kernel sizes of 5×5, 11×11, 21×21 and 41×41:

screen_bp_5x5.jpg
screen_bp_11x11.jpg
screen_bp_21x21.jpg
screen_bp_41x41.jpg

Adding up all the above images, we get the final blurred bright-pass image:

screen_bp_addedl.jpg

Adding this image onto the original rendering, we get this result:

screen_final.jpg

I really think the result speaks for itself! The bloom has become impressive and powerful. The fun part is that we got this result using less GPU horsepower, since we are blurring smaller textures!

I would also like to point out that this method gives very good results for LDR rendering, too. That is because it can produce a wide bloom even from colors constrained to the 0.0–1.0 range, which would otherwise fade away quickly.

I hope you find this useful, and if you need any specific implementation details, just ask!

  • Hieu Hoang

    AWESOME!! can’t believe noone left a comment

    nothing else to add, i don’t know a single nut in this business..

  • http://harkal.sylphis3d.com Harry Kalogirou

    Thank you Hieu!

  • Sergey

    VERY VERY good article. Thanks for it!

  • Upset

    No, no more bloom. It’s a horrible effect. That last screenshot is hideous compared to the unprocessed screenshot. :(

  • someone

    the last image is the usual “look mam, i know glowing!” Please do not overbloom images, they look so ugly.

    By the way, removing high luminance values has nothing to do with high-pass filtering – they are quite different things.

  • http://harkal.sylphis3d.com Harry Kalogirou

    to someone : lol.. who mentioned high-pass filtering?!? I wrote bright-pass..

    so I guess the high-pass think you talk about is something you JUST had to say!

    LOL!

  • Will

    Nice article, clear and very informative. I’m busy implementing the blooming effect you just described and I’m using point sampled mipmaps to obtain downscaled images of the original texture. However, when trying to apply a 5×5 kernel on the 128×128, 64×64, 32×32 and 16×16 size mipmaps, artifacts start to appear (hard edges and squares).

    Two questions for you: - How do you prevent the artifacts to show up? and - Can you tell me what your 5×5 kernel looks like? (what’s the sigma value you’re using?)

  • http://harkal.sylphis3d.com Harry Kalogirou

    Maybe you are not using bilinear interpolation? I don’t know. For implementation details I would point you to the actual source of Sylphis3d, where the above algorithm is implemented…

  • http://www.panoramy.net GunXter

    Very nice article.thx

  • loriand

    I tried this approach but there is some problem, the final result is the glow is a little biased to the right of the screen so you get alot of a glow to the right of the object but not on the left.

  • http://www.racer.nl Ruud van Gaal

    Nice stuff, even in 2009 ;-) But is it that you resize the bloomed textures back to 128×128? I’ve implemented downsampling & blurring, so for example: 1280×1024 -> brightpass.cg -> 512×512 filter.cg from 512×512 -> bloom1texture (512×512) filter.cg from 512×512 -> bloom2texture (256×256) filter.cg from 256×256 -> bloom3texture (128×128)

    It does give aliasing trouble, but my Guassian filter is not really ok yet. Couldn’t find the 5×5 filter .fp file in your source as well, just separated hor/vert blur shaders.

  • dan

    thanx for the article. it answered some basic questions i had.

  • Bloom

    lol man your talking a lot of nonsense. You cant have Bloom without HDR cos bloom is relying on HDR. Bloom is a camera effect which comes from bright pixels which give part of their power to their neighbours. Since LDR has a max of 1.0 you can never say which pixels should bloom.

    Also your bloom is BY FAR to strong and has nothing to do with RL Bloom. Bloom is around VERY BRIGHT stuff and in your final picture every part is overbloomed. A correct bloom works in HDR Space and only with very bright parts and not the wall like in your image^^

  • WolfCoder

    You’re neglecting the fact that this method is an approximation to Bloom. I agree the bloom here is a little strong, but you can just raise the cutoff luminance, increase the spread, and reduce the brightness of the bright texture to adjust the effect.

    I’m personally modifying Bloom to actually make a scene look less realistic on purpose, I’m going for more dreamlike, vivid, and strange graphics and I’ve been playing around with this in prototypes.

  • http://jacoders.co.cc Raz0r

    Hey man, great article (I know, I’m late =p) I’ve gone ahead and implemented this approach to bloom in Jedi Academy (Based on iD Tech 3), and the results are great =] Unfortunately due to implementation constraints, I could only use 3 downscaled images, but using a 7×7 kernel, sampling between texels, and using GLLINEAR/GLLINEAR sampling, I’m left with a nice wide bloom at minimal performance cost. I’m also doing my downscaling based on the previously downscaled+blurred image, because it’s easier and requires less FBO’s. There are no artefacts, and I am really pleased with the result. I am downscaling my scene by .5 each image. Downscaling by .25 works good too, if needed. Thanks again =]

  • Syahmi

    Hey Harry , do you use python Languages to create 3D game(The sample picture uses above) ?

  • http://kalogirou.net Charilaos Kalogirou

    I use C++ for the core engine code and Python for scripting.

    • Syahmi

      Thank you for replying , and OMG That is soo cool how u guys do that!!

  • nL

    Perfectiveness article. Thank you so much.

  • ALi

    very very nice article. thanks .
