Lens Blurs and Variable Blurs

I've seen some interest in blurs on the internet lately, particularly ones that help give computer-generated imagery the look of having been captured by analog equipment.

One of my favorite blurs for this is a variable blur by "iaian7", which uses Core Image, and was posted here:


I like this one quite a bit because it avoids the transparency issues that many similar variable blurs have (including some I've shown). That robustness is one of the reasons the blur is slow. While a radial gradient is used in the example above, any kind of greyscale mask or depth channel can be used; a linear gradient is particularly nice for tilt-shift effects with this one. That said, this is basically non-realtime.
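I don't have iaian7's kernel code in front of me, so this is only a sketch of the general principle in Python/NumPy (all names mine, and a toy box blur stands in for a real Gaussian): precompute several whole-image blur levels, then interpolate between them per pixel, driven by the greyscale mask or depth channel.

```python
import numpy as np

def box_blur(img, radius):
    """Naive edge-clamped box blur (a stand-in for a real Gaussian)."""
    if radius == 0:
        return img.copy()
    h, w = img.shape
    out = np.empty_like(img)
    for y in range(h):
        for x in range(w):
            out[y, x] = img[max(0, y - radius):y + radius + 1,
                            max(0, x - radius):x + radius + 1].mean()
    return out

def variable_blur(img, mask, max_radius=4):
    """Interpolate per pixel between precomputed blur levels.

    mask is greyscale in [0, 1]: 0 = sharp, 1 = fully blurred. Blending
    whole pre-blurred images sidesteps the edge-transparency problems of
    applying a spatially varying kernel directly, at the cost of
    computing every blur level up front (hence: slow, non-realtime).
    """
    levels = [box_blur(img, r) for r in range(max_radius + 1)]
    idx = np.clip(mask, 0.0, 1.0) * max_radius
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, max_radius)
    t = idx - lo
    h, w = img.shape
    out = np.empty_like(img)
    for y in range(h):
        for x in range(w):
            out[y, x] = ((1 - t[y, x]) * levels[lo[y, x]][y, x]
                         + t[y, x] * levels[hi[y, x]][y, x])
    return out
```

Feed it a radial gradient as the mask for the look in the example above, or a linear gradient for the tilt-shift look.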

Relatedly, Noise Industries has some blurs that achieve similar effects, and which are available in their free download package.


Some of the patches in this download provide additional functionality (like chromatic aberration), and some run faster than the Core Image sample above, though their output may look a bit less accurate. That may be in the eye of the beholder; I haven't used the NI patches extensively.

Some other fast, not-that-accurate twists are to use CIRadialBlur to create a kind of blur vignetting (discussed on this blog previously, I believe), or to use CI Blend With Mask along with a blur and an appropriate greyscale image (also shown in a recent post).
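The Blend With Mask approach boils down to a single weighted composite, which is why it's fast but can't vary the blur radius itself. A sketch in Python/NumPy (illustrative names only):

```python
import numpy as np

def blend_with_mask(sharp, blurred, mask):
    """Blend-With-Mask-style composite: a greyscale mask in [0, 1]
    weights one pre-blurred copy against the sharp original. Fast,
    but every blurred pixel shares the same fixed blur radius."""
    return mask * blurred + (1.0 - mask) * sharp

# A vertical linear gradient as the mask gives the tilt-shift look:
# sharp where the gradient is near 0, fully blurred where it is near 1.
tilt_mask = np.tile(np.linspace(0.0, 1.0, 8)[:, None], (1, 8))
```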

Another really quick, not-so-accurate take on blur mapping appears in the Quartz Composer Patch: Blur Mapping example:


This one shows very visible, distinct image planes because of the nature of the blur algorithm, but it's also exceedingly fast. It's not a bad thing to look at to understand some of the principles, and perhaps to improve upon.
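I haven't dug into the patch's internals, but that "distinct planes" artifact falls out of any scheme that quantizes the mask down to a handful of pre-blurred levels and hard-picks one per pixel. A toy sketch (Python/NumPy, names mine, naive box blur standing in for the real one):

```python
import numpy as np

def box_blur(img, radius):
    """Naive edge-clamped box blur."""
    if radius == 0:
        return img.copy()
    h, w = img.shape
    out = np.empty_like(img)
    for y in range(h):
        for x in range(w):
            out[y, x] = img[max(0, y - radius):y + radius + 1,
                            max(0, x - radius):x + radius + 1].mean()
    return out

def plane_blur(img, mask, num_planes=3):
    """Quantize the mask into a few discrete blur 'planes' and pick one
    per pixel. Only num_planes blurs are ever computed, so it is very
    fast, but the hard jumps between planes are exactly the visible
    banding described above (no interpolation between levels)."""
    planes = [box_blur(img, r) for r in range(num_planes)]
    idx = np.minimum((mask * num_planes).astype(int), num_planes - 1)
    h, w = img.shape
    out = np.empty_like(img)
    for y in range(h):
        for x in range(w):
            out[y, x] = planes[idx[y, x]][y, x]
    return out
```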

Christopher Wright shows how to create a nice DOF effect by using a Texture Buffer while also reading the Depth Buffer, here:


This is one I've used quite a bit. The blur can be tweaked easily, and even replaced with different algorithms (like a Gaussian kernel).
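For reference, swapping in a separable Gaussian mostly comes down to a set of normalized 1D weights applied horizontally and then vertically. A quick sketch (Python, illustrative names):

```python
import math

def gaussian_kernel(radius, sigma=None):
    """Normalized 1D Gaussian weights over [-radius, radius]; applying
    them along rows and then columns gives a full separable 2D
    Gaussian blur. sigma defaults to radius / 2 as a rough heuristic."""
    if sigma is None:
        sigma = radius / 2.0
    weights = [math.exp(-(i * i) / (2.0 * sigma * sigma))
               for i in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights]
```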

Relatedly (again!), I've recently seen a GLSL DOF effect that is similar in implementation. In this case, DOF is generated by running two fragment shaders concurrently: one to generate color/material, and the other to generate the depth buffer. Both images are then fed to the GLSL DOF program. There are two variants available, Raytrace_Bokeh.qtz and Raytrace_Bokeh20120227.qtz:
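I haven't inspected these shaders' exact depth-to-blur mapping, but setups like this commonly use something like the thin-lens circle-of-confusion formula to turn a depth-buffer sample into a per-pixel blur radius. A sketch (Python, all names mine):

```python
def circle_of_confusion(depth, focus, focal_len, aperture):
    """Thin-lens circle-of-confusion diameter for a point at `depth`
    when the lens is focused at `focus` (all distances in the same
    units; `aperture` is the aperture diameter). A DOF shader would
    map this diameter to a blur radius for each depth-buffer sample:
    zero at the focus plane, growing as points move away from it."""
    return aperture * focal_len * abs(depth - focus) / (depth * (focus - focal_len))
```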


This is by no means meant to be exhaustive; I'm sure I'm forgetting something great (and thinking of some accumulation/feedback based methods at the moment that I may touch on in the future).