Wired up a Half-Lambert shader. It's a nice technique for simulating slight subsurface light scattering, or very rough objects where light energy would be transmitted from its hit location to nearby matter. Nice soft fall-off, made famous in this Valve paper.
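The math behind the technique is tiny; here is a sketch of the Half-Lambert diffuse term as the Valve paper describes it (function names are mine):

```python
def half_lambert(n_dot_l, exponent=2.0):
    """Half-Lambert diffuse term: remap N.L from [-1, 1] to [0, 1],
    then raise to a power (the Valve paper uses 2) to restore contrast."""
    wrapped = n_dot_l * 0.5 + 0.5
    return wrapped ** exponent

def lambert(n_dot_l):
    """Standard Lambert for comparison: clamps to zero at grazing angles,
    so back-facing surfaces go pure black."""
    return max(n_dot_l, 0.0)
```

At the terminator (N.L = 0), standard Lambert returns 0 while Half-Lambert returns 0.25, which is exactly the soft wrap-around fall-off you see in the render.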
I've uploaded a preset for this material. Note that this material uses a vector share node (I've commented it) to calculate shading, not a light.
Sit, Ubu, sit! Good dog.
Many game engines use deferred rendering these days, as opposed to forward rendering. In addition, many movie shots have been authored with deferred techniques and then assembled and shaded in Nuke. The concept is basically the same for both real-time engines and offline renderers: encode 3D data into 2D space (buffers), then solve for the lighting/shading as a post process. If you are interested in lighting/shading in post, check out the Postlight tool by Andy Nicholas.
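To make the "encode 3D data into 2D buffers, then shade as a post process" idea concrete, here is a minimal deferred-shading sketch (buffer names are illustrative, not from any particular engine):

```python
import numpy as np

# Geometry pass output: 3D attributes stored in 2D screen-space buffers.
H, W = 4, 4
g_normal = np.zeros((H, W, 3))   # world-space normals per pixel
g_albedo = np.zeros((H, W, 3))   # surface color per pixel
g_normal[..., 2] = 1.0           # pretend every pixel faces the camera
g_albedo[...] = (0.8, 0.5, 0.2)

light_dir = np.array([0.0, 0.0, 1.0])  # directional light toward camera

# Lighting pass: a pure 2D image operation over the G-buffer --
# no geometry is touched, which is why it can also be done in Nuke.
n_dot_l = np.clip((g_normal * light_dir).sum(axis=-1), 0.0, 1.0)
lit = g_albedo * n_dot_l[..., None]
```

Because the lighting pass only reads images, lights can be added, moved, or re-colored after the expensive geometry render is done.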
This deferred texture mapping test is a similar idea and mimics this tool by RevisionFX. For offline rendering, this additional UV vector pass can save re-rendering an image/animation by allowing me to swap textures after rendering is complete.
The Component Parser is the compositing node used to map the texture onto the UV vector data. Variable a1 is the horizontal pixel count of the image, and variable a2 is the vertical pixel count.
Objects can be easily textured in post, and swapping textures happens in real time without the need to re-render.
This is a great cheat for low-cost 'dynamic' lights. If you bake lightmaps in passes, this is a very easy effect to author.
Essentially I combine two lightmaps, adding the dynamic map over the base map. For this demo, I do exactly that. In practical application, you would not want to double your lightmap texture space for a whole level. Instead, create a pass where duplicate chunks of localized geometry hold low-resolution dynamic maps (or even make one larger texture sheet for all the dynamic maps in your level; a 512x512 will do) and bake to a new set of UVs.
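The combine itself is just an add with a runtime-animatable weight. A sketch of the idea (the nearest-neighbour upscale stands in for the bilinear filtering a GPU would do for free):

```python
import numpy as np

def combine_lightmaps(base, dynamic, intensity=1.0):
    """Add a (possibly lower-resolution) dynamic lightmap over the base
    bake. 'intensity' can be animated at runtime to flicker or fade the
    dynamic light without re-baking anything."""
    if dynamic.shape != base.shape:
        # Upscale the low-res dynamic map to match the base bake.
        ys = np.linspace(0, dynamic.shape[0] - 1, base.shape[0]).astype(int)
        xs = np.linspace(0, dynamic.shape[1] - 1, base.shape[1]).astype(int)
        dynamic = dynamic[ys][:, xs]
    return np.clip(base + intensity * dynamic, 0.0, 1.0)
```

This is why the dynamic maps can live on a small shared 512x512 sheet: they only ever contribute an additive delta on top of the full-resolution base bake.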
Using the toon shaders for illustrative renders. I made some line art, burned some screens, & printed the kiddos some cool custom T-shirts!
Displacement test with the 3Delight renderer.
There is a lot of flexibility, production time-savings, and specific look development to be gained by building lightmaps (for real-time engines) in passes. Using mental ray, these passes are fairly easy to set up and bake. However, while final gathering (FG) works great (result-wise) for radiosity sampling, it is only single-threaded when used for rendermapping. Seeing as FG is a screen-space sampling method, I don't know if this can be called a bug (it is definitely an oversight). This limitation is made worse when several computers are used via satellite rendering to generate a single lightmap (or lightmap pass): the other systems sit idle while one machine (using one core) calculates the FG prepass.
To avoid this production slowdown and get your systems rendering at full CPU power, an environment sampling shader can do the work FG would normally calculate. Using mental ray's ambient occlusion shader with the below parameters, I generated the environment sampling with satellite rendering using 100% of 24 cores.
Use this sampling method with interactive IBL light domes for a fast and iterative lighting solution.
Here is the collection I use the most; feel free to download the .zip file. I took the liberty of giving the IES files artist-friendly names. Using IES profiles to scatter indirect light energy into a scene lends a certain legitimacy to the render, even if the profile isn't used for light-scatter effects directly. You can accomplish indirect scatter by setting up photon-only lights with an IES profile.
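If you ever want to inspect what's inside these files, an IES profile is just text in the IESNA LM-63 layout: keyword lines up to a TILT line, then a numeric header, angle tables, and the candela values. A minimal reader sketch, assuming the common TILT=NONE case (real files have more variants):

```python
def read_ies_candela(text):
    """Tiny reader for an IESNA LM-63 photometric file (assumes TILT=NONE).
    Returns the flat list of candela values after the angle tables."""
    lines = text.splitlines()
    i = next(k for k, ln in enumerate(lines) if ln.strip().startswith("TILT"))
    numbers = " ".join(lines[i + 1:]).split()
    # 10-value photometric header plus a 3-value ballast/watts line.
    header = [float(x) for x in numbers[:13]]
    n_vert, n_horiz = int(header[3]), int(header[4])
    values = [float(x) for x in numbers[13:]]
    angles = values[:n_vert + n_horiz]      # vertical then horizontal angles
    return values[n_vert + n_horiz:]        # candela grid, n_vert * n_horiz

# Hypothetical minimal example file for illustration:
_SAMPLE = """IESNA:LM-63-1995
[MANUFAC] Example Co
TILT=NONE
1 1000 1 3 1 1 2 0 0 0
1.0 1.0 100
0 45 90
0
500 400 100"""

candela = read_ies_candela(_SAMPLE)
```

The candela grid is what gives each profile its distinctive throw pattern, and it is the same data the renderer samples whether the light emits directly or only shoots photons.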