Archive for the ‘Demo Reel’ Category

Working on the environment

Tuesday, March 29th, 2011

Here's a small update, rendered with prman using my shaders and tools from within Houdini. I painted all the textures in Mari. The environment map used in this scene is a ptex env map; it's used in both the occlusion and reflection calculations. Everything is still rendered as a single pass at this point, with all blending, etc. calculated in the shader.

I'm testing some different environment lighting setups and textures for the environment. Right now I'm pretty happy with the desert environment.

train render

Shaders & Environment

Monday, December 13th, 2010

I've been working on a procedural snow shader for some of the objects in my reel. It involves two layers. The base layer is texture-map driven, uses an Oren-Nayar lighting model, and has very slightly glossed Fresnel reflections. The other is snow, which uses the Buratti diffuse model and has sparkle, rim light, a little Fresnel in the color, and a touch of reflection weighted toward glancing angles. I still plan to experiment with different noises to drive the displacement of the snow, and to roughen up the edge dividing the areas with and without snow.

In the image above everything uses the same shader. There's a single spotlight, and a blurry ptex env-cube for the env map. No occlusion, GI, etc., which I think could help it, but it wasn't necessary.
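As a rough sketch of that layering logic, here it is in plain Python rather than the actual shader code (the function names, constants, and thresholds are all illustrative, not the real shader's parameters): an up-facing mask, roughened by noise, blends between the base and snow layers, while a Schlick-style Fresnel term weights reflection toward glancing angles.

```python
# Illustrative two-layer snow blend, sketched in plain Python.
# Not the actual RSL shader; names and constants are made up.

def fresnel_weight(cos_theta, f0=0.04):
    """Schlick approximation: reflection grows toward glancing angles."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def snow_mask(n_up, noise, threshold=0.5, softness=0.2):
    """Blend factor for the snow layer: 1 on upward-facing surfaces,
    0 on downward-facing ones, with noise roughening the dividing edge."""
    t = n_up + (noise - 0.5) * softness
    # smoothstep around the threshold
    x = max(0.0, min(1.0, (t - (threshold - softness)) / (2.0 * softness)))
    return x * x * (3.0 - 2.0 * x)

def shade(base_color, snow_color, n_up, noise, cos_view):
    """Blend base and snow colors by the mask; return color and a
    glancing-angle reflection weight."""
    m = snow_mask(n_up, noise)
    color = tuple(b + (s - b) * m for b, s in zip(base_color, snow_color))
    reflection = fresnel_weight(cos_view)
    return color, reflection
```

The smoothstep around the threshold is what lets the noise roughen the edge dividing snowed and bare areas instead of producing a hard line.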

Demo Reel Update

Monday, November 15th, 2010

This is a small update. It's been a very long time coming, and this is just a little piece of what I've completed.

I purchased a license for Mari the moment it became available on Linux, and have been using it extensively ever since. It's extremely nice to not rely on Photoshop or Windows for a single thing. Not that I dislike either, but I prefer to work without VMs, Wine, or dual-booting.

I've since painted the locomotive engine (seen above). This model is composed of 30 separate OBJ files with 1 to 6 UV patches each. In total, there are over 90 texture maps ranging from 2k to 8k on this single model. This is a small test render, but the detail holds up at very close range. Please understand that this is 100% untouched output from prman (besides a minor levels adjustment), and the final result will be composited from many different layers including occlusion, reflection occlusion, etc. You can expect it to look much more realistic once graded and matched to an environment. I don't want to get ahead of myself, so this is just to see what my textures look like. Since it's mostly metal, you can expect it to look quite different in different situations.

This model is also fully rigged and animated using Houdini CHOPs. All the parts around the wheels (including the wheels) animate as the train moves down the tracks. The render above took roughly 15 minutes with 2-bounce radiosity from an environment map. I'm still using the tools and shaders that I've written for this project in Houdini / prman 15.2.
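The core relationship the CHOPs rig has to express for the wheels is simple arc length: as the train travels a distance d along the track, a wheel of radius r rolls through d / r radians. A minimal sketch (the function name is mine, not an actual HDA parameter):

```python
# Rolling-without-slipping: travel distance maps to wheel rotation.
# Illustrative only; the real rig expresses this inside CHOPs.

import math

def wheel_rotation_degrees(distance, radius):
    """Rotation (in degrees) of a wheel of the given radius after
    rolling `distance` units along the track without slipping."""
    return math.degrees(distance / radius)

# One circumference of travel should be exactly one full revolution.
r = 0.75
d = 2.0 * math.pi * r
```

Driving the rotation from distance traveled (rather than time) keeps the wheels correct even if the train slows down or stops.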

Hopefully I'll have something more interesting soon. As of right now I've completed the texturing on the locomotive, fuel tender, 7 boxcars, water cistern, train station, two train signals, a small building, and two grain silos.

Building Tools

Monday, February 1st, 2010

Fence & Train Tracks

The past months have been very productive. I've built and cleaned up all the geometry for my shot, laid out UVs, and started texturing. I've also built a few HDAs to aid in my set layout. For example, I have a nifty fence HDA that will lay down farm fence like nobody's business. There's also a train tracks HDA that will lay down tracks over any terrain I throw at it. There isn't much to say about either of those, as they're fairly standard HDAs with the Copy SOP at their core.

Env Light

Before I got too far into texture painting I decided to flesh out some rendering tools. The first task was to tidy up my envlight HDA, which I originally started for this project but which got sidelined a while ago. My main issue was that I used home-grown Python scripts to filter my RIB when I should've been using real RiFilters. The end result is a more reliable, better-functioning tool. Python RiFilters are also extremely simple to build.

An example Rif that replaces surface shaders.

#!/usr/bin/python
#######################################
## ShaderRif RiFilter
## Alan Warren - 1/30/2010
## replaces all instances of RiSurface
#######################################
import sys
import prman

# __faceindex
INDEX = 0

class ShaderRif(prman.Rif):
    def __init__(self, ri, shader, file, chan, maxvar, samp):
        self.shader = shader
        self.file = file
        self.chan = chan
        self.maxvar = float(maxvar)
        self.samp = float(samp)
        prman.Rif.__init__(self, ri)

    def Surface(self, shaderPath, args):
        # discard the original shader's arguments and substitute our own
        args = {}
        args['string filename'] = (self.file,)
        args['string displaychannels'] = (self.chan,)
        args['float samples'] = (self.samp,)
        args['float maxvariation'] = (self.maxvar,)
        args['varying float __faceindex'] = (INDEX,)
        self.m_ri.Surface(self.shader, args)

if __name__ == '__main__':
    # usage: shader_rif.py in.rib out.rib shader filename displaychannels maxvar samples
    if len(sys.argv) == 8:
        infile = sys.argv[1]
        outfile = sys.argv[2]
        shad = sys.argv[3]
        filename = sys.argv[4]
        displaychannels = sys.argv[5]
        var = sys.argv[6]
        samples = sys.argv[7]
        prman.Init(["-catrib", outfile, "-progress"])
        ri = prman.Ri()
        rif1 = ShaderRif(ri, shad, filename, displaychannels, var, samples)
        prman.RifInit([rif1])
        ri.Begin(ri.RENDER)
        prman.ParseFile(infile)
        ri.End()
    else:
        sys.exit(1)


PRMan 15 supports Disney's new file format, PTEX. However, no 3D paint app out there has fully implemented the format yet. Still, there is very good reason to use .ptex in prman even without third-party paint apps: PRMan allows you to bake surface data into a .ptex map and then call it from a shader just as you would a pointcloud.

The good thing is that ptex maps more closely resemble their raytraced counterparts than pointcloud-based solutions do. The HDA I created bakes raytraced occlusion or arbitrary data into a ptex map that I can then render at almost real-time speed (with 16 threads).

Workflow breakdown:

  • Set up an AttribCreate SOP to define "__handleid" as a detail attribute on the geometry being baked. I also use the Attribute SOP to push this attribute into the RIB stream.
  • Set your camera to "orthographic" projection.
  • The first node "rib1" specifies the necessary AOVs and injects culling, dicing, and stitching attributes.
  • The following nodes are only used when I want standard occlusion, which is built into the HDA. It writes a shader to disk, compiles it, then uses an RiFilter to inject it at the mouth of the render using prman's python interface.
  • The next two Shell ROPs generate the .fed file, which contains face / edge connectivity data. Pixar provides an RiFilter that will do this for you; however, the Rif doesn't accept a location to place the .fed file, so we're left to move it ourselves. It goes in $HFS by default, so I had to make that directory writable as well. Inside the HDA's python module I import subprocess to run this command, which takes care of moving the file to my $JOB directory.

    cmd = 'find ' + hfs + ' -name "*.fed" -print 2> /dev/null -exec mv {} ' + storage + ' \;'
  • Finally I execute ptexmake to generate a floating point .ptex map.
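The same move step can be done without shelling out to find at all. A pure-Python equivalent (illustrative; the function name and paths are mine, not the HDA's actual module code):

```python
# Pure-Python equivalent of `find $HFS -name "*.fed" -exec mv {} $JOB \;`:
# walk the HFS tree, collect the .fed files Pixar's Rif wrote there,
# and move them into the job's storage directory.

import os
import shutil

def move_fed_files(hfs, storage):
    """Move every *.fed file found under `hfs` into `storage`.
    Returns the list of destination paths."""
    moved = []
    for dirpath, _dirnames, filenames in os.walk(hfs):
        for name in filenames:
            if name.endswith('.fed'):
                dest = os.path.join(storage, name)
                shutil.move(os.path.join(dirpath, name), dest)
                moved.append(dest)
    return moved
```

This avoids the quoting pitfalls of building a shell command string, and shutil.move also handles the case where $HFS and $JOB live on different filesystems.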

I've also set up the HDA to take arbitrary data and bake it into a .ptex map. This task requires help from a surface shader set up to both bake and render ptex files. There's really nothing fancy involved, and it's covered well in the docs.

The following are a few images from these tests. You can see the ptex map contains all of my surface's illumination, specular, bump, and indirect illumination information. The last may be a bit hard to tell, since there's not much in the way of an environment in this test scene.

The red parts are from the inside of the bell (not shown)

A render of the bell from the front of my locomotive.

This render had low sampling, but it rendered in 0.1 seconds.

I also thought it might be useful to import indirect illumination from my envlight HDA and bake it out into a ptex map. My surface shader "aw_surface" already imports this data for AOVs, so very few modifications were necessary. (If you're interested: I import the data from my environment light and bake it using the bake3d() shadeop, then use the ptexture() shadeop to render the map in a second pass.) The only downside is that I'm baking environment occlusion or radiosity into an organized pointcloud first, where raytracing would have its obvious advantages. It's on my to-do list to build these various raytracing functions into my surface shader, but for now this is pretty cool, as it makes what would be a long-rendering effect like triple-bounce color bleeding nearly real-time.
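The two-pass pattern behind this is worth spelling out. As a toy illustration in plain Python (not RSL; the names and the stand-in computation are made up): pass one bakes an expensive quantity per face into a map, and pass two renders by looking the value up instead of recomputing it.

```python
# Toy sketch of the bake3d()/ptexture() two-pass pattern.
# Illustrative only; the real work happens in the surface shader.

def expensive_indirect(face_id):
    # stand-in for something slow like multi-bounce color bleeding
    return 0.1 * face_id

def bake_pass(face_ids):
    """Pass 1 (like bake3d()): compute and store the value per face."""
    return {f: expensive_indirect(f) for f in face_ids}

def render_pass(baked, face_id):
    """Pass 2 (like ptexture()): a cheap lookup replaces the computation."""
    return baked[face_id]
```

The expensive computation runs once per face at bake time; every subsequent render pays only the lookup cost, which is why the ptex-backed renders above come in at a fraction of a second.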

An example with environment mapped occlusion.

And a render. Also 0.1 seconds.


Saturday, January 23rd, 2010

  • I've been working hard on my demo reel during pretty much all of my free time lately. It's a very large undertaking, so it still feels like I'm staring at the foot of a mountain. My end goal is to render two shots with my steam engine locomotive. The first will be a very close up shot of it leaving the station, while the second will focus on integrating the locomotive into a larger environment.

    The locomotive model was purchased from Turbo Squid, but it was one solid mesh, so it took quite a while to separate each and every little piece and UV map them all. The locomotive also lacked its tender, so I modeled that part myself. Lucky for me, there's a Frisco 1527 locomotive (photo courtesy Bill Allen) less than 5 miles from my house at Spring Hill Park, and it has its trusty tender sitting idly behind it.

    I'm also going to be using some of Alvaro Luna Bautista's models from the popular lighting challenge "The Local Train," hosted by Jeremy Birn on CGTalk.

  • Technical Goals

    • Render tons of steam using either prman 15's new multi-scattering volumes, or Mantra's standard volumes with animated displacements. Maya fluids are also an option.
    • Every texture map in the scene will be baked and converted to prman 15's new PTEX format for the highest level of efficiency and quality.
    • Use non-standard lighting conditions. (low light, snow, etc.)
    • All textures will be filtered using filter regions to prevent any aliasing. New slim templates will be created to support this.
    • I would like to integrate spherical harmonic environment lighting into Houdini's compositor to help save me time when it comes to finalizing the lighting on my reel.
    • Final Renders will be in OpenEXR format @ 2k resolution.
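On the spherical-harmonic point above: the standard trick reduces an environment to nine SH coefficients per color channel, so diffuse environment lighting for any surface normal becomes a cheap polynomial evaluation, which is exactly what makes it attractive inside a compositor. A minimal sketch (the coefficient values in the test are made up; only the basis constants are the standard real-SH ones):

```python
# Order-2 real spherical harmonics: an environment reduced to 9
# coefficients per channel can be evaluated per-normal very cheaply.
# Illustrative sketch; not the actual compositor integration.

def sh_basis(x, y, z):
    """The first nine real SH basis functions at a unit direction."""
    return [
        0.282095,                        # Y_0,0  (constant)
        0.488603 * y,                    # Y_1,-1
        0.488603 * z,                    # Y_1,0
        0.488603 * x,                    # Y_1,1
        1.092548 * x * y,                # Y_2,-2
        1.092548 * y * z,                # Y_2,-1
        0.315392 * (3.0 * z * z - 1.0),  # Y_2,0
        1.092548 * x * z,                # Y_2,1
        0.546274 * (x * x - y * y),      # Y_2,2
    ]

def sh_eval(coeffs, normal):
    """Dot nine projected environment coefficients with the basis
    evaluated at the (unit) surface normal."""
    x, y, z = normal
    return sum(c * b for c, b in zip(coeffs, sh_basis(x, y, z)))
```

A uniform environment projects entirely onto the constant Y_0,0 term, so its lighting is the same for every normal, which makes a handy sanity check.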