Building Tools

Fence & Train Tracks

The past months have been very productive. I've built and cleaned up all the geometry for my shot, laid out UVs, and started texturing. I've also built a few HDAs to aid in my set layout. For example, I have a nifty fence HDA that will lay down farm fence like nobody's business. There's also a train-tracks HDA that will lay down tracks over any terrain I throw at it. There isn't much to say about either of those, as they're fairly standard HDAs with the Copy SOP at their core.

Env Light

Before I got too far into texture painting, I decided to flesh out some rendering tools. The first task was to tidy up my envlight HDA, which I originally started for this project but sidelined a while ago. My main issue was that I had used home-grown Python scripts to filter my RIB when I should have been using real RiFilters. The end result is a more reliable, better-functioning tool. Python RiFilters are also extremely simple to build.

An example Rif that replaces surface shaders.

#!/usr/bin/python
#######################################
## ShaderRif RiFilter
## Alan Warren – 1/30/2010
## replaces all instances of RiSurface
######################################

import sys
import prman

# __faceindex
INDEX = 0

class ShaderRif(prman.Rif):
    def __init__(self, ri, shader, filename, chan, maxvar, samp):
        self.shader = shader
        self.filename = filename
        self.chan = chan
        self.maxvar = float(maxvar)
        self.samp = float(samp)
        prman.Rif.__init__(self, ri)

    def Surface(self, shaderPath, args):
        # discard the incoming shader and its arguments, then emit ours
        args = {}
        args['string filename'] = (self.filename,)
        args['string displaychannels'] = (self.chan,)
        args['float samples'] = (self.samp,)
        args['float maxvariation'] = (self.maxvar,)
        args['varying float __faceindex'] = (INDEX,)
        self.m_ri.Surface(self.shader, args)

if __name__ == '__main__':
    if len(sys.argv) == 8:
        infile = sys.argv[1]
        outfile = sys.argv[2]
        shad = sys.argv[3]
        filename = sys.argv[4]
        displaychannels = sys.argv[5]
        var = sys.argv[6]
        samples = sys.argv[7]
        prman.Init(["-catrib", outfile, "-progress"])
        ri = prman.Ri()
        rif1 = ShaderRif(ri, shad, filename, displaychannels, var, samples)
        prman.RifInit([rif1])
        ri.Begin(ri.RENDER)
        prman.ParseFile(infile)
        ri.End()
    else:
        sys.stderr.write("usage: shaderrif.py infile outfile shader "
                         "filename displaychannels maxvariation samples\n")
        sys.exit(1)


PRMan 15 supports Disney's new file format, Ptex. However, no 3D paint apps out there have fully implemented the format yet. Still, there is very good reason to use .ptex in PRMan even without third-party paint apps: PRMan allows you to bake surface data into a .ptex map and then call it from a shader just as you would a pointcloud.

The good thing is that ptex maps more closely resemble their raytraced counterparts than pointcloud-based solutions do. The HDA I created bakes raytraced occlusion or arbitrary data into a ptex map that I can then render at almost real-time speed (with 16 threads).

Workflow breakdown:

  • Set up an AttribCreate SOP to define "__handleid" as a detail attribute on the geometry being baked. I also use an Attribute SOP to push this attrib into the RIB stream.
  • Set your camera to "orthographic" projection.
  • The first node, "rib1", specifies the necessary AOVs and injects culling, dicing and stitching attributes.
  • The following nodes are only used when I want standard occlusion, which is built into the HDA. The HDA writes a shader to disk, compiles it, then uses an RiFilter to inject it at the mouth of the render using PRMan's Python interface.
  • The next two Shell ROPs generate the .fed file, which contains face/edge connectivity data. Pixar provides an RiFilter that will do this for you; however, the Rif doesn't accept a location to place the .fed file, so we're left with moving it ourselves. It goes in $HFS by default, so I had to make that directory writable as well. Inside the HDA's Python module I import subprocess to run this command, which takes care of moving the file to my $JOB directory.

    cmd = 'find ' + hfs + ' -name "*.fed" -print 2> /dev/null -exec mv {} ' + storage + ' \;'
  • Finally I execute ptexmake to generate a floating point .ptex map.
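The move step in the breakdown above can be sketched as a small Python helper; `move_fed_files` is a hypothetical name, and the example paths are illustrative, but it runs the same find/mv one-liner the HDA's Python module executes via subprocess:

```python
import os
import subprocess
import tempfile

def move_fed_files(search_dir, storage):
    """Find any .fed files under search_dir and move them into storage.

    Uses shell=True so the stderr redirection and the -exec syntax
    are interpreted by the shell, as in the original one-liner.
    """
    cmd = ('find ' + search_dir + ' -name "*.fed" -print 2> /dev/null'
           ' -exec mv {} ' + storage + ' \\;')
    return subprocess.call(cmd, shell=True)

# illustrative usage: relocate .fed files written under $HFS to the job dir
# move_fed_files(os.environ['HFS'], os.environ['JOB'])
```

Once the .fed file is in place, ptexmake can be run on it as the final step.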

I've also setup the HDA to take arbitrary data and bake it into a .ptex map. This task requires help from a surface shader setup to both bake & render ptex files. There's really nothing fancy involved with this and it's covered well in the docs.

The following are a few images from these tests. You can see the ptex map contains all of my surface's illumination, specular, bump and indirect-illumination information. The last may be a bit hard to tell, since there's not much in the way of an environment in this test scene.

The red parts are from the inside of the bell (not shown).

A render of the bell from the front of my locomotive.

This render had low sampling, but it rendered in 0.1 seconds.

I also thought it might be useful to import indirect illumination from my envlight HDA and bake it out into a ptex map. My surface shader "aw_surface" already imports this data for AOVs, so very few modifications were necessary. (If you're interested: I import the data from my environment light and bake it with the bake3d() shadeop, then use the ptexture() shadeop to render the map in a second pass.) The only downside is that I'm baking environment occlusion or radiosity into an organized pointcloud first, where raytracing would have its obvious advantages. It's on my to-do list to build these various raytracing functions into my surface shader, but for now this is pretty cool, as it makes what would otherwise be a long rendering effect, like triple-bounce color bleeding, nearly real-time.

An example with environment mapped occlusion.

And a render. Also 0.1 seconds.
