Final Project: Lava / Molten Material

Project Goal

Our goal was to render a lava flow or molten steel. We wanted to correctly calculate the black-body radiative characteristics that make these surfaces emissive, as well as capture the chaotic and organic appearance of the molten material. This surface detail lends itself to being modeled with procedural methods; we also wanted a procedural method that could be parameterized over time to create interesting animations. Our original proposal can be found here.

Implementation

A true black body is an ideal diffuse radiator: it absorbs all incoming light and emits light according to its temperature. Lava and other molten metals emit an approximately black body spectrum, and the black body power distribution function at each wavelength is specified completely by the temperature of the material.

Therefore, to determine the light emitted from a point on a surface of a given temperature, we integrated the product of the black body power distribution function and the CIE XYZ response functions over the visible spectrum, giving an XYZ color representation of the emitted spectrum. [5] This XYZ color was then converted to RGB for use as the emitted light of an area light.
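
As a rough sketch of this computation (the CIE_X/Y/Z matching functions are placeholders for the tabulated CIE data our code actually loads, and the physical constants are standard SI values), the integration looks roughly like:

    // Sketch: integrate Planck's law against CIE matching functions, then
    // convert XYZ to RGB.  CIE_X/Y/Z stand in for the tabulated CIE data.
    #include <cmath>

    // Planck's law: spectral radiance at wavelength lambda (meters), temperature T (kelvin).
    double Planck(double lambdaM, double T) {
        const double h = 6.626e-34;   // Planck constant (J*s)
        const double c = 2.998e8;     // speed of light (m/s)
        const double kb = 1.381e-23;  // Boltzmann constant (J/K)
        return (2.0 * h * c * c) /
               (pow(lambdaM, 5.0) * (exp(h * c / (lambdaM * kb * T)) - 1.0));
    }

    // Placeholders for the tabulated CIE response functions (argument in nm).
    double CIE_X(double lambdaNm);
    double CIE_Y(double lambdaNm);
    double CIE_Z(double lambdaNm);

    void BlackBodyRGB(double T, double rgb[3]) {
        double X = 0, Y = 0, Z = 0;
        const double dLambda = 5.0;  // integration step in nm over the visible range
        for (double nm = 380.0; nm <= 780.0; nm += dLambda) {
            double Le = Planck(nm * 1e-9, T);  // note the nm -> m conversion
            X += Le * CIE_X(nm) * dLambda;
            Y += Le * CIE_Y(nm) * dLambda;
            Z += Le * CIE_Z(nm) * dLambda;
        }
        // Standard XYZ -> linear sRGB conversion matrix.
        rgb[0] =  3.2406 * X - 1.5372 * Y - 0.4986 * Z;
        rgb[1] = -0.9689 * X + 1.8758 * Y + 0.0415 * Z;
        rgb[2] =  0.0557 * X - 0.2040 * Y + 1.0570 * Z;
    }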

One of the goals of the project was to reproduce the unique surface detail of lava. Lava's surface has both a chaotic and an organic flavor to it, which is difficult to capture with a hand-made texture map. We therefore wanted to employ a technique that allowed us some explicit control while still imparting that organic feeling. After some investigation, procedural texturing techniques seemed able to fulfill that need. [1] There are many different types of procedural texturing, however, each family producing textures with a different visual character, and we needed a procedural basis that could at least reasonably reproduce the surface texture of lava. Further investigation made it apparent that Worley's cellular texturing functions and Perlin's and Neyret's flow noise techniques were complementary tools capable of what we needed. [2, 3, 4]

We chose these two types of procedural texturing because the surface of most molten materials has two distinct qualities. First, some areas of the surface are quite hot while other regions have cooled considerably; the hot regions should have a different visual quality than the cool regions, and cellular texturing provides just this capability. Second, the surface of most molten materials is usually not static; it has some sort of flowing motion to it. Flow noise and pseudo-advection provide time-dependent motion in the surface texture, capturing the swirl and flow of the lava. Both techniques are described in greater detail below.

Worley's cellular texturing is a technique based on the idea of forming different basis functions from feature points scattered through space. The values of these basis functions evaluated at any sample point can be used for color, normal displacement, or many other purposes. The basis functions themselves are distance metrics measured from a given sample point. For instance, the F1 basis function is the distance from the sample point to the closest feature point, F2 is the distance to the second closest feature point, and so on. As the sample point moves through space these basis functions vary continuously, but their derivatives are discontinuous wherever the sample point is equidistant from two feature points. This results in a texture composed of distinct coherent regions, as evidenced in the example image of F1, F2, and F3 below:

http://www.stanford.edu/~lthendri/cs348b/final/worley_basis.png
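
A brute-force sketch of the basis function evaluation is shown below (a real implementation, like Worley's, scatters feature points per grid cell and only searches the neighboring cells; the names here are ours, not from any library):

    // Brute-force sketch of the Worley basis functions F1..Fn.  Here we simply
    // search every feature point instead of using Worley's grid acceleration.
    #include <algorithm>
    #include <cmath>
    #include <vector>

    struct Point3 { double x, y, z; };

    // Fill fn[0..n-1] with the distances to the n closest feature points,
    // so fn[0] = F1, fn[1] = F2, ... (assumes featurePoints.size() >= n).
    void WorleyFn(const Point3 &p, const std::vector<Point3> &featurePoints,
                  int n, std::vector<double> &fn) {
        std::vector<double> d;
        d.reserve(featurePoints.size());
        for (size_t i = 0; i < featurePoints.size(); ++i) {
            double dx = p.x - featurePoints[i].x;
            double dy = p.y - featurePoints[i].y;
            double dz = p.z - featurePoints[i].z;
            d.push_back(sqrt(dx * dx + dy * dy + dz * dz));
        }
        std::partial_sort(d.begin(), d.begin() + n, d.end());
        fn.assign(d.begin(), d.begin() + n);
    }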

The technique produces much more interesting results when these basis functions are combined in different ways. For instance, using a linear combination of F1, F2, F3 and F4 produces the different textures in the left image, while using a fractal combination of a single basis function produces the different textures in the right image:

http://www.stanford.edu/~lthendri/cs348b/final/worley_linear.png http://www.stanford.edu/~lthendri/cs348b/final/worley_fractal.png
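
Building on the sketch above, the combinations could look roughly like the following (the coefficients, frequencies, and octave counts are illustrative, not the values we actually used):

    // Example combinations built on the WorleyFn sketch above.
    double LinearCombination(const std::vector<double> &fn) {
        // e.g. 2*F1 - F2 + 0.5*F3 yields a cell-like pattern
        return 2.0 * fn[0] - fn[1] + 0.5 * fn[2];
    }

    double FractalF1(const Point3 &p, const std::vector<Point3> &featurePoints,
                     int octaves) {
        // Sum scaled copies of F1 evaluated at doubling spatial frequencies.
        double sum = 0.0, freq = 1.0, amp = 1.0;
        std::vector<double> fn;
        for (int i = 0; i < octaves; ++i) {
            Point3 q = { p.x * freq, p.y * freq, p.z * freq };
            WorleyFn(q, featurePoints, 1, fn);
            sum += amp * fn[0];
            freq *= 2.0;
            amp *= 0.5;
        }
        return sum;
    }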

Thus, with this cellular texturing technique we can produce hot and cool regions of interesting visual quality on the surface of the lava.

The next procedural texturing technique we used is Perlin's and Neyret's flow noise, which is based on modifications to Perlin's classic noise function. In the traditional noise function the underlying gradients are statically defined; in this extension the gradients instead depend on time: as time progresses each gradient is continuously rotated, imparting a swirling quality to the noise. When multiple levels of this "rotated" noise are summed together, the rotation rates are made proportional to spatial frequency so that finer noise rotates faster. Swirling is not the only effect seen in flowing fluids, however. Such materials also exhibit advection, whereby enduring features are stretched but new features are not. This effect is modeled through a "structure memory", defined so that passive structures are completely advected while active structures remain unstretched. Perlin and Neyret actually use lava specifically as an illustration of this technique:

http://www.stanford.edu/~lthendri/cs348b/final/lava_flow.png http://www.stanford.edu/~lthendri/cs348b/final/lava_advect.png
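
The simplified 2D sketch below illustrates the rotating-gradient idea; it omits the pseudo-advection / structure-memory term and all of the refinements of Perlin and Neyret's formulation (the hash and constants are placeholders):

    // Simplified 2D flow-noise sketch: lattice gradients rotate with time, and
    // higher-frequency octaves rotate faster.  The pseudo-advection term is
    // omitted, and the hash and constants are placeholders.
    #include <cmath>

    static double Fade(double t) { return t * t * t * (t * (t * 6 - 15) + 10); }
    static double Lerp(double t, double a, double b) { return a + t * (b - a); }

    // Pseudo-random base angle for lattice point (ix, iy); any stable hash works.
    static double BaseAngle(int ix, int iy) {
        unsigned h = (unsigned)ix * 73856093u ^ (unsigned)iy * 19349663u;
        h = (h ^ (h >> 13)) * 1274126177u;
        return (h % 62832u) * 1e-4;  // angle in [0, 2*pi)
    }

    static double RotatedGradNoise(double x, double y, double time, double spin) {
        int ix = (int)floor(x), iy = (int)floor(y);
        double fx = x - ix, fy = y - iy;
        double corner[2][2];
        for (int dy = 0; dy <= 1; ++dy)
            for (int dx = 0; dx <= 1; ++dx) {
                // Gradient direction = base angle rotated continuously over time.
                double a = BaseAngle(ix + dx, iy + dy) + spin * time;
                corner[dy][dx] = cos(a) * (fx - dx) + sin(a) * (fy - dy);
            }
        double u = Fade(fx), v = Fade(fy);
        return Lerp(v, Lerp(u, corner[0][0], corner[0][1]),
                       Lerp(u, corner[1][0], corner[1][1]));
    }

    // Sum octaves, with the rotation rate proportional to spatial frequency so
    // that the finer noise swirls faster.
    double FlowNoise(double x, double y, double time, int octaves) {
        double sum = 0.0, freq = 1.0, amp = 1.0;
        for (int i = 0; i < octaves; ++i) {
            sum += amp * RotatedGradNoise(x * freq, y * freq, time, /*spin=*/freq);
            freq *= 2.0;
            amp *= 0.5;
        }
        return sum;
    }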


With these two techniques in place we needed to decide how to combine them in order to use both of their qualities at the same time. After some experimentation we settled on a simple linear blend, which seemed to give the best results while still allowing us control over which method dominated. Lastly, we needed some way to decide on all the parameters to these two methods such that a texture resembling lava was produced. Since there were quite a few parameters, and rendering each different combination through pbrt was infeasible due to time constraints (especially since we had planned on rendering more than one scene, each with its own flavor), we built a standalone application that used the same code base but allowed us to tweak the pattern in real time.

http://www.stanford.edu/~lthendri/cs348b/final/tweaker.png
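
The blend itself is just a weighted sum of the two pattern values, with the weight exposed as one of the tweakable parameters (names here are illustrative):

    // Linear blend of the two procedural patterns; blendWeight was one of the
    // parameters exposed in the tweaking application.
    double LavaPattern(double worleyValue, double flowNoiseValue, double blendWeight) {
        return (1.0 - blendWeight) * worleyValue + blendWeight * flowNoiseValue;
    }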

The value given by the procedural texture is then fed to our black body radiation routines, which use it to calculate the temperature (and thus the emitted power) at any point on the surface. However, by design there are some "cool" regions on the surface where no light is emitted, and it would not do to have these areas simply shaded black. We noted that in the reference images, as the lava cools, the cold regions become reflective rather than emissive, and that this reflection has both specular and diffuse components and looks quite rough at a macro level. To model this we "ramp" from emission to reflection, so that the coolest regions of the surface are shaded entirely by a Blinn microfacet distribution with a fractional Brownian motion bump map. Together these two techniques give the cool regions a rocky appearance.
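
A sketch of this ramp is shown below; the thresholds are placeholders for the tuned values, and the reflective BRDF setup (Blinn microfacet plus fBm bump map) is handled by the material and omitted here:

    // Sketch of the ramp from emission to reflection.  The pattern value t is
    // assumed to be normalized to [0,1]; thresholds are placeholder values.
    void ShadeLavaPoint(double t, double Tmin, double Tmax,
                        double emittedRGB[3], double *reflectionWeight) {
        double T = Tmin + t * (Tmax - Tmin);  // map pattern value to temperature
        BlackBodyRGB(T, emittedRGB);          // emission, from the earlier sketch
        // Below coolThreshold the point is shaded entirely by the rocky
        // reflective BRDF (not shown); above hotThreshold it is purely
        // emissive; in between the two are linearly mixed.
        const double coolThreshold = 0.3, hotThreshold = 0.6;
        if (t <= coolThreshold)
            *reflectionWeight = 1.0;
        else if (t >= hotThreshold)
            *reflectionWeight = 0.0;
        else
            *reflectionWeight = 1.0 - (t - coolThreshold) / (hotThreshold - coolThreshold);
    }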

Our lava light was based on the AreaLight implementation in pbrt. The default AreaLight assumes that the light source emits light uniformly across its area, so it samples the shape uniformly by area, choosing larger polygons more often in proportion to their areas. Because our light source does not emit light uniformly by area, we wanted to sample according to a different distribution that favors triangles which emit more power (even if they are small).

To compute a sampling distribution, we extended the ShapeSet to compute a Monte Carlo estimate of the power emitted per triangle. We then built a CDF over these per-triangle powers and changed the Sample method to choose a triangle according to this CDF of emitted power rather than according to triangle area, sampling uniformly across the selected triangle to choose the final sample point.
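
A sketch of the power-proportional triangle selection (simplified from the actual ShapeSet changes; the per-triangle power estimates are assumed to have been computed already):

    // Sketch of choosing a triangle in proportion to its estimated emitted
    // power instead of its area.  trianglePower holds the per-triangle
    // Monte Carlo estimates described above.
    #include <vector>

    struct PowerSampler {
        std::vector<double> cdf;   // cumulative distribution over triangles
        double totalPower;

        void Build(const std::vector<double> &trianglePower) {
            totalPower = 0.0;
            cdf.resize(trianglePower.size());
            for (size_t i = 0; i < trianglePower.size(); ++i) {
                totalPower += trianglePower[i];
                cdf[i] = totalPower;
            }
            for (size_t i = 0; i < cdf.size(); ++i)
                cdf[i] /= totalPower;
        }

        // Map a uniform random number u in [0,1) to a triangle index; the final
        // sample point is then chosen uniformly within that triangle.
        int SampleTriangle(double u) const {
            int lo = 0, hi = (int)cdf.size() - 1;
            while (lo < hi) {
                int mid = (lo + hi) / 2;
                if (cdf[mid] < u) lo = mid + 1;
                else hi = mid;
            }
            return lo;
        }
    };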

In the standard ShapeSet, the base PDF is 1 / (total area), because samples are drawn uniformly by area. This area density is converted to a solid angle density using the dw/dA relationship. To return a proper PDF for our samples, we instead wanted to start with a base PDF of 1 / (total power), because we sampled uniformly by power.

We need to convert that to an area distribution in order to continue using the dw/dA relationship to change variables to a solid angle distribution. The quantity dPower/dA is constant over a particular triangle and equals (power of the triangle) / (area of the triangle). So, given a particular triangle, we compute dPower/dA, multiply by 1 / (total power) to get a density per unit area, and then use the dw/dA relationship to convert to a solid angle density.
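
Putting that together, the PDF for a sample looks roughly like this (a sketch; the variable names are ours):

    // Sketch of the per-sample PDF: start from the power density on the sampled
    // triangle, divide by the total power to get a density per unit area, then
    // convert to a density per unit solid angle.
    double PowerPdf(double trianglePower, double triangleArea, double totalPower,
                    double distSquared, double absCosTheta) {
        double dPowerdA = trianglePower / triangleArea;  // constant per triangle
        double pdfArea  = dPowerdA / totalPower;         // probability per unit area
        return pdfArea * distSquared / absCosTheta;      // probability per solid angle
    }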

Here are some results in a small test scene. On the left is the initial image, sampling the light uniformly by area. On the right is the importance sampled image, which does exhibit less variance.

sample_by_area (2 samples/pixel, 32 light samples)

sample_by_power (2 samples/pixel, 32 light samples)

http://www.stanford.edu/~lthendri/cs348b/final/sample_by_area.png

http://www.stanford.edu/~lthendri/cs348b/final/sample_by_power.png

Final Images

http://www.stanford.edu/~lthendri/cs348b/final/mountain_inspire.png

http://www.stanford.edu/~lthendri/cs348b/final/mountain.png

http://www.stanford.edu/~lthendri/cs348b/final/pour_inspire.png

http://www.stanford.edu/~lthendri/cs348b/final/pour.png

http://www.stanford.edu/~lthendri/cs348b/final/blobs_inspire.png

http://www.stanford.edu/~lthendri/cs348b/final/blobs.png

http://www.stanford.edu/~lthendri/cs348b/final/lava.avi

Challenges / Interesting Investigations

Our initial implementation of the black body code had a bug which made the emitted power far too low for what pbrt was expecting. This led to a wide-ranging investigation into the techniques used for color representation of black body radiation.

First, there are many black body radiation color-temperature gradients on the web. Here is an example:
http://www.stanford.edu/~lthendri/cs348b/final/blackbodyglow.png

However, these gradients are created by "self-normalizing" the color returned by the black body spectrum integration: each of the XYZ values is divided by their sum for a particular color. This represents the chromaticity nicely, but it does not represent the intensity of the color at all. The emitted power of a black body increases as temperature^4, so higher temperatures are orders of magnitude brighter than lower temperatures, and the color gradients do not reflect this. For instance, 1500 K should be bright yellow, but the color gradients show it as orange.

The paper "Physically Based Modeling and Animation of Fire" by Nguyen, Fedkiw, Jensen uses the black body radiation method as well, and then uses a von Kries transform to approximate chromatic adaptation. [6] They assume that the fire in their scene is the primary illuminant, and use the von Kries transform to scale all colors to the color of the maximum temperature. This has the effect of saturating the color of the maximum temperature to white, which might be an okay idea for fire; but it does not apply well to lava or molten metal. For instance, we wanted to be able to represent metal at 800 kelvin, which should be a dark red. If this was the only illuminant in the scene, adapting to it would force it to white, which is not exactly correct.

The paper "Extending the Photon Mapping Method for Realistic Rendering of Hot Gaseous Fluids" by Kang, Ihm, and Bajaj [7], does two interesting tricks with the black body spectrum:

  1. If they are looking for a particular color, they compute the temperature at which the black body spectrum peaks at a wavelength near that color, and then run their simulation at that temperature.
  2. Additionally, they allow artists to specify the range of wavelengths over which to evaluate the black body radiation, which biases the resulting color.

We chose not to apply either of these techniques because we wanted to base our results off a reasonable temperature for the materials we were looking at, instead of choosing a color and matching it.

As it turned out, our black body code just had one of the constants off by a scale factor (m^2 instead of nm^2), which threw off the whole computation when it was divided by wavelength^5. Going over the code in detail uncovered this error, and fixing it put the resulting Le in the right range and color.

The challenge here was finding, given the sheer number of parameters to tweak, a parameter set that captured the surface detail of the particular scene we were trying to reproduce. The tweaking application helped, but it only went so far, as the same set of parameters often produced a decidedly different image when run through pbrt. In the end this proved a huge time sink, as quite a bit of brute-force tweaking had to be done.

The second challenge was that the black body routines expected values from the texture in the range [0,1]. Since the computed texture values can be arbitrary, we run a sampling pre-process to derive coefficients that normalize the values into the desired range. Though this sampling process takes some time at start-up, it is completely dwarfed by the actual rendering time. The sampling routine is currently a bit naive and brute-force in its approach; it could most likely be made more elegant, but for our purposes it worked well.
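
A sketch of this pre-process, assuming the FlowNoise sketch above as the pattern being normalized and a simple stand-in RNG (the sample count and domain are arbitrary here):

    // Sketch of the normalization pre-process: evaluate the pattern at many
    // points, track the min/max, and derive a scale and offset mapping the
    // observed range to [0,1].
    #include <algorithm>
    #include <cstdlib>

    void ComputeNormalization(int numSamples, double *scale, double *offset) {
        double minV = 1e30, maxV = -1e30;
        for (int i = 0; i < numSamples; ++i) {
            double x = (double)rand() / RAND_MAX;
            double y = (double)rand() / RAND_MAX;
            double v = FlowNoise(x * 8.0, y * 8.0, /*time=*/0.0, /*octaves=*/4);
            minV = std::min(minV, v);
            maxV = std::max(maxV, v);
        }
        *offset = -minV;
        *scale  = 1.0 / (maxV - minV);
        // normalized value = (v + offset) * scale, clamped to [0,1] at render time
    }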

For a long time during development we would get a number of very bright pixels. We noticed that this problem was especially bad at the intersection of a light shape with a non-light shape, although sometimes the bright pixels appeared on the surface of the light shape itself. It turned out that the distance between the shading point and the light-emitting point was so small that the PDF returned by the light was very close to zero (but not quite zero), causing the returned radiance to be extremely high. We worked around this by changing the PDF computation to ignore samples where the squared distance was too near zero and would cause these undesirable sparkles.

http://www.stanford.edu/~lthendri/cs348b/final/blobs_sparkles.png
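
The workaround amounts to an extra guard around the PDF computation sketched earlier (the epsilon is illustrative):

    // Guard against near-coincident shading and light points, which otherwise
    // drive the PDF toward zero and blow up the returned radiance.  A zero PDF
    // causes the sample to be skipped.
    double GuardedPowerPdf(double trianglePower, double triangleArea, double totalPower,
                           double distSquared, double absCosTheta) {
        const double kMinDistSquared = 1e-4;
        if (distSquared < kMinDistSquared)
            return 0.0;
        return PowerPdf(trianglePower, triangleArea, totalPower,
                        distSquared, absCosTheta);
    }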

Trying to come up with a high-quality scene that showcased our work took quite a lot of time. Using the inspiration images as references, we worked in Maya and 3ds Max to model each scene from scratch. Technologies such as NURBS surfaces and metaballs, among others, allowed us to capture some of the organic appearance of lava. Converting these scenes from their native format to pbrt files was also a challenge in and of itself.

http://www.stanford.edu/~lthendri/cs348b/final/wireframe.png

Future Work

One problem we had with procedural texturing is that it only allows control at a micro level, whereas we also desired control at a macro level. That is to say, it is difficult, if not impossible, to ask for a texture that is cool over a large area in a particular region, which is something we needed for one of our scenes. We overcame this limitation during modeling: each distinct region was modeled as a separate shape so that we could define the overall temperature range for that specific shape. In the end this did not add to the overall polygonal complexity of the scene; it just meant more work in parameterizing each shape.

References

[1] David S. Ebert, F. Kenton Musgrave, Darwyn Peachey, Ken Perlin, and Steven Worley. Texturing and Modeling: A Procedural Approach, Third Edition. Morgan Kaufmann Publishers, 2003.
[2] Steven Worley. A Cellular Texture Basis Function. In Proceedings of SIGGRAPH '96, Computer Graphics Proceedings, Annual Conference Series, New Orleans, Louisiana, 1996, pp. 291-294.
[3] Ken Perlin and Fabrice Neyret. Flow Noise. SIGGRAPH Technical Sketches and Applications, August 2001, p. 187.
[4] Fabrice Neyret. Advected Textures. Proceedings of the 2003 ACM SIGGRAPH/Eurographics Symposium on Computer Animation, San Diego, California, July 2003.
[5] J. Robert Mahan. Radiation Heat Transfer: A Statistical Approach. Wiley-Interscience, 2002.
[6] D. Nguyen, R. Fedkiw, and H. Jensen. Physically Based Modeling and Animation of Fire. SIGGRAPH 2002, ACM Transactions on Graphics 21(3), pp. 721-728, 2002.
[7] Kang, Ihm, and Bajaj. Extending the Photon Mapping Method for Realistic Rendering of Hot Gaseous Fluids. Natural Phenomena and Special Effects, Computer Animation and Virtual Worlds 16(3-4), July 2005.
