Assignment 3 - Photon Mapping

Handed out: Tuesday May 13

Due: Thursday May 22


Photon mapping is a biased Monte Carlo technique for light transport. Light is transported along a large number of paths from the light sources in the scene, and a data structure is built to record the distribution of illumination (the photon map).  As the scene is being rendered, the photon map can be used to estimate the amount of illumination at any particular point in the scene.


1. Read the paper "Global Illumination Using Photon Maps" by Henrik Wann Jensen.  You may find Henrik's book "Realistic Image Synthesis Using Photon Mapping" a useful reference, though it is not necessary for this assignment.


2. Review Section 15.7 of Physically Based Image Synthesis, which has a basic implementation of photon mapping.  The current implementation only records light that has been focused by reflection or refraction from perfectly specular objects; this is used for rendering caustics.  In this assignment, you will be extending this implementation to include all modes of indirect light transport.  You will also want to read the section in Appendix A that describes the kd-tree data structure implementation used for photon mapping.


3. There is an error in the BSDF::sample_f() function in lrt/core/.  Remove line 432, "*wt = 0.f;", and recompile your lrt executable.  Fixing this bug is critical for this assignment.


4. Make a copy of the photon mapping integrator in the integrators/ directory, name it myphotonmap, and edit the makefile so that myphotonmap is also compiled when you build lrt.


5. Render the caustic.rib example scene with your integrator.  Ensure that caustics appear in the rendered image.  Render the cornell.rib example scene; note that there is no indirect illumination in the image.


6. Extend myphotonmap so that it computes and stores two photon maps.  The caustic map should be just like the one currently computed: it stores photons that have left the light, hit one or more specular surfaces, and then hit a non-specular surface.  The indirect map should hold all other photons: those that hit one or more non-specular surfaces, possibly after hitting specular surfaces.  Add an integer parameter named "nindirectphotons" to CreateSurfaceIntegrator() to record the number of indirect photons to be stored.  Keep shooting photons until enough of each type have been created.


TIP: When sampling reflected directions for photon paths, use the BSDF::sample_f() method instead of BSDF::f_delta(), the function used in the current implementation, which only samples specular directions.  BSDF::sample_f() returns a bool value through one of its parameters that tells you whether the sampled direction came from a specular component of the BSDF; this will be useful.  It also returns the probability density for the direction it sampled, which will also be useful.  (Note that if it samples a direction from one of the BSDF's specular components, the probability density value it returns will be zero and should be ignored; be careful not to divide by zero.)


7. Use the photon maps at rendering time.  First, write a version of the integrator that doesn't trace any shadow rays at all, but only uses the photons to approximate all illumination at the point being shaded.   The resulting image will be blotchy, but should be a reasonable approximation to the lighting in the scene.


8. Extend the integrator to support a "final gather" option.  Add an integer parameter in the CreateSurfaceIntegrator() function that takes a "finalgather" value to enable this (otherwise just use the maps directly as in part 7).  When this is enabled, continue to use the caustic map for caustics at the point shaded, but sample the light sources and trace shadow rays for direct illumination (as the current implementation does), and sample the BSDF at the shading point to trace rays out into the scene.  At each point intersected by a final gather ray, use the photon maps to estimate the illumination (don't continue following further bounces).


TIP: The LatinHypercube() function from core/sampling.h can give you reasonably well-distributed sample positions in [0,1]^2 to pass to the BSDF::sample_f() method when importance sampling final gather rays.


Extra Credit: You can get extra credit on this assignment by improving the efficiency of the photon map implementation.  There are two main ways to go about this.  The first is improving the algorithmic efficiency, e.g. by improving the sampling algorithms used.  Note that any algorithmic improvements must be robust across all sorts of scenes, not just the specific example scenes we have handed out.  The other option is implementation efficiency: improving the runtime speed of the basic algorithms in your integrator and in kdtree.h.  (We won't give extra credit on this assignment for speeding up photon mapping by improving the speed of ray-object intersections!)




1. What are the arguments to the sample_f() function?  What do they mean?


Spectrum sample_f(const Vector &wi, Vector *wo, Float u1, Float u2,
                Float *wt, bool *specularBounce) const;
wi = incident direction (world space)
wo = output variable for the sampled outgoing direction (world space)
u1, u2 = uniform random numbers in [0,1] (e.g. from RandomFloat())
*wt = value of the probability density function for the direction sampled
*specularBounce = set to true if the direction was sampled from a specular
component of the BSDF
The return value is the value of the BSDF for the (wi, *wo) pair of directions.

So, for example, to compute reflected radiance given a function Li(x, w) that returns the radiance along w from the point x (i.e., to evaluate the reflection equation), you could write code like:

Spectrum L = 0.;
for (int i = 0; i < nsamples; ++i) {
    Spectrum fr = sample_f(wi, &wo, RandomFloat(), RandomFloat(),
                           &pdf, &wasSpecular);
    Float costheta = Dot(wo, N);  // N is the surface normal
    if (pdf > 0.)                 // see question 3 below for specular samples
        L += fr * Li(x, wo) * costheta / pdf;
}
L /= nsamples;


2. Wait a minute—how come sample_f() takes a wi direction and returns a *wo direction?  Shouldn’t it take the outgoing direction and return the incident direction?


Yes, you’re right.  The function was written with those two swapped.  But since BRDFs are reciprocal, this doesn’t matter (other than the fact that it causes confusion.)


3. What do we do with the returned wt value from sample_f() for rays where *specularBounce is true?


(This is confusing due to a deficiency in that sample_f() function’s interface—sorry about that!)  You should ignore that value and instead multiply the BRDF value by bsdf->NumComponents().


4. How do we decide how many final gather rays to trace?


Add an integer parameter “finalgathersamples” to CreateSurfaceIntegrator() that sets the number of final gather rays to trace.


5. I don’t understand how we decide when and where to store photons.


There are three key rules to keep in mind:

1. Store photons only at non-specular surfaces.  At a perfectly specular surface, just reflect or refract the photon onward; a photon stored there could never be usefully looked up, since the BSDF is a delta function.
2. A photon belongs in the caustic map only when every surface it hit before the current one was specular, and it hit at least one of them (a light -> specular -> ... -> non-specular path).
3. Every other photon landing on a non-specular surface -- including a photon at its very first intersection -- goes in the indirect map.

6. Wait a minute—are you saying that I should store photons at their first intersection in the indirect map?  But won’t I be double-counting direct light?


Yes, you should store those first intersections in the (misnamed) indirect map.  (Assuming that they hit a surface that isn’t completely specular.)  Then, when you use photons for estimating illumination, there are two scenarios:

1. No final gathering (part 7): you trace no shadow rays at all, so the photons in the indirect map are your only representation of direct light; the first-intersection photons supply it rather than double-counting it.
2. Final gathering (part 8): direct illumination at the point being shaded comes from sampling the lights and tracing shadow rays, but at the points hit by final gather rays you use the photon maps alone -- there, the first-intersection photons again stand in for the direct light you aren’t otherwise computing.

You should be able to convince yourself that either approach neither ignores nor double-counts illumination.


7. I think I did everything correctly, but my pictures are coming out wrong!


Here are a few things to look into:

- Did you remove the "*wt = 0.f;" line from BSDF::sample_f() (step 3) and recompile?  Nothing will work correctly without that fix.
- Are you dividing by the pdf returned from sample_f() when the sampled direction came from a specular component?  That pdf is zero; see question 3 above.
- Are you reading the shading normal from dgShading.Nn before calling GetBSDF()?  See question 9 below.
- Are you storing photons at perfectly specular surfaces, or double-counting direct light between your shadow rays and the first-intersection photons in the indirect map?



8. Why might we use the LatinHypercube() function to generate random numbers for the final gather?  How come we shouldn’t just call RandomFloat()?


Either way you’ll get the right result in the limit.  It turns out that the sample points from LatinHypercube() are distributed in ways that lead to lower variance in the final images (try both ways yourself and compare!).  We’ll discuss this in more detail in a week or so.


9. How come when I try to get the surface normal from dgShading.Nn in the Intersection object, I always get a value of (0,0,0)? The normal in dgGeom.Nn seems to be correct.


dgShading isn't initialized until after you call GetBSDF() from the Intersection object. This is admittedly really unintuitive.




Turn in your finished homework using the cs348b submit script:


cd my-lrt-directory/integrators



Make sure you hand in your integrator source and a README file. The README should contain instructions for compiling and running your code, and a description of anything non-intuitive you've done. Please do not submit any executables or object (.o) files. We'll be building your submissions ourselves, and those just take up space.




This homework, like the other programming assignments, will be graded according to the following system:


0 - Little or no work was done
* - Significant effort was put into the assignment, but not everything works correctly.
** - The algorithm you implemented was functionally complete; it passed all of our tests.
*** - Your implementation was substantially faster or more flexible than a straightforward implementation.