Final Project: Citrus Marmalade
Sonny Chan and Phaedon Sinis
A Sunday walk through the nearby California Ave market in Palo Alto is a dazzling sensory experience. Taste samples of fresh sorbet and savory Afghani spreads are accompanied by the aroma of freshly-baked naan bread and the sounds of children dancing to street musicians. Across the market, the visual display of produce and prepared foods is carefully crafted to make the products as appetizing as possible.
This season, new offerings of citrus marmalade at the Happy Girl Kitchen stand caught our attention: Meyer Lemon ginger, pink grapefruit, blood orange and rosemary. For a true fan of citrus, only a quick taste is needed to close the sale, but the visual composition is also stunning. The tall, thin glass jar allows sunlight to permeate the entire jar, causing the marmalade to glow. The marmalade itself is densely packed with carefully sliced arcs of orange peel. And tiny air bubbles, suspended throughout the gel, add extra texture to the scene.
The reference image was taken indoors, in late-afternoon direct sunlight using a Nikon 50mm lens.
Algorithms and Techniques
Rendering the exquisite visual effects of the marmalade in sunlight with a physically-based approach is a terrific challenge, one that requires a full understanding of absorption and scattering in the gel volume as well as significant extensions to PBRT.
The core algorithm we used for rendering the marmalade was volumetric photon mapping, which constituted the primary extension to PBRT for this project. We began by reading Jensen's book, "Realistic Image Synthesis Using Photon Mapping" and reviewed more recent literature on radiance estimation in volumetric photon mapping.
Before coding, we analyzed the finite state machine for photon mapping (PBRT text, page 809). Because the elements of our scene are lit primarily by indirect illumination, the existing photon mapping integrator needed to keep working perfectly as we added support for volumetric photon mapping. Maintaining the existing photon states is essential for properly rendering the non-volumetric parts of our scene, and doing so was one of the most challenging aspects of this project. The following diagram shows how, after careful analysis, we decided to extend PBRT's photon mapping integrator:
Photon shooting and gathering through the volume is standard and closely follows Jensen's text. We encountered many subtleties in rescaling the spectral components of photons and in the proper use of Russian roulette as a means of attenuating the total light transmission; the interactive debugging techniques described below were essential for visualizing and reasoning about these issues.
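As a rough illustration of the Russian roulette step, a sketch might look like the following (the three-channel struct and function names are stand-ins of ours, not PBRT's actual Spectrum class):

```cpp
#include <cassert>

// Minimal three-channel spectrum standing in for PBRT's Spectrum class.
struct Spectrum {
    float c[3];
    float Average() const { return (c[0] + c[1] + c[2]) / 3.f; }
};

// Russian roulette at a scattering event: the photon survives with
// probability q (the average albedo), and a surviving photon's power is
// rescaled per channel so the expected transmitted energy is unbiased.
// `u` is a uniform random number in [0, 1).
bool RouletteScatter(Spectrum &power, const Spectrum &albedo, float u) {
    float q = albedo.Average();
    if (q <= 0.f || u > q) return false;   // photon absorbed
    for (int i = 0; i < 3; ++i)
        power.c[i] *= albedo.c[i] / q;     // unbiased per-channel rescale
    return true;
}
```

With a gray albedo of 0.5, for example, half the photons survive and each keeps its original power, so the expected transmitted power is halved -- matching the attenuation one would get by scaling every photon by the albedo.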
We implemented two different photon density estimation techniques for computing the radiance of a ray through our marmalade volume. The first was a more traditional 3D kernel (an Epanechnikov kernel, since we could neither find nor derive a 3D biweight version) evaluated within a small radius as the ray was marched through the volume to integrate irradiance. The second was a beam radiance estimate, where the photon density is estimated over a cylinder of small radius that encapsulates the ray (Jarosz et al. 2008). We ended up using the Epanechnikov kernel estimator for our final render because the beam estimate gave us blurrier results and did not yield as much of a performance improvement as observed in Jarosz's scenes. Perhaps this is because we only implemented the fixed-radius beam estimate (the easier version); upgrading to the variable kernel method they describe would probably have given us similar results with much faster rendering times.
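A minimal sketch of the 3D Epanechnikov kernel estimate follows; the Photon struct, the single power channel, and the brute-force loop (in place of a kd-tree query) are simplifications for illustration, not PBRT code:

```cpp
#include <cmath>
#include <vector>

const float kPi = 3.14159265358979f;

struct Photon { float x, y, z; float power; };  // simplified, one channel

// Normalized 3D Epanechnikov kernel with bandwidth h:
// K(r) = 15/(8*pi*h^3) * (1 - (r/h)^2) for r <= h, zero otherwise.
// It integrates to 1 over the radius-h ball.
float Epanechnikov3D(float r, float h) {
    if (r > h) return 0.f;
    float t = r / h;
    return 15.f / (8.f * kPi * h * h * h) * (1.f - t * t);
}

// Kernel-weighted photon density estimate at a point; in the renderer
// this is evaluated at each ray-march sample to accumulate in-scattered
// radiance along the ray.
float DensityEstimate(const std::vector<Photon> &photons,
                      float px, float py, float pz, float h) {
    float sum = 0.f;
    for (const Photon &p : photons) {
        float dx = p.x - px, dy = p.y - py, dz = p.z - pz;
        sum += p.power * Epanechnikov3D(std::sqrt(dx*dx + dy*dy + dz*dz), h);
    }
    return sum;
}
```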
After volumetric photon mapping was fully implemented and debugged for a homogeneous volume, we extended the implementation to render heterogeneous gels. Our model is simple: a heterogeneous gel is a combination of two homogeneous gels, with a 3D density texture determining the blending ratio of gel1 and gel2. This allows each gel to have its own scattering, absorption, and phase coefficients. In the case of orange marmalade, we modeled gel1 as a low-albedo yellow volume and gel2 as a high-albedo orange volume with high absorption of green and blue wavelengths, creating a deep red glow in the inner regions of the volume. Ray marching is an effective way to integrate over a heterogeneous texture, measuring the differential transmission coefficient at each step along the ray. The procedural texture itself is a density map computed with minor modifications to PBRT's fractional Brownian motion noise function.
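The two-gel blend and ray-marched transmittance can be sketched as follows; scalar coefficients and a 1D density callback stand in for the spectral coefficients and the 3D FBM texture:

```cpp
#include <cmath>

struct Gel { float sigma_s, sigma_a; };  // spectral in the real renderer

// Extinction coefficient of the blended medium: a linear mix of the two
// gels' sigma_t by the density texture value d in [0, 1].
float BlendedSigmaT(const Gel &g1, const Gel &g2, float d) {
    float s1 = g1.sigma_s + g1.sigma_a;
    float s2 = g2.sigma_s + g2.sigma_a;
    return (1.f - d) * s1 + d * s2;
}

// Transmittance over a segment of length `len` by ray marching:
// Tr = exp(-sum_i sigma_t(x_i) * dt), sampled at step midpoints.
float Transmittance(const Gel &g1, const Gel &g2,
                    float (*density)(float), float len, int steps) {
    float dt = len / steps, tau = 0.f;
    for (int i = 0; i < steps; ++i)
        tau += BlendedSigmaT(g1, g2, density((i + 0.5f) * dt)) * dt;
    return std::exp(-tau);
}

// Example density: pure gel1 everywhere.
float Gel1Only(float) { return 0.f; }
```

The step size must be small relative to the highest frequency in the density texture, or the march will alias across density variations.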
A first pass at estimating these coefficients was made by photographing the interaction of cross-sections of jam with white light and with red, green, and blue lasers. Color matching was effective, and the coefficients were further tweaked as we adjusted scene composition and lighting. Details are discussed below under the heading "Auxiliary Test Scenes".
Suspended Articles (Peel & Bubbles)
We picked a fresh orange from the backyard, then carefully sliced it and photographed it using a macro lens. The photos provided the basis for a beautiful texture that we used for the bits of orange rind and peel sprinkled throughout the marmalade gel (with appropriate translucency in the alpha channel). The photograph is shown on the left, and the RGB and alpha images of our texture are on the right.
A randomized procedural technique parameterized on angle and slice was used to generate the bits of orange so that no two pieces would look identical. We wanted to space the orange pieces roughly evenly through the marmalade, and ended up using the Delaunay refinement algorithm in the Computational Geometry Algorithms Library (CGAL): we tetrahedralized a cylinder the size of the inside of our jar, and took the positions of the suspended articles to be the vertices of the resulting Delaunay tetrahedra. Positions for the air bubbles within the marmalade were generated in the same way.
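The CGAL refinement call itself is too long to sketch here, but a hedged stand-in illustrates the goal -- roughly even spacing inside a jar-sized cylinder -- via dart throwing with a minimum-distance rejection test (all dimensions and names below are made up for illustration):

```cpp
#include <cmath>
#include <random>
#include <vector>

struct Vec3 { float x, y, z; };

// Dart throwing: accept a candidate only if it keeps a minimum distance
// to every previously accepted point, yielding roughly even spacing --
// the same qualitative result as the Delaunay refinement vertices.
std::vector<Vec3> ScatterInCylinder(float radius, float height,
                                    float minDist, int target,
                                    unsigned seed) {
    std::mt19937 rng(seed);
    std::uniform_real_distribution<float> uxy(-radius, radius);
    std::uniform_real_distribution<float> uz(0.f, height);
    std::vector<Vec3> pts;
    for (int tries = 0; (int)pts.size() < target && tries < 100000; ++tries) {
        Vec3 p{uxy(rng), uxy(rng), uz(rng)};
        if (p.x * p.x + p.y * p.y > radius * radius) continue;  // outside disk
        bool ok = true;
        for (const Vec3 &q : pts) {
            float dx = p.x - q.x, dy = p.y - q.y, dz = p.z - q.z;
            if (dx * dx + dy * dy + dz * dz < minDist * minDist) {
                ok = false;
                break;
            }
        }
        if (ok) pts.push_back(p);
    }
    return pts;
}
```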
A sunbeam entering through a moderate-sized window provided the primary light for our scene. We were able to model an approximation of the actual room on Bryant St. where the photos were taken. With surface photon mapping, this geometry provided us a means to capture the indirect lighting of our scene due to the elements (walls, floor, etc.) of the room itself. However, we were unable to model the geometry of anything outside the room, and hence could not use photon mapping to capture the indirect illumination coming from outside the window. Instead, we decided to pursue an environment light for this purpose.
We photographed a 2-inch chrome ball bearing with a 300mm telephoto lens on Bryant St. to create a 360-degree spherical environment map. This is the actual scene outside of Phaedon's apartment, where the indoor reference photo was taken. Sampling this environment map would provide an estimate of the indirect illumination coming from the sky, tree leaves, houses, etc. outside the window.
We wrote an OpenGL program that we could use to test our environment map captures in real-time. The same program houses the code to procedurally generate the orange slices, and allowed us to examine them close-up within the jar for maximal quality control before inserting everything into our PBRT scene.
The infinite area light that exists within PBRT was not particularly suited for use with photon mapping on our scene, since the only light that enters our room from the environment goes through that single small window. Sampling the light source to shoot photons just generates a random ray toward the bounding sphere of our room; most of the time the photon gets stuck to the outside walls, and its illumination is wasted. Worse yet, some photons get stuck to the outside corners, which, with the PBRT photon mapping integrator, results in light leaking into the room -- a very visually jarring effect!
We solved this problem by extending PBRT with a new type of area light: the "window light". This light works more or less like a regular area light, except when sampled, it draws a radiance estimate from an environment map image using the direction vector of the sample. This extension allowed us to achieve 100% efficiency with photon mapping -- go green!
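The direction-to-texel lookup behind the window light can be sketched like this; the equirectangular mapping and z-up convention are our assumptions for illustration, not PBRT's latitude-longitude code verbatim:

```cpp
#include <cmath>

const float kPi = 3.14159265358979f;

struct Vec3 { float x, y, z; };

// Map a normalized direction to equirectangular (u, v) in [0, 1], z up.
// A window-light sample takes the direction from the shading point
// through the sampled point on the window rectangle and returns the
// environment map radiance at (u, v) instead of a constant emission.
void DirToEquirect(const Vec3 &d, float &u, float &v) {
    float phi = std::atan2(d.y, d.x);   // azimuth in [-pi, pi]
    float theta = std::acos(d.z);       // polar angle in [0, pi]
    u = (phi + kPi) / (2.f * kPi);
    v = theta / kPi;
}
```

Because every shot photon passes through the window geometry, none are wasted on the room's exterior, which is where the efficiency gain over the infinite area light comes from.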
The two images below show our scene lit with just the sun (spotlight), and with just the window light. Notice that a combination of the two lighting modes produces the most natural illumination for our final scene.
Other scene elements
The room was initially modeled in Blender so that we could position the jar of marmalade, the camera, and additional articles to create the scene composition we desired. This was a lot easier than iterating on the scene through PBRT by twiddling a few transformation numbers every time.
We also modeled the jar, the spoon, and the book in Blender. A lab mate and robotics engineer, Reuben Brewer, took offense to how crude and approximate our jar was, and assisted us in remodeling an exact replica of the marmalade jar (see acknowledgments below). The Blender composition of our final scene is shown below.
100% Certified Organic
Because the real orange marmalade was certified organic, we decided to take a 100% organic approach to this project too! This means that our final scene and render were produced with no artificial ingredients whatsoever. Every element of our scene was hand-made or photographed by members of our team (and a few close friends who felt inspired to help us out). We took all photographs used in this project, including the reference image, the environment maps, the floor texture, and high-resolution macro images of orange peels. The book's cover was scanned for the texture, and the spoon and book were modeled in Blender. An exact replica of the marmalade jar was created in SolidWorks, then post-processed in Blender.
Auxiliary Test Scenes
- Most of our development and testing was done on the following scene. The cones in the background were helpful for evaluating refraction and translucency of the volume, while the gazing balls provided a full view of the room and a rear view of the jam.
- To develop an intuition for the interplay of scattering and absorption, we ordered red, green, and blue lasers and took photographs through a cross section of grapefruit marmalade. The gel had no solid chunks, and we removed denser regions by filtering the gel through a tea strainer. This gave us a nearly homogeneous gel for more accurate evaluation of the properties. The photograph shows all three lasers, plus an ordinary flashlight with white(ish) light.
- To match these photos, we created a rectangular test volume with three lasers and a white beam shining through. The left volume uses PBRT's single scattering integrator, while the right volume uses our volumetric photon mapper. Note that the projection of the light beams through the two volumes onto the plane at the bottom matches in color and brightness. This was an extremely useful debugging technique to ensure that our rescaling of the spectral channels and the overall power of the photons was correct for multiple scattering:
- The laser scene also was useful for testing our implementation of the heterogeneous volume case. In this screenshot, we see a single volume which is composed of two gels, one with almost no absorption or scattering and one that approximates the jam. The "noise function" is a simple, deterministic waffle pattern, and the scene verifies that no photons are deposited in regions where the "invisible gel" is located. The frequency of the pattern can be adjusted to ensure that the ray marching step size is small enough to capture high-frequency variations:
- For one of our measurements, we filtered grapefruit marmalade through a tea strainer to obtain a sample that was as homogeneous as possible. We then placed it on a diffuse spherical lamp and used a Sekonic light meter to measure the transmission through a thin layer of the gel. Rendering a cross-section of the gel was useful for close examination of our volumetric rendering and the interaction of orange slices with gel.
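The deterministic test density used in the heterogeneous laser scene can be as simple as a 3D checkerboard; the exact "waffle" pattern is not reproduced here, but something like the following captures the idea (the cell rule and frequency parameter are illustrative):

```cpp
#include <cmath>

// A binary 3D checkerboard density: 1 in alternating cells, 0 elsewhere.
// Raising `freq` shrinks the cells, stressing the ray-march step size --
// steps larger than a cell will skip over density variations entirely.
float WaffleDensity(float x, float y, float z, float freq) {
    long ix = (long)std::floor(x * freq);
    long iy = (long)std::floor(y * freq);
    long iz = (long)std::floor(z * freq);
    return ((ix + iy + iz) % 2 != 0) ? 1.f : 0.f;
}
```

Because the pattern is deterministic, any photon deposited in a zero-density cell is immediately visible as a bug in the shooting or marching code.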
Debugging Tools and External Programs
- Debugging the subtleties of photon shooting with multiple scattering through a heterogeneous medium would be nearly impossible without an interactive tool, so we added a debug flag to PBRT that exports the contents of the photon maps into formatted text files, and created an OpenGL app to fly around the scene. The photon positions were useful for examining certain problems, but to properly debug multiple scattering, we created a version that traces the path and the photon deposits left by each photon:
A PR2 Alpha robot rendered our final scene for the rendering competition. It worked for about 7 hours overnight to generate the image, and never complained once the whole time.
Sonny worked on:
- a volumetric medium material and making PBRT track rays' entry into and exit from media materials
- two different volumetric photon density estimation techniques for computing volumetric radiance
- procedural generation and placement of orange slices and bubbles
- environment map window lighting
- PBRT scene assembly
- idea generation for debug scenes and cross sections
Phaedon worked on:
- volumetric photon shooting
- ray marching through heterogeneous volume
- procedural generation of 3D texture, based on PBRT's FBM noise
- estimation of scattering, absorption, and phase function coefficients for both gels contributing to heterogeneous texture
- actual scene photography and composition; environment map photos
- project idea generation
We both worked on:
- active discussion of photon mapping theory and construction of the modified state machine
- gel observation and measurement
- mesh modeling in Blender
- OpenGL tools for efficient visualization and debugging
We would like to thank:
- Pat Hanrahan for mentorship, guidance and for supplying us with 2-inch ball bearings for the environment map
- Reuben Brewer for significant improvements to our glass jar and lid models
- Lindsay Long for her artistic skills on the wallpaper
- Tanya Sleiman for insightful recommendations on scene composition
- The friends who gifted the (real) Alice Waters cookbook
- The PR2 Alpha robots in the BioRobotics Lab for the generous loan of CPU time
Cerezo, E., Perez, F., Pueyo, X., Seron, F. J., & Sillion, F. X. (2005). A survey on participating media rendering techniques. The Visual Computer, 21(5), 303-328.
Jarosz, W., Nowrouzezahrai, D., Sadeghi, I., & Jensen, H. W. (2011). A comprehensive theory of volumetric radiance estimation using photon points and beams. ACM Transactions on Graphics, 30(1), 1-19.
Jarosz, W., Zwicker, M., & Jensen, H. W. (2008). The Beam Radiance Estimate for Volumetric Photon Mapping. Computer Graphics Forum, 27(2), 557-566.
Narasimhan, S. G., Gupta, M., Donner, C., Ramamoorthi, R., Nayar, S. K., & Jensen, H. W. (2006). Acquiring scattering properties of participating media by dilution. ACM Transactions on Graphics, 25(3), 1003.