Gavan Kwan
The first step in rendering the images was creating high-resolution heightfields of the ocean wave pattern. This was done in MATLAB and then exported as a heightfield in a .rib file. Using a frequency-domain filtering technique described by Mastin et al. (Fourier Synthesis of Ocean Scenes), the MATLAB code first generates a white-noise image, then takes its two-dimensional Fourier transform. The frequency-domain image is then filtered using a modification of the Pierson-Moskowitz frequency distribution for downwind ocean waves, directionally attenuated by the angle away from the downwind direction. In this case, the downwind axis was the vertical y-axis, so the angle ranged from 0 when completely aligned with the wind to pi/2 at perpendicular directions.
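The MATLAB source is linked at the end of this page; as a rough illustration of the filtering step (written in C++ for consistency with the renderer-side sketches below), it might look like the following. The function name, array layout, and cos^2 directional falloff are illustrative choices, not a transcription of the MATLAB code, and the forward and inverse FFTs are assumed to be handled separately (e.g., with FFTW):

    #include <cmath>
    #include <complex>
    #include <vector>

    // Multiply the 2-D FFT of a white-noise image by the square root of a
    // Pierson-Moskowitz spectrum, attenuated by the angle away from the
    // downwind (+y) direction. 'spectrum' holds N x N complex samples in
    // row-major order (row i ~ y frequency, column j ~ x frequency);
    // 'windSpeed' is U in m/s.
    void applyDirectionalPMFilter(std::vector<std::complex<double>>& spectrum,
                                  int N, double windSpeed) {
        const double g = 9.81, alpha = 0.0081, beta = 0.74;  // PM constants
        for (int i = 0; i < N; ++i) {
            for (int j = 0; j < N; ++j) {
                // Signed frequencies: DC at bin 0, negatives in the upper half.
                double fy = (i < N / 2) ? i : i - N;
                double fx = (j < N / 2) ? j : j - N;
                double k = std::sqrt(fx * fx + fy * fy);
                if (k == 0.0) { spectrum[i * N + j] = 0.0; continue; }  // kill DC

                // The deep-water dispersion relation maps spatial frequency to
                // the temporal frequency the PM spectrum is written in terms of.
                double omega = std::sqrt(g * k);
                double pm = (alpha * g * g / std::pow(omega, 5.0)) *
                            std::exp(-beta * std::pow(g / (windSpeed * omega), 4.0));

                // Angle away from downwind: 0 when aligned with +y, pi/2 when
                // perpendicular. A cos^2 falloff is one plausible attenuation.
                double theta = std::atan2(std::fabs(fx), std::fabs(fy));
                double c = std::cos(theta);

                spectrum[i * N + j] *= std::sqrt(pm) * c * c;
            }
        }
    }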
Once filtered, some noise is added to the frequency image to give the surface a rough appearance and add a bit of realism. The wave-generating function accepts wind speed as a parameter to create different types of waves (e.g., long, rolling waves vs. short, choppy ones). Creating a realistic-looking wave pattern took some experimentation, since the Pierson-Moskowitz model alone generates patterns that look too clean for real ocean scenes. To better approximate nature, we summed wave patterns generated at three different wind speeds to produce a rougher, more natural-looking combined pattern.
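A sketch of that combination step, with made-up wind speeds and a simple equal-weight average (the actual weighting is a free choice), assuming a hypothetical generateWaves that wraps the full noise/FFT/filter/inverse-FFT pipeline:

    #include <vector>

    // Hypothetical stand-in for the full pipeline: white noise, 2-D FFT,
    // directional PM filter, inverse FFT back to an N x N heightfield.
    std::vector<double> generateWaves(int N, double windSpeed);

    std::vector<double> combinedPattern(int N) {
        std::vector<double> h(N * N, 0.0);
        for (double U : {5.0, 10.0, 18.0}) {           // m/s, illustrative values
            std::vector<double> w = generateWaves(N, U);
            for (int i = 0; i < N * N; ++i)
                h[i] += w[i] / 3.0;                    // simple average of the three
        }
        return h;
    }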
The waves were rendered as a heightfield RenderMan primitive, with a few modifications to the implementation in lrt. While the basic lrt implementation refines a heightfield into a triangle mesh, we used the regular grid accelerator from assignment 1 and also implemented linear interpolation between the normals at the heightfield's vertices. The normal at each vertex was generated by looping over every triangle in the heightfield and computing that triangle's normal vector. This normal, scaled by 1/6 (each interior vertex of the regular grid is shared by six triangles), is added to each of the triangle's three vertices. Once a heightfield-ray intersection occurs, we know which triangle of the heightfield is involved. Using the (u, v) barycentric coordinates within that triangle, we interpolate between the vertex normals to create a smooth heightfield surface.
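A minimal sketch of those two steps, with a bare-bones vector type standing in for lrt's geometry classes and an illustrative triangle index layout:

    #include <cmath>
    #include <vector>

    struct Vec { double x = 0, y = 0, z = 0; };
    static Vec operator-(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static Vec operator+(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    static Vec operator*(double s, Vec a) { return {s * a.x, s * a.y, s * a.z}; }
    static Vec cross(Vec a, Vec b) {
        return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
    }
    static Vec normalize(Vec a) {
        double len = std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
        return {a.x / len, a.y / len, a.z / len};
    }

    // Step 1: accumulate face normals. 'tris' holds vertex indices three at a
    // time; an interior vertex of the regular grid touches six triangles, so
    // scaling each contribution by 1/6 averages them.
    std::vector<Vec> vertexNormals(const std::vector<Vec>& p,
                                   const std::vector<int>& tris) {
        std::vector<Vec> n(p.size());
        for (size_t t = 0; t + 2 < tris.size(); t += 3) {
            int i0 = tris[t], i1 = tris[t + 1], i2 = tris[t + 2];
            Vec fn = (1.0 / 6.0) * cross(p[i1] - p[i0], p[i2] - p[i0]);
            n[i0] = n[i0] + fn;
            n[i1] = n[i1] + fn;
            n[i2] = n[i2] + fn;
        }
        return n;
    }

    // Step 2: at a hit with barycentric coordinates (u, v) inside triangle
    // (i0, i1, i2), blend the vertex normals for a smooth shading normal.
    Vec shadingNormal(const std::vector<Vec>& n, int i0, int i1, int i2,
                      double u, double v) {
        return normalize((1 - u - v) * n[i0] + u * n[i1] + v * n[i2]);
    }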
The surface property of the heightfield was set to the RenderMan "shinymetal" surface shader, which has a reflective component. Rays reflecting from the ocean surface pick up the right color for each point on the heightfield when they intersect the large texture-mapped light source in the background.
To complement the ocean scene, we decided to fake a sky by creating a texture-mapped area light source over the ocean. The texture-mapped area light source generates light rays just like the diffuse area light sources of assignment 3, but colors the emitted rays according to a specified texture map. To make the image appear as a sky over the ocean, we placed a rectangular texture-mapped area light source running from the back of the ocean to a point above and behind the camera. The reflection of the area light source's rays off the ocean helps integrate the ocean and the sky into a continuous scene. This also allows any sky image to be substituted as a texture map and integrated into a scene with an ocean.
To create a texture-mapped area light source in the sky, we had to create our own rectangle primitive, since the default implementation of rectangles (a special case of four-sided polygons) would divide the rectangle into triangles. This would cause the texture map to be applied to each triangle separately, introducing a discontinuity in the image of the area light source.
While we could have extended our ray tracer's interface to accept a new geometric object for rectangles, this would have wasted time tinkering with input parsing. Instead, we simply assumed that any four-point polygon was a rectangle and created a rectangle primitive. The algorithms for determining normals and computing intersections with light/eye rays are very similar to those for triangles.
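A sketch of the resulting intersection test, analogous to the ray-triangle case; the vector type and function signature here are stand-ins rather than lrt's actual interfaces:

    #include <cmath>

    struct Vec3 { double x, y, z; };
    static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    static Vec3 operator*(double s, Vec3 a) { return {s * a.x, s * a.y, s * a.z}; }
    static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    static Vec3 cross(Vec3 a, Vec3 b) {
        return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
    }

    // Rectangle given by a corner p0 and two edge vectors e1, e2 (perpendicular,
    // since we assume every four-point polygon is a rectangle). On a hit, (u, v)
    // in [0,1]^2 also serve directly as texture coordinates on the light.
    bool intersectRectangle(Vec3 o, Vec3 d,                 // ray origin, direction
                            Vec3 p0, Vec3 e1, Vec3 e2,
                            double& tHit, double& u, double& v) {
        Vec3 n = cross(e1, e2);                             // plane normal
        double denom = dot(n, d);
        if (std::fabs(denom) < 1e-9) return false;          // ray parallel to plane
        double t = dot(n, p0 - o) / denom;
        if (t <= 0) return false;                           // plane behind the ray
        Vec3 hit = o + t * d;
        u = dot(hit - p0, e1) / dot(e1, e1);                // project onto edges
        v = dot(hit - p0, e2) / dot(e2, e2);
        if (u < 0 || u > 1 || v < 0 || v > 1) return false; // outside the rectangle
        tHit = t;
        return true;
    }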
A texture-mapped area light source emits colored light at a specified intensity. In terms of ray tracing, this involves finding the location on the area light source from which the light is being emitted and determining what color in the texture map that location maps to.
In lrt, since eye rays were traced, we simply checked for collisions with the area light source, checked whether the light source was texture-mapped, looked up the texture-map color corresponding to the collision location, and scaled the intensity of the light by that color.
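In outline, the lookup might be structured like this, with a minimal stand-in texture type (lrt's real texture machinery is more involved):

    #include <algorithm>
    #include <vector>

    struct Color { double r, g, b; };

    // Minimal stand-in texture with nearest-neighbor lookup.
    struct Texture {
        int width = 0, height = 0;
        std::vector<Color> pixels;                 // row-major image data
        Color lookup(double u, double v) const {
            int x = std::min(width  - 1, std::max(0, int(u * width)));
            int y = std::min(height - 1, std::max(0, int(v * height)));
            return pixels[y * width + x];
        }
    };

    struct TextureMappedAreaLight {
        Color intensity;                  // base emitted intensity
        const Texture* texmap = nullptr;  // null for a plain diffuse area light

        // Emitted color at the point on the light with texture coords (u, v):
        // the texel scales the light's intensity channel by channel.
        Color Le(double u, double v) const {
            if (!texmap) return intensity;
            Color t = texmap->lookup(u, v);
            return {intensity.r * t.r, intensity.g * t.g, intensity.b * t.b};
        }
    };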
The one caveat to a sky area light source is that rays traveling directly from the sky to the camera need to be scaled down in intensity so the sky does not appear all white. This was implemented by checking the ray depth and omitting the light intensity from the light-ray color calculation when the ray extended directly from the camera to the sky.
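Continuing the previous sketch, the depth check could look like this (the function name and depth-0 convention are illustrative):

    // For a primary ray (depth 0) that runs straight from the camera to the
    // sky, return the texture color alone and skip the intensity scaling, so
    // the sky image shows instead of blowing out to white.
    Color skyRadiance(const TextureMappedAreaLight& light,
                      double u, double v, int rayDepth) {
        if (rayDepth == 0 && light.texmap)     // direct camera-to-sky hit
            return light.texmap->lookup(u, v);
        return light.Le(u, v);                 // reflected rays get full emission
    }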
MATLAB Code (Wave Pattern Generation)
References:
Gary A. Mastin, Peter A. Watterberg, and John F. Mareda. Fourier Synthesis of Ocean Scenes. IEEE Computer Graphics and Applications, pp. 16-23, March 1987.
Simon Premoze and Michael Ashikhmin. Rendering Natural Waters.
A. Fournier and W. T. Reeves. A Simple Model of Ocean Waves. Computer Graphics (SIGGRAPH '86 Proceedings), volume 20, pp. 75-84, August 1986.
D. R. Peachey. Modeling Waves and Surf. Computer Graphics (SIGGRAPH '86 Proceedings), volume 20, pp. 65-74, August 1986.