3D Painting on Scanned Surfaces

Maneesh Agrawala <maneesh@pepper.stanford.edu>
Andrew C. Beers <beers@cs.stanford.edu>
Marc Levoy <levoy@cs.stanford.edu>

Abstract

1. Introduction

2. System Configuration

3. Data Representation

4. Methods

4.1 Object--mesh registration

4.2 Painting

To paint a three-dimensional surface we must determine where new paint is to be applied. The tip of our paintbrush has an associated 3D shape that defines the volume within which paint is applied (see figure 3). In general this brush volume can be any 3D shape. The most straightforward painting algorithm would be to paint every vertex that falls within the brush volume. We can think of this approach as filling the entire brush volume with paint, using a 3D scan-line algorithm to step through all the voxels within the volume. The drawback of this approach is that the mesh is likely to be relatively flat within the brush volume and therefore occupies only a small fraction of it; the volume-fill algorithm would search through many empty voxels.
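For comparison with the flood-fill approach described below, the naive per-vertex membership test for a spherical brush can be sketched as follows. This is a minimal illustration in Python with NumPy, not our implementation; the array and parameter names are hypothetical.

```python
import numpy as np

def paint_vertices_in_brush(vertices, colors, brush_center, brush_radius, paint_color):
    """Paint every mesh vertex that falls inside a spherical brush volume.

    Sketch of the straightforward volume test: `vertices` is an (N, 3)
    array of positions, `colors` an (N, 3) array of per-vertex colors.
    Returns the boolean mask of painted vertices.
    """
    # Squared distance from each vertex to the brush center.
    d2 = np.sum((vertices - brush_center) ** 2, axis=1)
    inside = d2 <= brush_radius ** 2
    colors[inside] = paint_color
    return inside
```

Note that this tests every vertex of the mesh; the volume-fill variant discussed above would instead enumerate voxels of the brush volume, most of which contain no surface.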

Our approach is to first find a vertex on the mesh that is within the brush volume. We then perform a breadth-first flood fill of the mesh from this seed point. The vertex on the mesh closest to the ray extended along the brush direction from the sensor position is used as the seed, as depicted in figure 4.
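The breadth-first flood fill from the seed vertex can be sketched as follows. This is an illustrative Python fragment, not the system's code: `adjacency`, `in_brush`, and the color container are hypothetical names, and a real mesh would derive vertex adjacency from its triangle data.

```python
from collections import deque

def flood_fill_paint(adjacency, positions, seed, in_brush, colors, paint_color):
    """Breadth-first flood fill over mesh vertices starting at `seed`.

    `adjacency` maps a vertex index to its neighboring vertex indices;
    `in_brush` is a predicate testing whether a vertex position lies
    inside the brush volume. The fill stops expanding at vertices that
    leave the brush volume, so only the local patch under the brush is
    visited. Returns the set of painted vertices.
    """
    queue = deque([seed])
    visited = {seed}  # The seed is assumed to lie inside the brush volume.
    while queue:
        v = queue.popleft()
        colors[v] = paint_color
        for n in adjacency[v]:
            if n not in visited and in_brush(positions[n]):
                visited.add(n)
                queue.append(n)
    return visited
```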

Although we poll the tracker for the position of its sensor at about 30 Hertz, the sampling rate is not fast enough to produce a smooth stroke as the brush is swept along the object. For the paint to be applied smoothly, without gaps, we need to fill the surface with paint along a stroke. The flood fill idea can be modified to account for this, coloring vertices within the volume defined by sweeping the 3D brush shape along a stroke connecting successive sensor positions. In our system, we connect successive positions using a linear stroke. Thus, for a sphere brush we would sweep out a cylindrical volume with spherical end caps along the stroke.
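For the sphere brush, the swept volume is a capsule, and the point-membership test used during the fill reduces to a distance-to-segment check. The following is an illustrative Python sketch of that test, not the system's code; the argument names are hypothetical.

```python
import numpy as np

def in_swept_sphere(p, a, b, radius):
    """True if point `p` lies within the capsule swept by a sphere of the
    given radius moving along the segment from `a` to `b` (two successive
    sensor positions). All points are 3-element NumPy arrays.
    """
    ab = b - a
    # Parameter of the closest point on the segment, clamped to [0, 1];
    # the clamping produces the spherical end caps.
    t = np.clip(np.dot(p - a, ab) / max(np.dot(ab, ab), 1e-12), 0.0, 1.0)
    closest = a + t * ab
    return bool(np.sum((p - closest) ** 2) <= radius ** 2)
```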

One problem with the flood fill algorithm is that it cannot correctly handle all surface geometries. Consider a surface with a small indentation. If we place the brush directly above the indentation we should be able to paint the surfaces on either side of it. However, the flood fill brush will only paint one side of it, because it floods out along the mesh surface from the seed point as shown in figure 5(A). This problem could be prevented by performing a volume-fill within the brush geometry, as in figure 4, rather than flood filling out from the seed point along the mesh surface. In practice, we have never encountered a surface geometry for which the surface flood fill causes noticeable anomalies.

Another problem with this algorithm is that mesh triangles which are occluded from the paintbrush may be painted. The correct solution to this problem would be to perform a complete visibility test before painting a vertex, to ensure that the vertex is visible to the brush. Because this test is very expensive and would hinder interactive performance, we only check that the dot product of the vertex normal and the brush orientation is negative. This ensures that we only paint vertices that are facing the brush, but there are still some cases where we might paint occluded triangles, as shown in figure 5(B). In this case the flood fill seed point falls on the left side of Peak B. As color floods out from the seed point along the left side of Peak B, points that are occluded by Peak A will be painted. The volume-fill approach would be no better than the flood-fill approach at handling this mesh geometry. Both methods fail because they do not check for occlusions between the tip of the brush and the mesh surface.
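The approximate visibility test described above amounts to a single dot product per vertex. The sketch below assumes outward-facing unit vertex normals and a brush orientation vector pointing from the sensor toward the surface; the function name is illustrative.

```python
import numpy as np

def faces_brush(vertex_normal, brush_direction):
    """Approximate visibility test: accept a vertex for painting only if
    its normal opposes the brush direction (negative dot product), i.e.
    the surface is facing the brush. Occlusion between peaks is not
    detected by this test.
    """
    return bool(np.dot(vertex_normal, brush_direction) < 0.0)
```

In the flood fill, this check would be applied alongside the brush-volume test when deciding whether to paint and expand from a vertex.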

With hundreds of thousands of polygons in a typical mesh, redrawing the entire mesh after each paint stroke would make it impossible to maintain interactive performance. Instead, we only redraw the triangles in which at least one vertex was painted. By using the surface flood fill algorithm in combination with this lazy update scheme, we can interactively paint large meshes.
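The lazy update amounts to collecting the triangles that reference at least one painted vertex. A minimal sketch, assuming `triangles` is a list of vertex-index triples and `painted_vertices` is the set returned by the fill:

```python
def dirty_triangles(triangles, painted_vertices):
    """Return indices of triangles with at least one painted vertex;
    only these need to be redrawn after a paint stroke.
    """
    painted = set(painted_vertices)
    return [i for i, tri in enumerate(triangles)
            if any(v in painted for v in tri)]
```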

4.3 Brush effects

4.4 Combating registration errors

5. Results

6. Future Directions

7. Conclusions

8. Acknowledgments