Simulating the Visual Experience of Very Bright and Very Dark Scenes
Animated demonstration of our system. We propose to attach a gaze tracker to a standard display (a). With knowledge of the user's gaze fixations, our system can synthesize several cues we experience when looking at very bright or very dark objects. A demonstration of our system can be seen in (b). In the animation, the red dot indicates a sequence of user gaze fixations; please view the animation at full screen and follow the red dot. A representative subset of frames is shown in (c), where arrows indicate the fixations. Frame (2) shows the appearance of a simulated bleaching afterimage generated by the fixation in frame (1); we propose explicitly presenting this afterimage to the viewer as shown in the figure. Frame (3) shows simulated adaptation to the dark area of the image; we propose displaying the image synthetically adapted as shown in the figure. Frame (4), the very next frame in the animation, shows that local adaptation to a dark region can briefly boost the perceived brightness of other regions. Finally, frame (5) shows adaptation afterimages for colorful stimuli. Lamp image courtesy of Fairchild [2008].

Abstract
The human visual system can operate in a wide range of illumination levels, due
to several adaptation processes working in concert. For the most part, these
adaptation mechanisms are transparent, leaving the observer unaware of his or her
absolute adaptation state. At extreme illumination levels, however, some of
these mechanisms produce perceivable secondary effects, or epiphenomena.
In bright light, these include bleaching afterimages and adaptation afterimages,
while in dark conditions these include desaturation, loss of acuity, mesopic
hue shift, and the Purkinje effect.
In this work we examine whether displaying these effects explicitly can be used to extend the apparent dynamic range of a conventional computer display.
We present phenomenological models for each effect, we describe efficient computer graphics methods for rendering our models, and we propose a gaze-adaptive display that injects the effects into imagery on a standard computer monitor.
Finally, we report the results of psychophysical experiments, which reveal that while mesopic epiphenomena are a strong cue that a stimulus is very dark, afterimages have little impact on the perception that a stimulus is very bright.
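To give a concrete flavor of how such effects might be injected into displayed imagery, the following is a minimal, hypothetical sketch (not the paper's actual model) of a bleaching-afterimage buffer: a region around the gaze point charges toward the negative of the fixated stimulus, decays exponentially everywhere, and is composited additively into each output frame. The function names, radius, and rate constants are illustrative assumptions.

```python
import numpy as np

def update_afterimage(afterimage, frame, gaze_xy, radius=50,
                      accumulate=0.05, decay=0.02):
    """Hypothetical afterimage accumulation/decay sketch (illustrative only).

    A bright stimulus at the gaze point bleaches photoreceptors; here that is
    modeled as a signed buffer that charges toward the negative of the fixated
    stimulus within a circular "foveal" region and decays exponentially
    everywhere. `frame` and `afterimage` are 2D arrays in [0, 1].
    """
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    gx, gy = gaze_xy
    mask = (xs - gx) ** 2 + (ys - gy) ** 2 < radius ** 2  # fixated region
    afterimage *= (1.0 - decay)                   # global exponential decay
    afterimage[mask] -= accumulate * frame[mask]  # charge toward negative image
    return afterimage

def composite(frame, afterimage):
    # Inject the (signed) afterimage additively and clip to displayable range.
    return np.clip(frame + afterimage, 0.0, 1.0)
```

In a gaze-adaptive display loop, `update_afterimage` would run once per frame with the latest tracked fixation, and `composite` would produce the image actually shown; fixating a bright region thus leaves a dark ghost that fades over subsequent frames.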
Pre-print: gazehdr.pdf (54MB)
Video: gazehdr.mp4 (290MB)
Slides: gazehdr_s2015_talk.key (248MB, Keynote), gazehdr_s2015_talk_exported.pptx (237MB, auto-exported PowerPoint)

Citation: David E. Jacobs, Orazio Gallo, Emily A. Cooper, Kari Pulli, and Marc Levoy. 2015. Simulating the Visual Experience of Very Bright and Very Dark Scenes. ACM Trans. Graph. 34, 3, Article 25 (May 2015), 15 pages. DOI=10.1145/2714573 http://doi.acm.org/10.1145/2714573

Bibtex:
@article{Jacobs:2015:SVE:2774971.2714573,
  author = {Jacobs, David E. and Gallo, Orazio and Cooper, Emily A. and Pulli, Kari and Levoy, Marc},
  title = {Simulating the Visual Experience of Very Bright and Very Dark Scenes},
  journal = {ACM Trans. Graph.},
  issue_date = {April 2015},
  volume = {34},
  number = {3},
  month = may,
  year = {2015},
  issn = {0730-0301},
  pages = {25:1--25:15},
  articleno = {25},
  numpages = {15},
  url = {http://doi.acm.org/10.1145/2714573},
  doi = {10.1145/2714573},
  acmid = {2714573},
  publisher = {ACM},
  address = {New York, NY, USA},
  keywords = {Adaptive displays, afterimages, gaze-aware displays, mesopic vision},
}