When looking at this slide, I was wondering how many seconds it would take to render one pixel.
How do we get a number in sec/pixel from 1 GFLOP/pixel and ~100k sec/frame?
@bainroot, I think these are two different computations. The slide is trying to show how much compute (in terms of floating point operations) it takes to render a single pixel, while the actual time needed to shade a given pixel could depend on how many fragments of the geometry are visible in that pixel.
I think it all connects: 29 hours/frame -> 104k seconds / frame. Then each frame is assumed to be 1M pixels (low, but close enough for estimates). So 104k seconds / 1M pixels = 0.104 seconds / pixel.
That also checks out from the compute side: the slide says 1 GFLOP/pixel, and a CPU at the time did roughly 10 GFLOPS, so 1 GFLOP / 10 GFLOPS = 0.1 seconds/pixel.
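A quick back-of-envelope sketch of both estimates (the 29 hours/frame, 1M pixels/frame, and ~10 GFLOPS figures are rough values from the discussion above, not exact measurements):

```python
# Estimate 1: wall-clock time per frame divided by pixel count.
hours_per_frame = 29
seconds_per_frame = hours_per_frame * 3600      # 104,400 s, i.e. ~104k s/frame
pixels_per_frame = 1_000_000                    # assume a ~1 MP frame

sec_per_pixel_time = seconds_per_frame / pixels_per_frame
print(f"{sec_per_pixel_time:.3f} s/pixel")      # ~0.104 s/pixel

# Estimate 2: compute cost per pixel divided by CPU throughput.
flop_per_pixel = 1e9                            # 1 GFLOP of work per pixel
cpu_flops = 10e9                                # ~10 GFLOPS CPU of the era

sec_per_pixel_compute = flop_per_pixel / cpu_flops
print(f"{sec_per_pixel_compute:.3f} s/pixel")   # 0.100 s/pixel
```

The two routes agree to within rounding, which is why the numbers on the slide hang together.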