GPUBench Test: Precision

Description
The precision of arithmetic instructions on GPUs is often called into question. The Precision test reports the average and maximum error (relative or absolute) of a specified instruction over a range of input values. Values computed on the GPU are compared against values computed at double precision on the CPU using math.h library routines.
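
To make the methodology concrete, here is a minimal CPU-only sketch of the error computation, assuming a simple uniform sweep over the input range; the sinf() call merely stands in for the GPU's SIN result, and none of this is GPUBench's actual code:

#include <math.h>
#include <stdio.h>

int main(void)
{
    double min = 0.0, max = 3.14, step = 0.001;
    double sum_err = 0.0, max_err = 0.0;
    long count = 0;

    for (double x = min; x <= max; x += step) {
        float  gpu = sinf((float)x);  /* stand-in for the GPU SIN result   */
        double ref = sin(x);          /* double-precision math.h reference */

        /* Relative error, falling back to absolute error when the
         * reference value is zero. */
        double err = (ref != 0.0) ? fabs((double)gpu - ref) / fabs(ref)
                                  : fabs((double)gpu - ref);
        sum_err += err;
        if (err > max_err)
            max_err = err;
        count++;
    }

    printf("avg relative error: %g\n", sum_err / count);
    printf("max relative error: %g\n", max_err);
    return 0;
}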
Test Specific Details
Either a single instruction (RSQ, RCP, SIN, COS, EX2, LG2) or all of these instructions can be tested in a single program execution. --min and --max set the range of input values over which precision is tested, and --step sets the step size through that range. Error can be reported as either relative or absolute using the --relative and --absolute options, respectively. The double-precision CPU references for these instructions are sketched below.
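
The instructions under test map naturally onto double-precision math.h routines; this is an assumed mapping for illustration, since the exact reference code is internal to GPUBench:

#include <math.h>

/* Assumed double-precision references for each instruction under test. */
double ref_rsq(double x) { return 1.0 / sqrt(x); }  /* RSQ: reciprocal square root  */
double ref_rcp(double x) { return 1.0 / x; }        /* RCP: reciprocal              */
double ref_sin(double x) { return sin(x); }         /* SIN                          */
double ref_cos(double x) { return cos(x); }         /* COS                          */
double ref_ex2(double x) { return exp2(x); }        /* EX2: 2 raised to the x (C99) */
double ref_lg2(double x) { return log2(x); }        /* LG2: base-2 logarithm (C99)  */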
Example Usage
Prints average and max absolute error in the SIN instruction over the range 0 to PI:
precision --sin -m 0.0 -x 3.14 --absolute
Prints average and max relative error in all instructions over the range -PI to PI:
precision --all -m -3.14 -x 3.14 --relative

Command-line Usage

Usage: gpubench\bin\precision.exe <options>
  Options
  -r, --rsq
            Test RSQ instruction
  -p, --rcp
            Test RCP instruction
  -s, --sin
            Test SIN instruction
  -c, --cos
            Test COS instruction
  -e, --ex2
            Test EX2 instruction
  -l, --lg2
            Test LG2 instruction
  -a, --all
            Test all above instructions
  -m, --min
            Smallest input value to test
  -x, --max
            Largest input value to test
  -t, --step
            Step size
  -v, --verbose
            Print everything
  -d, --decimals
            Number of decimal places to print
  -n, --relative
            Print relative error (default)
  -b, --absolute
            Print absolute error
  -y, --xerror
            Print relative error in the input


GPUBench was developed at the Stanford University Graphics Lab.