Raw FPS averages are inherently flawed

Posted by Bluedot55@reddit | hardware

To make a simple example, let's take two hypothetical GPUs in two games.

| | GPU 1 | GPU 2 |
|---|---|---|
| Game 1 | 100 fps | 50 fps |
| Game 2 | 250 fps | 500 fps |
| Total average fps | 175 | 275 |

In this example, each GPU has one game where it is 100% faster than the other GPU, but because one game is lighter to run, and runs significantly faster on both GPUs, that game has an outsized effect on the average. Beyond that, I believe most people would agree that the difference between getting 50 and 100 fps in a game is far more noticeable than the difference between 250 and 500.
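
As a quick illustration, here's a minimal Python sketch (variable and function names are my own) using the hypothetical numbers above, showing how the lighter game dominates the plain average:

```python
# Hypothetical per-game FPS from the two-GPU example above
gpu1 = {"Game 1": 100, "Game 2": 250}
gpu2 = {"Game 1": 50, "Game 2": 500}

def raw_fps_average(results):
    # Plain mean of per-game FPS -- the lighter, higher-FPS game dominates.
    return sum(results.values()) / len(results)

print(raw_fps_average(gpu1))  # 175.0
print(raw_fps_average(gpu2))  # 275.0 -- GPU 2 "wins" purely on the lighter game
```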

Frame time averages

There are a few ways to give a more accurate number here. An argument could be made that rather than averaging FPS, averaging frame times would give a better representation of relative performance. This inverts the weighting, making each percentage difference matter more when the FPS is lower, so a difference between 45 and 60 fps is more impactful than a difference between 150 and 200.
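
A minimal sketch of that idea (names are mine, not anything a reviewer uses): convert each game's FPS to a frame time, average the frame times, then convert back. This is the harmonic mean of the per-game FPS figures.

```python
def frame_time_average(fps_by_game):
    # Convert each game's FPS to a per-frame time in ms, average those,
    # then convert back to an FPS-like figure (the harmonic mean of FPS).
    frame_times_ms = [1000 / fps for fps in fps_by_game.values()]
    return 1000 / (sum(frame_times_ms) / len(frame_times_ms))

# Using the hypothetical GPUs from the example above:
print(frame_time_average({"Game 1": 100, "Game 2": 250}))  # ~142.9
print(frame_time_average({"Game 1": 50, "Game 2": 500}))   # ~90.9 -- GPU 1 now ahead
```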

Relative averages

Alternatively, the overall average could be an average of the relative performance of the products: rather than a raw FPS figure, each game is scored as a percentage of the highest-performing product. This guarantees that every game gets an equal weighting in the end result, so a difference between 45 and 60 fps in one game is balanced out by a difference of 200 vs 150 in another.
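
A rough sketch of that normalization (again, names are my own): score each card in each game as a fraction of the fastest card in that game, then average those fractions per card.

```python
def relative_average(fps_by_gpu):
    # fps_by_gpu: {gpu_name: {game_name: fps}}
    games = next(iter(fps_by_gpu.values())).keys()
    scores = {gpu: [] for gpu in fps_by_gpu}
    for game in games:
        best = max(results[game] for results in fps_by_gpu.values())
        for gpu, results in fps_by_gpu.items():
            scores[gpu].append(results[game] / best)  # fraction of the fastest card
    return {gpu: sum(s) / len(s) for gpu, s in scores.items()}

# With the hypothetical example, both GPUs score 0.75: each card's 2x win
# in one game counts exactly as much as its 2x loss in the other.
print(relative_average({"GPU 1": {"Game 1": 100, "Game 2": 250},
                        "GPU 2": {"Game 1": 50, "Game 2": 500}}))
```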

9070xt review example

For a real-world example of how this would affect comparisons, I ran the numbers with the different methods using Techspot/HWUnboxed's review of the 9070xt, looking at how it compares to the 5070ti at 1440p. Numbers are measured as a percentage of the performance of the 5070ti.

| Method | Relative performance |
|---|---|
| HWUnboxed's average | 94.4% |
| Raw fps average | 91.8% |
| Frame time average | 96% |
| Relative performance | 95.4% |
| HWUnboxed's RT average | 79.1% |
| Raw fps RT average | 80.4% |
| Frame time RT average | 57.2% |
| Relative RT performance | 73% |

I'm not quite sure why my raw averages don't line up with HWUnboxed's own multi-game average numbers; maybe they do some sort of weighting in a similar manner.

Regardless, looking at these, the frame time averages show a smaller gap between the cards in non-ray-traced titles, but when you add ray tracing, the gap more than doubles compared to what the regular average would suggest. With different GPUs and CPUs performing differently in different sorts of games, I think an approach like this may be valuable for getting a better feel for how products actually compare to one another.

TL;DR

Raw FPS averages massively reward products that do very well in light games, even if they do worse in heavier games with lower average FPS.