The purpose of this article was to find an explanation for why the benchmark results available online for the Radeon R9 290X in Quiet mode differ so much from our own testing. By expanding our normal testing procedure, we found two potential reasons for the discrepancy. Both are plausible, and it may well be a combination of the two that is causing the performance differences.
The first potential explanation we found was that the performance of a single R9 290X with the Quiet profile can drop by as much as 5.6% once the system has reached its full load temperature. To put this in terms of two fictional reviews: Review1 might report the performance of an R9 290X as 58.5 FPS in a specific benchmark if it only ran the benchmark once. Review2, on the other hand, might report the same card in the same setup with the same benchmark settings as 55.8 FPS if it let the system reach full load temperature before logging the results. Both numbers are accurate, but one is unrealistic, since you are never going to game for only five minutes at a time and let the system cool down between sessions. Even if you loop the benchmark five times like we did in our testing, simply taking the average of those five runs would give a result of 56.5 FPS, not the more accurate 55.8 FPS.
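To make the averaging pitfall concrete, here is a minimal sketch in Python. Only the first-run (58.5 FPS), final-run (55.8 FPS), and five-run-average (56.5 FPS) figures come from our testing; the intermediate per-run values are invented for illustration.

```python
# Hypothetical FPS results from five consecutive benchmark loops as the
# card heats up. Only the first run (58.5), the final run (55.8), and the
# five-run average (56.5) match the figures discussed above; the
# intermediate values are invented for illustration.
runs = [58.5, 56.8, 56.0, 55.4, 55.8]

naive_average = sum(runs) / len(runs)   # 56.5 FPS, skewed by cool early runs
sustained = runs[-1]                    # 55.8 FPS, at full load temperature

print(f"Naive five-run average: {naive_average:.1f} FPS")
print(f"Sustained (final run):  {sustained:.1f} FPS")
```

The naive average overstates what the card actually delivers once it is heat-soaked, because the early, cool runs pull the mean upward.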
If you add a second video card and start testing in Crossfire, the difference is even larger. Even discounting the times when the cards overheated and overrode the Quiet profile's 2200 RPM fan limitation, we saw up to a 12.3% drop in performance once the system was at full load temperature. This time, Review1 would show 91.5 FPS, but Review2 would only show 80.8 FPS. This is a huge difference, and although both numbers are still technically correct, one is clearly an inaccurate representation of the performance you should reasonably expect from a pair of Radeon R9 290X cards running the Quiet profile in Crossfire.
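One way a testing script could guard against this is to keep looping the benchmark until back-to-back runs agree within a tolerance, and only then record the result. Below is a minimal sketch, assuming a hypothetical `run_benchmark()` callable that returns the average FPS of one pass; the simulated Crossfire numbers are invented apart from the 91.5 and 80.8 FPS endpoints discussed above.

```python
def sustained_fps(run_benchmark, tol=0.01, max_runs=10):
    """Loop a benchmark until two consecutive runs differ by less than
    tol (relative), i.e. the system has reached full load temperature."""
    prev = run_benchmark()
    for _ in range(max_runs - 1):
        cur = run_benchmark()
        if abs(cur - prev) / prev < tol:
            return cur
        prev = cur
    return prev  # fall back to the last run if it never stabilizes

# Simulated per-run FPS for a Crossfire pair warming up; only the 91.5
# and 80.8 FPS endpoints come from our testing, the rest are invented.
results = iter([91.5, 86.0, 82.5, 81.2, 80.8, 80.8])
print(sustained_fps(lambda: next(results)))  # reports the stabilized 80.8 FPS
```

The tolerance and run cap are arbitrary choices here; the point is simply that the script stops trusting results until consecutive runs converge.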
In addition to the question of whether reviewers are letting the system reach full load temperature before recording benchmark results, this brings up a second problem that is somewhat new to the art of benchmarking video cards. Given how much the system's overall cooling affects an R9 290X in both Quiet and Uber modes, what setup should reviewers use when testing video cards? Should they use a high airflow chassis to find the maximum performance, or a more moderate airflow setup that will give lower numbers but be more representative of the average user's system?
Add to this the fact that testing multiple airflow setups is not currently the norm, and you have to wonder just how meaningful these benchmarks really are. If the results were obtained in a high airflow chassis, they may be higher than what you would see in your own more moderately cooled chassis. And since other video cards are not as susceptible to heat as the R9 290X, you may end up purchasing a card that performs much lower than you expected, to the point that a different video card may have been a better choice. Again, this is not the case for all video cards, but the fact that we saw as much as a 7% difference between a pair of R9 290X cards with the Quiet profile in a high airflow setup versus a low airflow setup is huge. Even in Uber mode, we saw almost a 3% variance depending on whether we ran the chassis fan at 5V or 12V.
Any benchmark you find online is always going to be just an approximation of the performance you should expect from a given component, but with how much cooling affects the performance of the Radeon R9 290X, it might be time to make reviewers' jobs that much harder and introduce testing with variable cooling setups.