Before we proceed any further, we want to reiterate a few key points. First, our conclusion is based solely on the assumption that higher frequency RAM has increased timings. A few models are available that have both a higher frequency and tighter timings, and those will certainly have a performance edge (although we will leave the question of how much to a future article), but they are not the norm. Second, all of our testing was done with a dedicated video card, so these results do not apply to systems using onboard CPU graphics.
With those caveats out of the way, you can clearly see how tightly balanced RAM frequency and timings are. While a few of our benchmarks did show differences with the higher frequency RAM (WinRAR in particular), the performance gain from using high frequency RAM was highly inconsistent. Considering that some of our benchmarks actually showed worse performance with the higher frequency RAM, it is impossible to make a blanket statement about whether frequency or timings are more important. Add in the fact that on many of our benchmarks the results on the Intel-based system were the opposite of what we saw on the AMD-based system, and the results are sporadic enough that the answer is simply not straightforward.
Another factor to take into account is that high frequency RAM in general also has a higher failure rate. From our part failure reports, just the small jump from 1333MHz to 1600MHz RAM brings nearly a fivefold increase in failure rates. Granted, this is the difference between a 0.8% failure rate on DDR3-1333 and a 3.9% failure rate on DDR3-1600, but it is still a notable increase.
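For readers who want to check the math, the relative increase works out as follows. This is just a minimal sketch using the two failure rates quoted above from our part failure reports; nothing else here comes from the report data.

```python
# Failure rates quoted above from our part failure reports.
ddr3_1333_failure = 0.008  # 0.8% failure rate on DDR3-1333
ddr3_1600_failure = 0.039  # 3.9% failure rate on DDR3-1600

# Relative increase in failure rate going from 1333MHz to 1600MHz.
increase = ddr3_1600_failure / ddr3_1333_failure
print(f"DDR3-1600 fails about {increase:.1f}x as often as DDR3-1333")
# → DDR3-1600 fails about 4.9x as often as DDR3-1333
```

In other words, while both rates are low in absolute terms, the relative jump between the two speeds is close to a factor of five.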
So what does this all mean? Based on the results from our testing, we see no reason to use higher frequency RAM with increased timings unless the system will be used for a very specific application that you know will benefit from the higher frequency. The increase in cost and failure rates is simply not worth the very application-specific performance gains.