Unexpected Data

Often when we approach a new test, we are able to make some safe assumptions. For example, when testing the Arnold CPU Renderer, we can assume that more cores will result in a faster render. When testing GPU rendering, it's safe to assume that a 2080 Ti will outperform a 2070. Even though we can make these educated guesses, we still perform the tests because we want to see exactly how much of a difference exists, and if the price difference justifies the purchase. However, sometimes those assumptions are proven wrong.

One recent example comes from my recent CPU testing in 3ds Max. Part of the testing suite included a GPU rendering pass. I left the test in place even though I wasn't testing the GPU because it didn't take much extra time, and we thought that, if anything, we might see a very slight difference between platforms. However, these are the results we found:

As you can see, there is more than a slight difference between the platforms. The Intel 10900K is around 30% faster than the AMD Ryzen 3950X. We fully expected a few percent difference between platforms in favor of either Intel or AMD, but nothing like what we are seeing.

Further confusing the situation are the results of the Intel i9-10900X. It is significantly faster than any other CPU. If you look at the results, you'll see the CPUs within each platform grouped together, with the exception of the 10900X. This appears to be some sort of error. I have rerun that test numerous times, deleted the files, and reinstalled, and I still get the same results. Even without this one odd result, the differences between platforms are fascinating and worth more research.

While it's true we spend a lot of time in Labs verifying assumptions that we, and most people in the tech industry, have about a new piece of hardware, what we are really looking for is the unexpected.