Optimizing AMD Trinity for Budget Gaming
Written on December 11, 2012 by Matt Bach
With the ever-increasing performance of computing hardware, for many users there is a good argument to be made for a low-cost gaming system based on AMD's Trinity APU. An APU (Accelerated Processing Unit) is a variation on a standard CPU that combines both a CPU and a GPU in a single chip. This lets you get the performance of a budget processor and video card from a single all-in-one package, which can significantly reduce costs.
These types of systems are not the best choice for the most demanding modern games or the highest screen resolutions, but they work great when you are on a tight budget or want a cheap LAN or kids' gaming PC. Since the amount of graphical power available in these APUs is somewhat limited, however, it is very important to optimize the system to get the best performance possible. In this article we're going to explore how shared memory, RAM speed, and AMD Dual Graphics affect the performance of a system based on AMD's Trinity APU.
Our main focus will be the system's RAM, including its speed and how much of it is set aside as shared memory, since these should have the largest impact on gaming performance. In addition, we will take a look at AMD's proprietary Dual Graphics technology, which lets you utilize both the onboard graphics and a discrete video card at the same time to theoretically give performance greater than either graphics processor alone.
To perform our optimization benchmarks, we used the following hardware:
|Test System Specs| |
|---|---|
|Motherboard|Asus F2A85-M PRO|
|CPU|AMD A-Series A10-5800K 3.8GHz|
|CPU Cooler|Stock AMD CPU Fan (high quality)|
|RAM*|Patriot Viper Xtreme DDR3-1866 8GB (2x4GB) CL9|
|Video Card|Asus Radeon HD 6670 1GB (Dual Graphics testing only)|
|Hard Drive|Western Digital Caviar Blue 500GB SATA 6Gb/s|
|PSU|Antec TruePower 650R 650W|
|Chassis|Antec Mini P180 (Gunmetal)|

*Shared memory set to 2GB. RAM and shared memory were adjusted according to the test being performed.
|RAM Model|Speed|Timings|Voltage|Configuration|
|---|---|---|---|---|
|Kingston ValueRAM KVR1333D3N9/4G|1333MHz|9-9-9|1.5V|2x4GB|
|Kingston ValueRAM KVR16N11/4|1600MHz|11-11-11|1.5V|2x4GB|
|Kingston HyperX LoVo KHX1600C9D3LK2/8GX|1600MHz|9-9-9|1.35V|2x4GB|
|Patriot Viper Xtreme PXD38G1866ELK|1866MHz|9-11-9-27|1.65V|2x4GB|
|Patriot Viper Xtreme PXD38G2133C11K|2133MHz|11-11-11-30|1.65V|2x4GB|
Note that the fastest RAM natively supported by this APU is 1866MHz, so the 2133MHz kit is more of an additional reference point than something we would recommend using in a Trinity-based gaming system.
Shared Memory Information
Since CPUs do not have large amounts of dedicated video RAM built in to assist with graphical processing, most CPUs with onboard graphics use what is called "shared memory". This is essentially an allocation of part of the main system RAM for dedicated use by the graphics processor. The amount of shared memory is set in the BIOS and, depending on the motherboard, anywhere from 16MB to 2GB of RAM can be allocated.
In most cases, more RAM is better. Where shared memory gets muddy is that the more RAM you allocate to graphics processing, the less is available to the rest of the system. Finding the right balance is therefore a very important part of optimizing Trinity for gaming.
Tom's Hardware published a great article on video card RAM size and its effect on gaming titled How Much RAM Does Your Graphics Card Really Need? In their testing, they concluded that:
[...]all other factors being equal, a graphics card with 2GB (2,048 MB) should perform exactly the same as a graphics card with 512MB as long as the game's graphics memory requirements are below 512MB.
Since a budget gaming system will only be able to play most modern games at low to medium settings, this suggests there should be no reason to set the shared memory above ~1GB. Their testing was done with discrete video cards, however, so to see whether the same holds for AMD's APU graphics, we ran benchmarks with the shared memory set to 64MB, 512MB, 1GB and 2GB using two sets of RAM (DDR3-1600 CL11 and DDR3-1866 CL9).
Impact of Shared Memory Size on Gaming
During our testing, we found that the results from the 1866MHz CL9 RAM were almost identical to the results from the 1600MHz CL11 RAM. So, in the interest of keeping this article as uncluttered as possible, we are only going to show the graphs for the 1866MHz CL9 benchmarks. If you wish to view the 1600MHz CL11 results, there is a link under each graph that will let you view them as well.
The results for Batman: Arkham City appear to be in line with the Tom's Hardware article. At 1280x720 with low presets, the amount of shared memory did not matter at all. Once the resolution and settings were turned up, however, 64MB of shared memory was not enough, resulting in a 9% decrease in performance at medium and a 7% decrease at high.
Hitman: Absolution is a bit different in that even at the low resolution/settings there is a slight performance drop when the shared memory is set to 64MB. At low, the difference is only 2%, however, so that may very well be normal testing variation. 512MB of shared memory shows a very slight performance loss at medium and high, but ~0.2 FPS is small enough to be well within normal testing variation.
Just like Hitman: Absolution, Alien vs. Predator shows a slight performance drop when using only 64MB of shared memory. Once again, the differences are slight (1.5%, 5% and 6.7% performance loss at low, medium and high respectively), but they are certainly there.
Our final benchmark, Unigine Heaven, is not an actual game but rather a stand-alone DX11 benchmark, which let us perform a couple of additional benchmark runs. At the lowest settings, we once again only see a performance drop with the shared memory set to 64MB. For the first time in our testing, however, we can actually see a difference with 512MB at the higher resolutions and settings.
So what does this mean in terms of real-world performance? At the settings that actually matter for gaming (basically those that allow the game/benchmark to run at over 30 FPS) there was only a 1-2% performance difference between 64MB and 512MB of shared memory, and no difference above 512MB. In fact, none of our benchmarks except Unigine Heaven showed any difference between 512MB, 1GB and 2GB of shared memory at any resolution/settings.
In conclusion, there is almost no reason to set the shared memory size larger than 512MB for gaming on onboard graphics. If you want to be 100% sure you are getting the best performance possible, you can set it to 1GB, but that is more of a precaution than a necessity.
Impact of RAM Speed on Gaming
We have touched on the question of RAM speed's effect on gaming before in the article Breaking the Hype of High Frequency RAM, but that article was focused on high-end gaming computers with discrete video cards. Since in this case the RAM is directly used by the APU for graphics, we expect to see a greater difference between RAM speeds in this testing.
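As a rough back-of-the-envelope illustration (a sketch based on standard DDR3 specifications, not a measurement from our testing), peak theoretical memory bandwidth scales linearly with transfer rate, which is why an integrated GPU that has no dedicated VRAM benefits so directly from faster system RAM:

```python
def ddr3_bandwidth_gb_s(transfer_rate_mt_s: int, channels: int = 2) -> float:
    """Peak theoretical DDR3 bandwidth in GB/s.

    Each DDR3 channel moves 8 bytes (64 bits) per transfer, so
    bandwidth = transfers/s * 8 bytes * channels.
    """
    return transfer_rate_mt_s * 1_000_000 * 8 * channels / 1e9

# Dual-channel peak bandwidth for the kits we tested:
for rate in (1333, 1600, 1866, 2133):
    print(f"DDR3-{rate}: {ddr3_bandwidth_gb_s(rate):.1f} GB/s peak")
```

Going from DDR3-1333 to DDR3-2133 raises the theoretical peak from about 21.3 GB/s to about 34.1 GB/s, a roughly 60% increase, so sizable FPS gains from faster RAM are exactly what we would expect on an APU.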
To find out just how much of a difference it makes, we ran another round of benchmarks, this time with five different RAM frequencies/timings. We know that 512MB of shared memory is plenty for almost any gaming application, but we set it to 2GB anyway to ensure that we would not hit any RAM limit or cause any unexpected bottlenecks during our benchmarks.
Starting with Batman: Arkham City, there is no performance variation at the lowest resolution/settings until we get up to the 2133MHz RAM, which yields a 14% performance improvement. At medium and high resolution/settings, however, we see a very clear progressive improvement as we move to higher-frequency RAM. A 6 FPS difference between 1333MHz and 2133MHz RAM at medium resolution/settings is a huge performance difference (roughly 33%) and is honestly larger than we expected.
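To make the math behind these figures concrete, the percentage improvement is simply the FPS delta divided by the slower result. The baseline below (~18 FPS) is purely illustrative, not our exact measurement, but it shows how a 6 FPS gain works out to roughly the 33% quoted above:

```python
def percent_gain(baseline_fps: float, faster_fps: float) -> float:
    """Percentage improvement of faster_fps over baseline_fps."""
    return (faster_fps - baseline_fps) / baseline_fps * 100.0

# Illustrative: a 6 FPS gain on an assumed ~18 FPS baseline.
print(f"{percent_gain(18.0, 24.0):.1f}%")  # prints "33.3%"
```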
Hitman: Absolution also shows a very clear progressive improvement. This time, even the lowest resolution/settings benefit from faster RAM (by up to 7.8 FPS).
It is more of the same with Alien vs. Predator, except that the difference at the lowest resolution/settings is now over 14 FPS, or a 41% improvement.
With the exception of 1920x1064, 0xAA, 4xAF, which doesn't show much variation between the 1333MHz and 1600MHz RAM, the results are once again almost identical to the other game benchmarks.
These benchmarks clearly show an advantage to using faster RAM if you are using the onboard APU graphics for gaming, especially at the lower resolutions/settings. RAM is also fairly cheap, so it is a great way to get a decent performance boost if you are on a budget. The main caveat here is that as you get into the faster RAM (especially 2133MHz), the failure rate increases rapidly.
At the moment, 1600MHz RAM is at the point where we at Puget Systems consider it very stable. 1866MHz still carries a bit too much risk for us to carry, but users who want an extra couple of FPS may be willing to take on that risk. 2133MHz RAM, however, is faster than AMD APUs natively support, so we would recommend against using it unless you absolutely need that extra little bit of performance.
AMD Dual Graphics Information
SLI and CrossFire, which let you link two video cards together to increase performance, are both well-established technologies, but AMD's Dual Graphics takes the idea one step further. Instead of requiring two discrete video cards, it lets you link a single discrete card with the APU's integrated graphics. The theoretical result is performance greater than either graphics processor can provide by itself.
This technology has some heavy compatibility restrictions, however. According to AMD's Dual Graphics website, the recommended graphics card pairing for each A-series name is:
What makes Dual Graphics confusing is that there is no actual list of compatible video cards (not even an unofficial one), just the list of recommended cards. It is also unfortunate that all of the recommended cards are at least a generation old, and so far AMD has not updated the list to include the newer 7000-series cards. There have been reports on the web that the 7000-series cards should work, since many of them are based on the same technology as the 6000-series, but in our testing we found that not to be true. We tried to get an AMD Radeon HD 7770, an AMD Radeon HD 7570, and an AMD Radeon HD 6750 to work with Dual Graphics, but none of them even gave us the option to enable it.
So, in addition to being a list of recommended graphics card pairings, the chart above appears to also be a list of all the cards that are compatible with Dual Graphics, at least for Trinity-based desktop APUs.
Enabling Dual Graphics is a very easy task. With the AMD APU display driver installed, simply plug your monitor into the discrete video card (this way, if a title does not support Dual Graphics, you will still get the full performance of the discrete card) and enable Dual Graphics under "Performance" in the AMD VISION Engine Control Center. In fact, this was already enabled by default for us, so this step is really a double-check to make sure it is indeed running. Note that Dual Graphics requires DX10 or DX11, so older titles will only be able to use the discrete video card alone.
Dual Graphics Benchmarks
As we stated in the Test Setup section, we will use an Asus Radeon HD 6670 1GB video card for our Dual Graphics testing, paired with our test system using 1866MHz CL9 RAM and the shared memory set to 2GB. This is the fastest natively supported RAM, and we wanted to make sure that the RAM did not cause any bottlenecking during our testing.
We will be running three rounds of benchmarks. One with just the APU's onboard graphics (7660D), one with just the Radeon HD 6670 1GB, and one with Dual Graphics enabled.
Our first benchmark, Batman: Arkham City, doesn't look too great for Dual Graphics. At every resolution/setting, we saw lower performance with Dual Graphics enabled than with the Radeon HD 6670 alone. The lowest settings were hit especially hard, with a ~30% drop in performance.
Hitman: Absolution is a bit of a mixed bag. At the lowest resolution/settings (which is what most users would be running with this configuration) there is again a performance drop (roughly 22%) with Dual Graphics enabled. Interestingly, we did see an improvement at the medium and high settings.
Alien vs. Predator is pretty much a poster child for what we expected with Dual Graphics. Across the board, it increased the FPS by roughly 50%. What is great is that Dual Graphics allows you to comfortably play Alien vs. Predator at medium settings, whereas if you just used the Radeon HD 6670 you would likely notice some stuttering or lagging.
Not every title works with Dual Graphics, and the Unigine Heaven benchmark shows this. Across the board, there was no difference with Dual Graphics enabled compared to using the Radeon HD 6670 alone. This is what is supposed to happen when Dual Graphics is not compatible with a title: the system simply uses the discrete video card exclusively.
Our results are not unique, and many reviews have run into the same performance issues that we saw with Dual Graphics. Hexus.net saw the same performance drop in Batman: Arkham City, but saw a large performance improvement in DiRT Showdown. Xbit Laboratories saw a performance improvement in Alien vs. Predator (roughly the same as what we saw) and Far Cry 2, but saw a performance drop in F1 2012 and no performance change in Borderlands 2. Tom's Hardware saw a performance drop in Metro 2033, an increase in performance in World of Warcraft, and no change in Call of Duty: Modern Warfare 2.
Aggregating all of these results together, we find the following performance changes per title when compared to using the discrete video card alone:
|Improves Performance|No Change|Decreases Performance|
|---|---|---|
|Hitman: Absolution (medium/high)|Unigine Heaven|Batman: Arkham City|
|Alien vs. Predator|Borderlands 2|Hitman: Absolution (low)|
|DiRT Showdown|COD: Modern Warfare 2|F1 2012|
|Far Cry 2| |Metro 2033|
|World of Warcraft| | |
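The compatibility odds quoted in the conclusion come from simply tallying this table. A minimal sketch of that count:

```python
# Title/setting combinations from the aggregated table above.
improves = ["Hitman: Absolution (medium/high)", "Alien vs. Predator",
            "DiRT Showdown", "Far Cry 2", "World of Warcraft"]
no_change = ["Unigine Heaven", "Borderlands 2", "COD: Modern Warfare 2"]
decreases = ["Batman: Arkham City", "Hitman: Absolution (low)",
             "F1 2012", "Metro 2033"]

total = len(improves) + len(no_change) + len(decreases)  # 12 entries
for label, group in (("improves", improves), ("no change", no_change),
                     ("decreases", decreases)):
    print(f"{label}: {len(group)}/{total} = {len(group) / total:.1%}")
```

Which works out to 5/12, 3/12 and 4/12, matching the roughly 41% / 25% / 34% split cited in the conclusion to within rounding.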
In our testing, we found that the largest impact on gaming performance with AMD's Trinity APU comes from the speed of the system RAM. As you would imagine, the faster the RAM, the better the performance. The one caveat is that faster RAM is typically more expensive and has a higher risk of failure, and once you pass the natively-supported 1866MHz mark, that risk dramatically increases. If maximum stability is a concern, we recommend limiting yourself to 1600MHz RAM, but those who are willing to chance a slight increase in failure risk for a few extra FPS can certainly use 1866MHz RAM.
The amount of shared memory had an effect on performance, but the difference was dramatically reduced after 512MB. Especially on the settings that you would actually use when gaming with the onboard graphics of a Trinity APU, the difference is almost non-existent. However, to be safe we recommend setting the shared memory to 1GB.
Dual Graphics is a bit of a mixed bag and is frankly a bit disappointing. In addition to the very limited video card support (only the Radeon HD 6450, 6570, and 6670), our benchmarks and those performed by other reviewers show that you have a 41% chance of improving performance, a 25% chance of not changing performance, and a 34% chance of actually decreasing performance. Of course, if you only play titles that you know benefit from Dual Graphics it is definitely worth looking at, but overall we would recommend avoiding Dual Graphics.
Of course, if you can fit a mid-range discrete video card in your budget (something like an AMD Radeon HD 7770 or NVIDIA GTX 660 Ti), that alone will dramatically improve your gaming performance.