For a number of years now, 1080p (1920x1080) displays have been the norm for both televisions and computers. And while 4k displays may not replace 1080p as the standard for some time, they have recently been gaining in popularity. These displays have roughly four times as many pixels as a standard 1080p display, which results in a huge improvement in picture quality. But since the technology is still relatively new, there are many questions and misunderstandings surrounding it.
Since we have been receiving more and more questions about 4k displays, we decided it was finally time to get our hands on one and perform our own testing. The monitor we will be using for this is the ASUS PQ321Q 31.5-inch UHD Monitor, which runs at 3840x2160 with either a 30Hz or 60Hz refresh rate depending on how it is connected to the computer. This monitor offers a range of connectivity options including DisplayPort in both SST and MST mode, HDMI, and dual HDMI. Unlike with most 1080p monitors, the type of connectivity you use is very important as it determines how you set up the display and what refresh rate it will be able to run at.
Currently, both HDMI and DisplayPort are limited to 30Hz per 4k display, so 60Hz monitors like the ASUS PQ321Q have to be a bit creative to get around this limitation. The way this monitor and others like it achieve 60Hz refresh rates is through the use of either DisplayPort in MST (multi stream transport) mode or dual HDMI cables to run two sub-4k resolution displays side by side on the one monitor. You are essentially splitting the monitor in half right down the middle, with the left side as one display and the right side as a second display. By doing this, both halves are actually running at 1920x2160, which can operate at 60Hz within current DisplayPort and HDMI specifications.
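As a rough sanity check on those numbers, the raw pixel data rate of each mode can be computed directly. This sketch ignores blanking intervals and protocol overhead, so real link requirements are somewhat higher than these figures:

```python
# Rough sketch of why a 4k panel at 60Hz gets split into two halves.
# Raw pixel data only: blanking intervals and protocol overhead are
# ignored, so treat these numbers as illustrative, not exact.

def pixel_data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in gigabits per second."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

full_4k_60 = pixel_data_rate_gbps(3840, 2160, 60)  # whole panel at 60Hz
full_4k_30 = pixel_data_rate_gbps(3840, 2160, 30)  # whole panel at 30Hz
half_4k_60 = pixel_data_rate_gbps(1920, 2160, 60)  # one half-panel "tile"

print(f"3840x2160 @ 60Hz: {full_4k_60:.1f} Gbps")  # ~11.9 Gbps
print(f"3840x2160 @ 30Hz: {full_4k_30:.1f} Gbps")  # ~6.0 Gbps
print(f"1920x2160 @ 60Hz: {half_4k_60:.1f} Gbps")  # ~6.0 Gbps
```

Each ~6 Gbps half-panel stream is in the range a single current link can carry, while the full ~12 Gbps stream is not, which is exactly why the monitor presents itself as two tiles.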
At 4k Resolutions, you can easily fit our website side-by-side three times! Note that this image has not had the DPI adjusted so things like the Start Button and Taskbar icons are at their default size.
To perform our testing, we used the following base hardware:
| Component | Model |
|---|---|
| Motherboard | ASUS Sabertooth Z87 |
| CPU | Intel Core i7 4770K 3.5GHz |
| RAM | 4x Kingston HyperX DDR3-1600 4GB Low Voltage |
| Hard Drive | Samsung 840 Pro 128GB SSD |
| PSU | Silverstone ST1500 1500W Power Supply |
| OS | Windows 7 Ultimate 64-bit w/ SP1 |
| NVIDIA DisplayPort Driver | Beta 326.41 |
| NVIDIA HDMI Driver | 320.49 |
Note that while we were able to use the latest official video driver from Intel and AMD, we had to use a Beta driver from NVIDIA in order to get DisplayPort in MST mode to function correctly. This is due to the fact that unlike Intel and AMD, NVIDIA currently does not have a way to combine two displays into a single seamless display like Intel Collage mode or AMD Eyefinity. There is NVIDIA Surround, but that currently only works with three monitors, not two. This Beta driver gets around this issue by simply detecting when you are using DisplayPort in MST mode and automatically creates a single virtual display instead of two separate displays.
Instead of going over how we set up the monitor for AMD and NVIDIA cards, we are simply going to direct you to the Setup Process portion of PC Perspective's recent Asus PQ321Q review article. We used this article as a basis for much of our process, so we want to be sure to give credit where credit is due.
The one thing we are testing that PC Perspective did not touch on is setting up a 4K monitor with Intel integrated graphics. Luckily, this is very similar to how you set up a 4K monitor with an AMD card, with the only difference being that you enable Intel Collage mode in the Intel Graphics Control Panel rather than creating an Eyefinity Display Group in the AMD Catalyst Control Center. The process is very easy; simply open the Intel Graphics Control Panel, navigate to the Display page, and change the display type to Collage.
Since the primary goal of this article is to determine which cards can be used with a 4K display, we used a variety of NVIDIA and AMD cards as well as the integrated Intel HD 4600 graphics found on the Intel Core i7-4770K CPU. Most of the cards officially support 4K monitors, but one thing we have learned from experience is that even when something is officially supported, it is not always in a usable form.
| Maximum Resolution | Digital Display | DVI | HDMI | DisplayPort |
|---|---|---|---|---|
| Intel HD 4600 (i7-4770K) 1GB shared | - | - | 3840x2160 | 3840x2160 |
| NVIDIA GT 640 2GB | 4096x2160 | 2560x1600 | - | N/A |
| NVIDIA GTX 650 1GB | 4096x2160 | - | - | N/A |
| NVIDIA GTX 660 2GB | 4096x2160 | - | - | - |
| NVIDIA GTX 660 Ti 2GB | 2560x1600 | - | - | - |
| NVIDIA GTX 670 2GB | 4096x2160 | - | - | - |
| NVIDIA GTX 770 2GB | 4096x2160 | - | - | - |
| NVIDIA GTX Titan 6GB | 4096x2160 | - | - | - |
| AMD Radeon HD 7750 1GB | - | 2560x1600 | 4096x2160 | 4096x2160 |
| AMD Radeon HD 7970 3GB | - | 2560x1600 | 4096x2160 | 4096x2160 |
The chart above shows the official maximum resolutions that Intel, AMD, and NVIDIA list for these cards. It gets confusing because NVIDIA just lists the maximum resolution which, based on current HDMI and DisplayPort specifications, should apply to both HDMI and DisplayPort. AMD on the other hand lists each output individually which in our opinion is the much better way to do things. Intel is similar to AMD as they list both the HDMI and DisplayPort maximum resolution, although they do not list DVI.
One very important thing to note is the red spec for the GTX 660 Ti which lists 2560x1600 as the maximum Digital Display resolution. This is strange since every other card we are testing can officially run at 4096x2160. We've long suspected that this is simply a mistake on NVIDIA's part, but until now we have not been able to test it. So in addition to everything else, this is a good chance for us to finally discover if this specification is actually correct or not.
Update 8/28/2013: NVIDIA has finally updated the GTX 660 Ti to have the correct maximum digital resolution of 4096x2160. So in addition to our testing showing that the GTX 660 Ti can do 4k resolutions, it is now officially supported by NVIDIA as well.
4k Monitor Connectivity Options
The first thing we want to do is simply see which video cards work with each of the different connectivity options for this monitor. For this test we are simply checking to see if we can get the full 3840x2160 resolution with either a 30Hz or 60Hz refresh rate depending on the official limitation of the connection type.
|Intel HD 4600 (i7-4770K) 1GB shared|
|NVIDIA GT 640 2GB|
|NVIDIA GTX 650 1GB|
|NVIDIA GTX 660 2GB|
|NVIDIA GTX 660 Ti 2GB|
|NVIDIA GTX 670 2GB|
|NVIDIA GTX 770 2GB|
|NVIDIA GTX Titan 6GB|
|AMD Radeon HD 7750 1GB|
|AMD Radeon HD 7970 3GB|
Starting with DisplayPort in SST (single stream transport) and MST (multi stream transport) mode, the only cards that did not work at all were the NVIDIA GT 640 and the NVIDIA GTX 650, for the simple reason that they do not actually have a DisplayPort connector. The only other issue we had with DisplayPort was when we tried to use Intel Collage mode on the Intel HD 4600 graphics integrated onto the Intel Core i7-4770K. We were able to get the full resolution and refresh rate with DisplayPort in MST mode, but the display was extremely choppy to the point of unusability. While we were using it, it felt like the refresh rate was closer to 20Hz than 60Hz. The strange thing is that this choppiness doesn't occur until you enable Collage mode, so as long as you are fine with using the monitor as two virtual displays side by side rather than a single large display, it actually works great. This issue is very likely driver-based and as such should be fixed in a future driver update. Until then, however, we would only recommend using Intel integrated graphics for 4k displays running at 30Hz.
The only other major issue we had was with dual HDMI. Current HDMI specifications only allow for 30Hz refresh rates at 4k resolutions, so this monitor gives you the option to use two HDMI cables working together to achieve 60Hz. This is actually very similar to DisplayPort in MST mode but is a bit funky since, unlike DisplayPort, HDMI was never intended to be used in this manner. In fact, none of the cards we tested even had dual HDMI ports, so we had to try a variety of DVI to HDMI and DisplayPort to HDMI adapters. Even with all the adapters we tried, we found no cards that worked properly. The AMD cards came the closest to working when we used a passive DVI to HDMI adapter, but even then we had issues. With this adapter we could get the Catalyst Control Center to report that it was running at 3840x2160 with a 60Hz refresh rate, but we saw strange tearing whenever we dragged a window across the middle of the screen.
This issue ended up being caused by the DVI to HDMI adapter. Apparently these types of adapters can only run at 60Hz with a resolution of 1600x1200 or below. Since the adapter was driving a 1920x2160 half of the screen, it was limited to 30Hz even though the Catalyst Control Center reported 60Hz. So while the left half of the screen (with the native HDMI) was running at 60Hz, the right half (with the DVI to HDMI adapter) was actually running at 30Hz. Due to this issue, we highly recommend using DisplayPort over dual HDMI to achieve 4k resolutions at 60Hz whenever possible.
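The 1600x1200 ceiling lines up with single-link DVI's 165 MHz pixel clock limit, which these passive adapters inherit. A quick sketch of the arithmetic (blanking is ignored, so the raw numbers here slightly understate the real requirements):

```python
# Why a passive DVI-to-HDMI adapter tops out at 30Hz for a 1920x2160
# half-panel: single-link DVI carries at most a 165 MHz pixel clock.
# Blanking intervals are ignored, so real limits hit slightly sooner.

DVI_SINGLE_LINK_MHZ = 165

def pixel_clock_mhz(width, height, refresh_hz):
    """Raw pixel clock in MHz for a given mode (no blanking)."""
    return width * height * refresh_hz / 1e6

for width, height, hz in [(1600, 1200, 60), (1920, 2160, 60), (1920, 2160, 30)]:
    clock = pixel_clock_mhz(width, height, hz)
    verdict = "fits" if clock <= DVI_SINGLE_LINK_MHZ else "exceeds single-link DVI"
    print(f"{width}x{height} @ {hz}Hz: {clock:.1f} MHz ({verdict})")
```

1920x2160 at 60Hz needs roughly 249 MHz, well past the 165 MHz limit, while the same tile at 30Hz needs about 124 MHz, which is exactly the fallback behavior we observed.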
4k General Usage Performance
Most reviews and articles about 4k displays tend to focus on gaming over anything else. While this is important, most of the time a display is used for much lighter tasks like watching movies or browsing the web. We often see high end video cards being recommended for use with 4k displays regardless of the usage of the system, so we wanted to find out if a high end card is really necessary when you are only doing light to medium duty tasks.
To find out, we used our lowest-end AMD and NVIDIA video cards that could support 4k resolutions at 60Hz and compared them to the highest-end AMD and NVIDIA video cards. We performed three different tasks while logging the GPU, CPU and RAM loads to make sure that we never came close to maxing out any aspect of the system. In addition, we also simply paid close attention to see if we could subjectively "feel" any difference when using the lower-end cards.
For this testing, we first watched a slideshow of 4k images using the Windows Photo Viewer. These images were found simply by doing a Google Images search with the resolution limited to 3840x2160. Our next test was to watch a 4k movie. To do this, we downloaded the Life in the Garden video from YouTube. We opted to download the video rather than playing it directly from YouTube to eliminate our internet connection from the mix. Finally, we browsed the web with Google Chrome for 15 minutes and visited a wide variety of websites. To do this semi-consistently, we first opened every post from the first two pages of the default Reddit front page, with each post opening into a new tab that we never closed. This allowed us to view a wide range of content including images, cat GIFs, videos, and articles while using more tabs than most users typically do. After that, we browsed our own Puget Systems website for approximately 10 minutes, doing things like configuring a new system, checking out part information pages, and opening hardware articles. We are very familiar with how our own website performs, so any slight hesitation or lag should be readily apparent.
|4k Slideshow||4K Movie||Intensive Web Browsing|
|NVIDIA GTX 660 2GB|
|NVIDIA GTX Titan 6GB|
|AMD Radeon HD 7750 1GB|
|AMD Radeon HD 7970 3GB|
Subjectively, we noticed no performance difference with any of the cards during any of our testing. The GPU load never got above 50% with any of the video cards, although the higher-end cards understandably performed the same functions at a lower load percentage than the lower-end cards. The only potential issue we found was during the 4k web browsing tests, where our logging showed the video cards using as much as 1260MB of video memory. To put this into perspective, this is roughly three times as much video memory as would have been used performing the same tasks on a 1080p monitor.
Since the AMD Radeon HD 7750 only has 1024MB of onboard RAM, this is a potential problem as it is not able to store as much data in its RAM as it ideally would like to. This didn't result in any noticeable performance difference, but if you want to be sure you are having the best possible experience, we would recommend using a video card that has at least 1.5GB of video memory for each 4k display you will be using for these types of tasks.
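The scaling itself is simple pixel arithmetic. A quick sketch (assuming 4 bytes per pixel for 32-bit color; keep in mind that most of the 1260MB we logged was application data like decoded images and tab textures, not the bare framebuffer):

```python
# A 3840x2160 desktop has exactly four times the pixels of 1920x1080,
# so anything stored per-pixel (framebuffers, composited browser tabs,
# decoded images) scales accordingly. Assumes 4 bytes per pixel.

def framebuffer_mb(width, height, bytes_per_pixel=4):
    """Size of one uncompressed frame in mebibytes."""
    return width * height * bytes_per_pixel / 2**20

ratio = (3840 * 2160) / (1920 * 1080)
print(f"Pixel count, 4k vs 1080p: {ratio:.0f}x")  # 4x
print(f"One 1080p frame: {framebuffer_mb(1920, 1080):.1f} MB")
print(f"One 4k frame:    {framebuffer_mb(3840, 2160):.1f} MB")
```

The bare framebuffer is only ~32MB at 4k, so the 1260MB figure is dominated by per-pixel application surfaces; the roughly 3x (rather than 4x) growth we measured reflects the fixed overhead that doesn't scale with resolution.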
4k Gaming Performance
There are plenty of benchmarks already available for a variety of games running at 4k resolutions, so we wanted to approach the question of gaming at 4k resolutions a bit differently. Instead of benchmarking a ton of games and video cards, we are instead going to compare how gaming on a 4k display compares to a standard 1920x1080 display. To see how much more demanding running a game at 4k resolutions is, we are going to run three benchmarks on the NVIDIA GTX Titan and the AMD Radeon HD 7970. If you would like to see benchmarks for more games, we recommend checking out the Tom's Hardware article Gaming At 3840x2160.
The settings for each benchmark were determined by adjusting them until the average FPS was about 30 FPS with the NVIDIA GTX Titan running at 3840x2160 at 60Hz. These settings ended up being:
|Benchmark Settings||DiRT Showdown||Hitman: Absolution||Unigine Heaven 4.0|
Starting with the NVIDIA GTX Titan, we saw a huge drop in performance just by upping the resolution to 3840x2160. Unigine Heaven had the biggest drop, going from 120 FPS to just 31.8 FPS, which is roughly a 74% drop in performance! DiRT Showdown and Hitman: Absolution also showed significant drops, although they were both closer to a 60% drop in performance.
Our benchmarks with the AMD Radeon HD 7970 at first glance look very similar to the NVIDIA GTX Titan benchmarks. In fact, the percentage drop in performance for Unigine Heaven is almost identical at 74%. However, with Hitman: Absolution we saw something different: a 67% drop in performance rather than the 55% drop we saw with the NVIDIA GTX Titan. The reason for this lies in the fact that, along with the higher demand on the GPU itself, running a game at 4k resolutions also requires much more video memory. In the case of Hitman: Absolution, the game actually wanted to use a bit more video memory than the AMD Radeon HD 7970 has available. The shortfall was small, so the performance hit wasn't as bad as it could have been, but it was enough to show up in our results.
What our testing shows is that simply upgrading to a 4k display should lower your performance in games by around 60% assuming you have enough video memory. So if you are running a game at 60FPS on a 1920x1080 display right now and just increase the resolution to 3840x2160 with otherwise the exact same settings, you should expect the FPS to drop to around 24 FPS. If you do not have enough video memory available, this number can easily drop even lower.
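The projection above is straightforward arithmetic; a small sketch using our measured numbers:

```python
# Projecting 4k frame rates from 1080p numbers using the ~60% drop we
# measured. This is a rule of thumb, not a guarantee: VRAM-limited
# cards (like the HD 7970 in Hitman: Absolution) will drop further.

def percent_drop(fps_1080p, fps_4k):
    """Performance loss when moving from 1080p to 4k, as a percentage."""
    return (fps_1080p - fps_4k) / fps_1080p * 100

def projected_4k_fps(fps_1080p, drop_percent=60):
    """Expected 4k frame rate given a 1080p baseline and assumed drop."""
    return fps_1080p * (1 - drop_percent / 100)

print(f"Unigine Heaven: {percent_drop(120, 31.8):.0f}% drop")       # ~74%
print(f"60 FPS at 1080p -> {projected_4k_fps(60):.0f} FPS at 4k")   # 24
```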
Edit 8/8/2013 - We've had some questions about the performance difference between a monitor running at 30Hz and one running at 60Hz. Refresh rate should not affect benchmarking performance, but we re-ran some of our benchmarks with the monitor connected via DisplayPort in SST mode (30Hz refresh rate) just to confirm. Our testing showed that while games certainly look better at 60Hz, as far as benchmarking goes the refresh rate does not affect the FPS performance.
AMD vs. NVIDIA 4K Display Quality
One strange report we've heard many times since the advent of 4k displays is that AMD video cards output clearer text than NVIDIA cards. This has always been a source of confusion for us since the quality of text and other non-3D graphics is generally determined by the monitor rather than the video card. To see if we could validate these reports, we took high-resolution photographs (not just screenshots) of the screen running at full 4k resolution using DisplayPort in MST mode. For this test we used four video cards: the AMD Radeon HD 7750, AMD Radeon HD 7970, NVIDIA GTX 660 and NVIDIA GTX Titan.
|AMD Radeon HD 7750||AMD Radeon HD 7970|
|NVIDIA GTX 660||NVIDIA GTX Titan|
|Comparison of image quality between AMD and NVIDIA video cards|
This is a very close up shot of one of the navigation tabs for our Serenity systems and includes both text and a small graphic. As you can see, there is no difference in the text or graphic quality between AMD and NVIDIA cards.
We also performed this test with the DPI set to 140 pixels per inch (matching this monitor's actual pixel density), and while everything gets bigger as it should, the quality between AMD and NVIDIA cards is still identical. However, one major discovery we made is that if you have an AMD video card and change the DPI settings, the driver automatically increases the zoom level for internet browsers like IE, Chrome, and Firefox. In our case, when we set the DPI to 146% (140 pixels per inch), IE was automatically set to 145% zoom and Chrome was automatically set to 150% zoom. So the reports of clearer text with AMD cards are very likely simply the result of internet browsers automatically having their zoom level increased, which makes text and images on webpages much larger and easier to read.
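For reference, the 146% figure falls out of Windows' 96 PPI scaling baseline. A quick sketch using the zoom values we observed (the 145% and 150% figures are the driver-applied zooms from our testing, simply compared against the computed scale):

```python
# Where the 146% DPI setting comes from: Windows treats 96 pixels per
# inch as 100% scaling, so a 140 PPI panel works out to roughly 146%.

WINDOWS_BASELINE_PPI = 96   # Windows' 100% scaling reference
panel_ppi = 140             # ASUS PQ321Q

scale_percent = panel_ppi / WINDOWS_BASELINE_PPI * 100
print(f"DPI scale factor: {scale_percent:.1f}%")  # ~145.8%, shown as 146%

# The zoom levels the AMD driver applied bracket this value:
print(f"IE zoom (observed):     145%  (off by {abs(145 - scale_percent):.1f})")
print(f"Chrome zoom (observed): 150%  (off by {abs(150 - scale_percent):.1f})")
```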
A few other issues we ran into during testing:

- When using DisplayPort in MST mode, NVIDIA cards fail to POST about 50% of the time. When this happens, you have to hit the reset button on the computer (powering off and back on does not work) in order to POST successfully. PC Perspective also had this issue and reported it as a problem with the STMicro firmware on ASUS motherboards. This should be fixed in a firmware update in the near future.
- With NVIDIA and Intel graphics in DisplayPort MST mode, the BIOS and POST screens are not displayed on our ASUS Sabertooth Z87 motherboard. DisplayPort in SST mode and single HDMI, however, work fine. This issue does not occur with AMD video cards.
- Also related to DisplayPort in MST mode: before the NVIDIA driver is installed, you only get a black screen when the OS boots. You have to switch to SST mode, install the NVIDIA driver, then switch back to MST mode.
- Finally, we initially had some problems when testing HDMI that ended up being due to the cable not being HDMI 1.4a compliant. Unlike SATA cables, HDMI cables vary in what they support, so make sure yours is compliant with the latest specification.
Overall, we are very impressed with the picture quality on the ASUS PQ321Q. It is still too early to tell when or if 4k displays will replace 1080p displays, but if you need a high quality display and have a large enough budget, 4k displays are certainly the way to go. But like any new technology, there are a number of things that anyone considering a 4k display should take into consideration.
First, current driver and firmware revisions make AMD video cards a much, much more attractive choice than either Intel integrated or NVIDIA graphics if you want to use a 4k display with a 60Hz refresh rate. The problems we saw with both Intel and NVIDIA should be fixed soon, but at the time of this article AMD is simply less problematic and easier to set up. On the other hand, if you are only using a 30Hz 4k display, none of the issues we saw will even be a factor, so in that case you should simply go with the video card that best matches your needs.
Second, if you are trying to decide whether a more expensive 60Hz 4k display is worth it over a 30Hz display, consider what you will be using it for. If it is primarily for watching 4k movies, a 30Hz display is likely just fine as most 4k movies are only played at 24 FPS. For almost anything else, however, the 30Hz refresh rate is going to be very noticeable. One tool we've found useful for comparing refresh rates is the Frames Per Second website. Simply set the baseballs to 60 and 30 FPS, turn off motion blur, and play around with the velocity to see how much choppier a 30Hz display is compared to a 60Hz display.
Next, whenever possible we recommend using DisplayPort in MST mode to connect a 60Hz 4k display to your computer. If your display gives you the option to use dual HDMI, we strongly suggest not doing so. The fact that we couldn't get dual HDMI to work correctly on any of our test cards clearly shows how unreliable that type of connection currently is.
Finally, if you are not gaming, plan on needing at least 1.5GB of video memory for each 4k display you will be running, unless your system is solely used for light tasks such as showing pictures or playing movies; in that case, 1GB of video memory per display should be more than sufficient. If you are gaming, the amount of video memory you need per display will depend heavily on the game but will almost certainly be at least 2GB. To be safe we recommend having at least 4GB of video memory per display, but if you play games like Skyrim with lots of mods you may find that you need as much as 6GB.
In conclusion, we absolutely loved using a 4k display and will be petitioning our company President to immediately replace all of the monitors in our office with 4k displays. Unfortunately, given that the ASUS PQ321Q currently goes for about $3500, that proposal is not very likely to be approved. But if a 4k display is within your budget, we highly recommend considering one the next time you are in the market for a new monitor.