GTX 1070 and GTX 1080 Premiere Pro Performance

Written on June 11, 2016 by Matt Bach

Introduction

Whenever new video cards are released, the very first thing our customers tend to ask is "how fast are they?" This is a great question, especially since all the online reviews tend to focus on their performance in video games rather than in professional applications. Today we are going to look specifically at Premiere Pro to see just how well the GeForce GTX 1070 and GTX 1080 perform with a variety of footage codecs and resolutions.

With Premiere Pro, Adobe utilizes the Mercury Playback Engine, which uses the video card to vastly improve the performance of certain features. This provides a tremendous boost to performance, but it adds more complexity to the question of "what hardware do I need?" since you need to take into consideration how many accelerated effects you use. In this article, we want to explore the performance differences between the new GTX 1070 and GTX 1080 and the previous generation GTX 980 Ti and GTX Titan X using a variety of different codecs and resolutions.

Test Setup

For our test system, we used the following hardware:

Since Premiere heavily utilizes the CPU in addition to the video card, we will actually be testing with two different systems. The first is a high-end dual Xeon system with two E5-2687W V4 CPUs. The second uses a Core i7 6950X, which is a more typical CPU choice for Premiere Pro.

The different video cards we will be testing are the GTX 1070, GTX 1080, GTX 980 Ti, and GTX Titan X - each in both single and dual GPU configurations.

To help with consistency - and since the benchmarks we performed ran for several days - we programmed a custom script using AutoIt to start Premiere Pro, load the relevant project, time how long it takes to export the timeline with the appropriate settings (or to generate previews), close Premiere Pro to clear any data from the system RAM, and then repeat with the next project file.
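
For reference, a heavily simplified sketch of that loop is below. This is only an illustration of the harness structure in Python - the real script was written in AutoIt and drove Premiere Pro's Export dialog directly - and the executable path, project list, and the stubbed-out GUI step are hypothetical placeholders rather than our actual benchmark code.

    import subprocess
    import time

    # Hypothetical paths - substitute your own Premiere Pro install and test projects.
    PREMIERE_EXE = r"C:\Program Files\Adobe\Adobe Premiere Pro CC 2015\Adobe Premiere Pro.exe"
    PROJECTS = [r"D:\benchmarks\1080p_h264.prproj", r"D:\benchmarks\4k_prores.prproj"]

    def run_export_via_gui():
        # Stand-in for the GUI automation the real AutoIt script performed:
        # open Export Settings, pick the preset, start the export (or preview
        # render), and wait for the progress dialog to close.
        pass

    for project in PROJECTS:
        premiere = subprocess.Popen([PREMIERE_EXE, project])  # launch Premiere with the project
        time.sleep(120)                                       # give the project time to load
        start = time.monotonic()
        run_export_via_gui()                                  # export the timeline or generate previews
        elapsed = time.monotonic() - start
        print(f"{project}: {elapsed:.1f} s")
        premiere.kill()                                       # close Premiere to clear the system RAM
        premiere.wait()                                       # before moving on to the next project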

The files we will be testing with came from a variety of sources:

  • 1080p H.264/CineForm - provided by Jerry Berg (Barnacules Nerdgasm - YouTube)
  • 4K H.264/CineForm - provided by Jerry Berg (Barnacules Nerdgasm - YouTube)
  • ProRes 4K - provided by Grant Petty (Blackmagic Design Forum, available for public download)

In order to make our testing as accurate as possible, we used relatively simple timelines for our testing in this article. In the past, we've loaded up on accelerated effects to show the maximum difference between cards, but we found that this was not representative of real-world performance gains. Our test timelines consisted of:

  • 4-5 clips arranged in series to make a 60-second timeline
  • A basic transition applied to each clip
  • A Lumetri color correction effect applied to each clip
  • A vector-based logo graphic added to the bottom corner of the footage

Exporting to 1080p - Single GPU

While more and more people are starting to shoot in 4K and higher resolutions, 1080p is still by far the most common resolution to export to. Because of this, we thought we would start our testing by looking at how Premiere is able to utilize a single GPU when exporting from a variety of resolutions and codecs to H.264 1080p:

2x Intel Xeon E5-2687W V4

Intel Core i7 6950X

Right off the bat, we are getting some very interesting results. With some codecs (1080p H.264 especially), the GTX 1070 and 1080 are much faster than the older GTX 980Ti and Titan X. In fact, on the Dual Xeon with 1080p H.264 source footage, the GTX 1070 is about 9% faster than a GTX 980Ti and 3% faster than a Titan X. The GTX 1080 is even better, clocking in at over 13.5% faster than a 980Ti and 7% faster than a Titan X!

However, for the other file formats we only saw minimal performance gains from the new cards. Overall, on the Dual Xeon system the GTX 1070 was actually a hair slower than the 980Ti, although it was still about 2% faster than the Titan X. The GTX 1080, however, was much better at about 3.5% faster than a 980Ti and 6% faster than a Titan X.

Unfortunately, the story isn't quite as good with the Core i7 system. In this case, the GTX 1070 is actually a bit slower than the older cards by about 2-2.5%. The GTX 1080 is still a hair faster than the GTX Titan X, but only by about 2%.

Note: You may notice that the charts above (and the dual GPU charts in the next section) often show faster render times on the Core i7 system than on the Dual Xeon system. This is not a mistake! If you read our Adobe Premiere Pro CC 2015 Multi Core Efficiency (Update 1) article, you will see that when exporting to 1080p, Premiere Pro is not very good at using a high number of CPU cores and will at times actually see a drop in performance with two physical CPUs versus just a single CPU. Since the Core i7 6950X has an all-core Turbo frequency that is 0.2 GHz higher than the dual Xeon E5-2687W V4 CPUs, this actually makes the Core i7 6950X faster when exporting to 1080p.

Exporting to 1080p - Dual GPU

2x Intel Xeon E5-2687W V4

Intel Core i7 6950X

With two video cards, our results are quite a bit different than what we saw with a single GPU. This time, both the GTX 1070 and 1080 were about .5% faster than the GTX 980Ti and Titan X. This isn't much, and honestly is close enough that we would deem it a wash between the four cards.

On the Core i7 system, however, the two newer cards performed great. The GTX 1070 was about 1.5-2% faster than the GTX 980Ti and Titan X. The GTX 1080 was even better and outperformed the two older cards by 4-4.5%.

The good thing to note is that if you compare these raw results to the results from our single GPU test, it becomes clear that utilizing two video cards when exporting to 1080p can result in great performance gains. On average, we saw about a 20% jump in performance on the Dual Xeon system, and almost a 50% increase in performance on the Core i7 system. So while the difference between the pairs of cards we tested isn't much in many situations, simply having two cards in the first place works nicely when exporting to 1080p.

Exporting to 4K - Single GPU

Although 4K isn't as widespread as 1080p quite yet, more and more Premiere Pro users are either exporting to 4K or are considering exporting to 4K in the near future. Exporting to higher resolutions is more taxing on the system, so it will be interesting to see if the difference between each video card changes when exporting to 4K versus exporting to 1080p:

2x Intel Xeon E5-2687W V4

Intel Core i7 6950X

There were some odd results in this testing - primarily stemming from a couple of times where the GTX Titan X performed much slower than we expected. Overall, on the Dual Xeon system the GTX 1070 was a bit slower than the GTX 980Ti, but about 7.5% faster than the GTX Titan X. The GTX 1080 was similar, although it was about .5% faster than the GTX 980Ti and 8.5% faster than the GTX Titan X.

When using just the single CPU in the Core i7 system, however, the new cards performed much better. On this system, the GTX 1070 was about 8.5% faster than the older cards. The GTX 1080 was even faster, beating the older cards by about 10%.

Exporting to 4K - Dual GPU

2x Intel Xeon E5-2687W V4

Intel Core i7 6950X

With two video cards, there isn't really much to discuss. On the Dual Xeon system the newer cards were about 1.5% faster than the older cards, but on the Core i7 system they were about .5% slower.

If you again compare the single GPU results to the results in this section, you will notice that using two cards is indeed faster than just using a single card when exporting to 4K. Unfortunately, this time dual GPUs were only about 2.5% faster on the Dual Xeon system and about 10% faster on the Core i7 system.

Render Previews - Single GPU

2x Intel Xeon E5-2687W V4

Intel Core i7 6950X

When rendering previews, we saw some interesting results because - for whatever reason - the GTX Titan X was often noticeably slower than the other cards. On the Dual Xeon system, this made the GTX 1070 about 8% faster than the GTX Titan X, although it was only roughly equal to the GTX 980Ti. The GTX 1080, on the other hand, was about 2% faster than the GTX 980Ti and about 10.5% faster than the Titan X.

On the Core i7 system, however, the GTX Titan X was right back in line. Because of this, the GTX 1070 was about 1% faster than the GTX 980Ti and roughly equal to the GTX Titan X. The GTX 1080 was about 4% faster than the GTX 980Ti and about 3% faster than the GTX Titan X.

Render Previews - Dual GPU

2x Intel Xeon E5-2687W V4

Intel Core i7 6950X

We saw pretty wide fluctuations in performance between each test, but on the Dual Xeon system the GTX 1070 was on average about 5% faster than the GTX 980Ti and about 15% faster than the GTX Titan X. For whatever reason, the GTX 1080 was actually a bit slower, clocking in at about 2.5% faster than a GTX 980Ti and about 12% faster than a GTX Titan X.

The results were not quite as impressive with the Core i7 system, but the GTX 1070 was still about 3.5% faster than the GTX 980Ti. The GTX 1080, on the other hand, was about 8.5% faster than the GTX 980Ti and about 4% faster than the GTX Titan X.

Once again, the big thing we saw was an excellent speedup when going from one GPU to two. In the case of rendering previews, we saw about a 30% increase in performance on the Dual Xeon system and a 40% increase in performance on the Core i7 system.

Conclusion

Summarizing all our results, we saw the following performance gains over the old cards on the two systems we tested with:

2x Xeon E5-2687W V4 - Average Performance Difference

Exporting to 1080p
  Single GPU - GTX 1070: .5% slower than 980Ti, 2% faster than Titan X
  Single GPU - GTX 1080: 3.5% faster than 980Ti, 6% faster than Titan X
  Dual GPU - GTX 1070: .5% faster than 980Ti, .5% faster than Titan X
  Dual GPU - GTX 1080: .5% faster than 980Ti, .5% faster than Titan X

Exporting to 4K
  Single GPU - GTX 1070: 1% slower than 980Ti, 7.5% faster than Titan X
  Single GPU - GTX 1080: .5% faster than 980Ti, 8.5% faster than Titan X
  Dual GPU - GTX 1070: 1% faster than 980Ti, 2% faster than Titan X
  Dual GPU - GTX 1080: 1.5% faster than 980Ti, 2% faster than Titan X

Render Previews
  Single GPU - GTX 1070: .25% faster than 980Ti, 8% faster than Titan X
  Single GPU - GTX 1080: 2.5% faster than 980Ti, 10.5% faster than Titan X
  Dual GPU - GTX 1070: 5% faster than 980Ti, 15% faster than Titan X
  Dual GPU - GTX 1080: 2.5% faster than 980Ti, 12% faster than Titan X

Average
  Single GPU - GTX 1070: .5% slower than 980Ti, 6% faster than Titan X
  Single GPU - GTX 1080: 2% faster than 980Ti, 8.5% faster than Titan X
  Dual GPU - GTX 1070: 2% faster than 980Ti, 6% faster than Titan X
  Dual GPU - GTX 1080: 1.5% faster than 980Ti, 5% faster than Titan X

Core i7 6950X - Average Performance Difference

Exporting to 1080p
  Single GPU - GTX 1070: 2% slower than 980Ti, 2.5% slower than Titan X
  Single GPU - GTX 1080: 2% faster than 980Ti, 1.5% faster than Titan X
  Dual GPU - GTX 1070: 2% faster than 980Ti, 1.5% faster than Titan X
  Dual GPU - GTX 1080: 4.5% faster than 980Ti, 4% faster than Titan X

Exporting to 4K
  Single GPU - GTX 1070: 8.5% faster than 980Ti, 8.5% faster than Titan X
  Single GPU - GTX 1080: 10% faster than 980Ti, 10% faster than Titan X
  Dual GPU - GTX 1070: 1% slower than 980Ti, .5% slower than Titan X
  Dual GPU - GTX 1080: .5% slower than 980Ti, equal to Titan X

Render Previews
  Single GPU - GTX 1070: 1% faster than 980Ti, equal to Titan X
  Single GPU - GTX 1080: 4% faster than 980Ti, 3% faster than Titan X
  Dual GPU - GTX 1070: 3.5% faster than 980Ti, 1% slower than Titan X
  Dual GPU - GTX 1080: 8.5% faster than 980Ti, 4% faster than Titan X

Average
  Single GPU - GTX 1070: 2.5% faster than 980Ti, 2% faster than Titan X
  Single GPU - GTX 1080: 5.5% faster than 980Ti, 5% faster than Titan X
  Dual GPU - GTX 1070: 1.5% faster than 980Ti, equal to Titan X
  Dual GPU - GTX 1080: 4% faster than 980Ti, 2.5% faster than Titan X

If there is one thing we learned in our testing, it is that the performance difference between the cards can change drastically depending on what source codec and resolution you use, as well as what resolution you are exporting to. So if you tend to only work with one of the codecs we specifically tested, we highly recommend ignoring these averages and looking solely at that one test.

Averaging everything out, however, we found that:

  • The GTX 1070 is about 1% faster than the GTX 980Ti and about 4% faster than the Titan X.
  • The GTX 1080 is about 4% faster than the GTX 980Ti and about 6.5% faster than the Titan X.

Again, this is only an average so it won't be true in all cases. Sometimes the newer cards were actually slower than the older ones; other times they were as much as 20% faster. Also, keep in mind that the more GPU-accelerated effects you use, the larger the difference should become - so if you tend to use a lot of accelerated effects, the GTX 1070 and 1080 should provide an even larger benefit than what we showed in this article.

One last thing we want to point out is that, while this article is primarily about looking at the performance of the GTX 1070 and 1080, one thing we did find was that dual GPU configurations can often work really well for Premiere Pro. We didn't see much of a gain when exporting to 4K (only 2.5-10% better performance), but exporting to 1080p and rendering previews was anywhere from 20% to 50% faster with two video cards versus just one.

Tags: GTX, 1070, 1080, Premiere, Performance
Greg Greenhaw

Can you try these tests with davinci resolve?

Posted on 2016-06-13 15:32:09

Resolve is on our list of software to test, but unfortunately right now we have a couple tradeshows coming up that require us to spend time on other software first.

The biggest hurdle with Resolve is that we don't have anyone on staff right now that is more than passingly familiar with it, so before we can do any in-depth testing we are going to have to devote a lot of time to simply learning it and making sure that what we test is applicable to the professionals actually using the software. If you have any suggestions on what we should include in our testing (specific effects, different actions, etc.), however, let us know! That kind of information would really help jump-start the process. Realistically, I don't expect us to be able to get to Resolve for another month or two though.

Posted on 2016-06-13 21:24:25
Greg Greenhaw

Read the configuration guide on their website; then you can use this project.

Here is one popular performance test.
http://www.carousel.hu/stan...

Posted on 2016-06-13 23:01:19
Jeff Stubbers

For those interested, we do now have a couple articles up regarding DaVinci Resolve.

DaVinci Resolve 14 GPU Scaling Analysis (1-4x Titan Xp)
https://www.pugetsystems.co...

DaVinci Resolve 14 CPU Performance: Skylake-X vs Threadripper
https://www.pugetsystems.co...

These can also be found under our Publications > Articles section of our site.

I hope this helps!

Posted on 2018-01-18 19:30:31
Przemyslaw Sroka

Hey! So, do the CUDA cores not count? The GTX 1070 has far fewer CUDA cores than the 980 Ti, for example (1920 vs 2816), and is still going strong. Please tell me how that is?

Posted on 2016-06-14 10:45:54

The number of CUDA cores is really only about a third of the equation. The other two parts are the operating frequency where the 1070 runs at 1506-1683 MHz while the 980 Ti runs at 1000-1075 MHz. That is about a 50% increase in frequency, so each of those cores on the 1070 is theoretically 50% faster than the cores on the 980Ti. Just looking at the number of cores and frequency, this makes the 1070 and 980Ti roughly equal in terms of theoretical performance.
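
As a rough back-of-the-envelope illustration using the numbers above: 1920 cores x ~1.68 GHz works out to roughly 3,200 "core-GHz" for the GTX 1070, versus 2816 cores x ~1.08 GHz, or roughly 3,000 "core-GHz", for the 980 Ti - so on paper the two land in the same ballpark, and the remaining difference comes down to the newer architecture.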

The last factor is simply the newer architecture the GTX 1070 uses. I don't know the details off the top of my head, but newer architectures can be much more efficient than older ones. I believe the GTX 1070/1080 architecture changes were mostly about lowering power consumption and increasing the raw frequency, but I'm sure there were plenty of performance enhancements as well.

So the increased frequency and the newer architecture are really the two main things that allow the GTX 1070 to outperform the GTX 980Ti.

Posted on 2016-06-14 17:01:36
Przemyslaw Sroka

Just as I thought. Thanks for the comprehensive answer, you helped me a lot!

Posted on 2016-06-15 07:37:14
Matt James

Impressive detail BTW. Your site has been very useful.

Posted on 2016-07-21 23:59:13
Milind Gavkar

Is this true for CPUs as well? Generally, AMD CPUs have higher frequencies compared to Intel.

Posted on 2017-02-17 08:37:32

With both CPUs and GPUs you cannot compare things like core count and clock speed directly across different brands... or even different models / generations. That is part of why we do so much testing: to see what the real-world performance of various processors ends up being in commonly used applications.

Posted on 2017-02-17 16:45:43
Milind Gavkar

Thanks a lot for clarifying tech things!

Posted on 2017-02-18 18:57:26
Lukedriftwood

Thank you for the test, can you check if GTX1080/1070 support 10bit OpenGL via DisplayPort? I know 10bit DirectX is supported since GTX9XX, curious to see if GTX10XX series added 10bit in professional OpenGL environment.

Posted on 2016-06-15 00:24:27
Dragon

Add me in on that question. Probably not the case, as 10-bit OpenGL has been the big reason to charge lots more for Quadro cards, but maybe with HDR NVIDIA changed their tune. It certainly would be good to know that answer.

Posted on 2016-06-16 19:57:19
i7Baby

I'd like to see a comparison with AMD cards eg http://www.anandtech.com/be...

Looks like AMD cards work better than nVidia

Posted on 2016-06-15 06:01:24
FJGC

I second the motion. This article is great, and I'm dying to find out how well the cheap RX480 8gb performs in this area. Can't really afford the more beastly NVIDIA options.

Posted on 2016-07-04 05:12:57
'Ainoa Manuia

Same here. Note that the Sony Vegas Pro 13 benchmark may be OpenCL accelerated, and AMD cards favor the OpenCL API.

Posted on 2016-08-09 21:01:56
i7Baby

As well as including AMD cards in a future study, different editing software packages would be good too.

Posted on 2016-08-09 22:15:48
Fazizi Fauzi

Thanks, I was looking for this everywhere. It looks like the GTX 1070 is just enough for my workstation (upgrading from a GTX 960) and I'm not gaming much these days. I could save some money for other things; a 5-10% improvement on 1080p or 4K is just not worth another 1000 bucks (Malaysian Ringgit).

Posted on 2016-06-15 18:40:26
'Ainoa Manuia

Thank you for this great, data-packed performance article.

Posted on 2016-06-16 11:01:26
Greg Greenhaw

Will you have a thunderbolt 3, 3x GPU @ 16x option available soon? I would like one that is totally m.2 based and portable.

Posted on 2016-06-20 17:30:24
joms

Am I getting this right? For Premiere, the speed increase of the GTX 1080 vs the 1070 is very minimal considering the price it commands. I am only a hobbyist learning Premiere and I was looking at getting the GTX1080 but now i'm just considering the 1070 since the performance gain of the 1080 over the 1070 is only very minimal. I thought I would see a jump of around +60% if I get the much faster GTX1080. (By the way, I plan to edit 4k home videos on my i7-6800K/32GB ram/Samsung 950 SSD/Benq BL3201PH 4K monitor).

Am I right in just getting the GTX 1070 instead of the GTX1080 since the speed increase is only less than 5% if go with the GTX1080? Will this also hold true if I also use After effects (I plan to learn it as well down the line). Note: I am not a heavy Pro user. Only a hobbyist editing home videos. (My current vcard is a GTX 660Ti)

Posted on 2016-06-25 23:22:01

It sounds like a GTX 1070 would be a great card for you. The GTX 1080 is really only worth it if you need the absolute best performance (for some professionals, a few percent faster is well worth the investment).

After Effects is a hard one. They just added GPU acceleration to a few effects in the update last week, but we haven't had a chance to do any performance testing quite yet. I believe it is only for a few things like blurs and color correction, though, so if you don't use those in AE then the GPU won't really matter for you right now. Adobe is obviously making efforts to add acceleration, however, so it wouldn't be a bad idea to have a decent video card for the future. With that said, I think a GTX 1070 should fare very well in AE, both for the few accelerated effects now and when more acceleration is added in the future.

Posted on 2016-06-25 23:39:36
joms

Thanks for the reply. Compared to my current GTX 660Ti, I would see a big difference in performance if I get the GTX1070 right? (with regards to Premiere and After effects). Or would it also be a small 5% performance increase which in this case, I'd just use my old 660Ti. (Note: I don't play any games. Only 4K video editing)

Posted on 2016-06-26 00:17:33

I would expect a pretty good jump in performance. How much is a really hard question to answer though. If I had to make up a number, maybe somewhere around 30-50% faster or so. It is going to depend on how many of the effects you use are able to use the GPU though, so don't hold me to that number! I could be way, way off, but that is my best guess.

Posted on 2016-06-26 00:23:52
Ray

I am running the following system and I am not seeing my graphics card being used at all in MSI Afterburner; neither the GPU % nor the GPU core clock moves above its idle value.

Intel i7-5820k @ 4.7 OC (Stable on a 24 hour stress)
Asus x99-Deluxe II Motherboard
Corsair Vengence @ 32 GB CAS 15 RAM
MSI GTX-1080 @ 2113 OC (Stable on a 24 hour stress)

I have the Mercury Playback Engine GPU Acceleration (CUDA) option selected in the project file, but it does not appear as though the graphics card is actually being used. Any help would be greatly appreciated.

-Ray

Posted on 2016-07-01 20:20:30

It may just be that you are not using any effects that can take advantage of the GPU. Even just exporting to H.264 should use the GPU, however, so that is probably pretty unlikely. However, it may be that what you are using only minimally uses the GPU so the video card doesn't need to ramp up much at all. I would try putting something like a Lumetri Color Correction on a clip and seeing if the GPU load increases when scrubbing or exporting that clip.

Posted on 2016-07-01 20:28:20
Ray

I have been reading a bit since I posted and it looks like exporting H.264 does not use CUDA unfortunately.

I will try Lumetri Color Correction to see if it makes any difference to GPU usage.

What I generally do is load a few clips into Premiere Pro CC and cut them up to the flow that I like. I then Dynamic Link to AE CC and do my color correction with CC color correction as I like this better; I then use Keylight to key out my green screen and then save and close. Going back to PP CC, my file has been transformed and all I do from there is export H.264 with the YouTube 4K settings. Maybe none of this takes advantage of the GPU?

Posted on 2016-07-01 20:53:11
a

You need to add your card model, "GTX 1080", to the Premiere Pro program folder (under C:):

C:\Program Files\Adobe\Adobe Premiere Pro CC

Open the cuda_supported_cards.txt file and then add your card's model name: "GTX 1080"

Posted on 2016-08-30 18:11:39
Michael Henderson

Any chance you could do some testing with 6K+ footage? I work with a lot of very high-res image sequences of animations, some approaching 10K, and I'm curious if more cores ever actually catch up with the faster cores.

Posted on 2016-07-04 01:05:38

We will be testing RED 4K and 6K (hopefully) soon. There is an NVIDIA driver bug right now that is preventing us from testing RED footage with the new cards, so we opted to wait until a fix is released. Once we can, we will update this article with those results.

Posted on 2016-07-04 01:13:25
Matheus Siqueira

Thanks for the test! Will be waiting to see how the 1070 compares to the 1080 in Davinci Resolve! Cheers!

Posted on 2016-07-05 21:49:05
TechTac

It would be really great if you can compare it with the new AMD RX 480 opencl rendering time in Premiere Pro

Posted on 2016-07-15 23:58:40
Harold Mayo

How do these cards stand up to current workstation cards from AMD & nVidia? Currently 70% of my work is in Civil 3D & 30% is Photoshop.

Posted on 2016-07-25 02:15:50
Joshua Manor

Isn't the Samsung 850 Pro 512GB SATA 6Gb/s SSD creating a huge bottleneck?! SATA limits the data to ~550MB/s. Maybe set up a RAM disk so the storage doesn't come into play?

Posted on 2016-08-03 17:00:43

We've tested these timelines with up to an Intel 750 (about 2.4GB/s read, 1.2GB/s write) and there was no difference compared to a Samsung 850 Pro. So at least for what we tested in this article, the storage drive shouldn't have any impact on the results.

Posted on 2016-08-03 17:12:23
Joshua Manor

Good to know. Thanks for the info.

Posted on 2016-08-03 19:28:33
Richard Vala

Hey Matt,

I picked up a GTX 1080 for my HP workstation 420... but the only issue is that I'm getting a CUDA unsupported message from both Adobe and Avid. Can you please let me know what needs to be done so I can enjoy the power of this card? The drivers are fully up to date. Not sure what else I can do.

Thank you,
-Richard

Posted on 2016-08-05 01:38:42

I can really only speak for Premiere (we haven't done really in-depth testing of Avid yet), but you will get that message on older versions - CS6 and older, I believe. It is just because back then they really wanted to encourage the use of Quadro cards. These days, Adobe is much less picky about which cards you can use. Either way, I wouldn't worry about that warning as long as it actually enables the Mercury Playback Engine. If you want to be 100% sure the GPU is being used, you could download a program like GPU-Z to ensure the GPU is being loaded.

Posted on 2016-08-05 19:30:31
Richard Vala

I'm currently on the most up-to-date version of Adobe. Adobe AE says (under the GPU information [preview preference] Device 1 (GeForce GTX 1080)(unsupported). I did a comp test render and nothing has changed in terms of how fast it gets through the comp compared to my old video card. Same with Adobe Encoder... no change in render time even though the CUDA option is selected.

With the GPU-Z, what setting do I check to see if the GPU has loaded?

I've updated my bios firmware as well... that didn't help. Would installing Windows 10 Pro solve the problem? I'm currently on Win7 Pro.

Posted on 2016-08-06 14:03:22

Hey Richard, I finally got a chance to actually run the new version of AE (2015.3) at the office with a GTX 1080 to check on this. I believe what you are looking at is the GPU support for ray tracing only, not the new general GPU support in AE that was just added. My understanding is that ray tracing from AE directly is considered by Adobe to be dated (they stopped development on it years ago) and should be completely removed in a future version. At this point, the new C4D integration is what I believe you should be using instead. If you really need to use the integrated ray tracing and want a supported GPU, you are going to have to use a pretty old card since Adobe's list of supported cards is pretty dated: https://helpx.adobe.com/aft...

For general GPU acceleration, you should enable it in Project Settings under the "Video Render and Effects" at the top. You want to have "Mercury GPU Acceleration (CUDA)" enabled. Be aware that right now, the only things that can use the GPU in AE are Gaussian Blur, Lumetri Color, and Sharpen. Everything else at the moment can only use the CPU. You can read more about this at https://blogs.adobe.com/cre...

As for GPU-Z, you want to switch to the second tab labeled "Sensors". On that tab, there should be an entry for "GPU Load". This is what you want to check to see if the GPU is being used at all. Generally, anything below ~5% I would attribute to simply updating what is displayed on the screen; anything more than that should indicate that the GPU is doing some sort of calculation.

Posted on 2016-08-08 18:52:25
Richard Vala

So finally here's the solution:

It's really simple actually... once you know what's up. :) You have to add the video card to the following config files:

Adobe After Effects:

C:\Program Files\Adobe\Adobe After Effects CC 2015\Support Files
raytracer_supported_cards

Then go into the preferences and make sure the CUDA option is selected the preview and the project settings. Bingo, Adobe AE accepts the card and you are good to go.

For Avid Media Composer:

C:\Program Files\Avid\Avid Media Composer\SupportingFiles\Config
QualifiedGpuBoards

Posted on 2016-08-13 19:55:26
Dragon

The Cuda engine was rebuilt for the 10 series cards and the compiler is still in beta. I have several pieces of software that don't support my 1080 yet for that very reason. Based on that info, it seems this test should be rerun after the compiler is released and Adobe has had a chance to implement the released version. The story may be somewhat different at that point.

Posted on 2016-08-07 16:57:04
'Ainoa Manuia

On the topic of dual GPUs, how do you set one GPU to draw the screen and one dedicated to processing? Just in general, not specific to Premiere Pro

Posted on 2016-08-09 20:59:13

That has to be an option specifically coded into the software. As far as I know, Premiere doesn't have that option so it will simply try to use every GPU installed in the system. Just as an example of what this option may look like in a program that does support it, in Iray (a CUDA-based rendering engine for things like 3d modeling and architecture) the option looks like this: http://help.autodesk.com/cl...

Honestly, I wouldn't expect that sort of functionality in Premiere (or any other Adobe product) anytime in the near future. GPU acceleration is still a relatively new thing for them, and until they have it optimized to the point that it is loading the GPU so heavily it bogs down the rest of the system I don't see them dedicating resources to that kind of feature. I personally would like the option because it would make our testing much easier, but at the moment I just don't see them adding it.

Posted on 2016-08-09 21:13:38
'Ainoa Manuia

Thanks for the example, and the Adobe comment. But in Iray, does the unchecked GPU 1 marked "Used by Windows" mean that it is responsible for drawing the desktop?

Posted on 2016-08-09 22:20:53

In that image, the Quadro FX 1800 is the primary GPU - so you are correct that it is what Windows uses to actually draw what you see on the screen. If you ever do get the option to select which GPUs you want to use, your primary GPU (whether it is labeled in the software as primary or something like "Used by Windows") is the one that you want to have unchecked if you do not want the software to disrupt your ability to use the system for other tasks while the software is under heavy load. You could uncheck others, but typically you would only do that if you have multiple GPU-intensive applications crunching at the same time.

Posted on 2016-08-09 23:12:29
'Ainoa Manuia

Just read up on MicroWay's Tesla comparisons to GeForce cards, which say "TCC allows GPUs to be specifically set to display-only or compute-only modes." This is part of its unique GPU health monitoring and management capabilities. Of course, the cost of a Tesla is a bit much to have that option.

Though a forum post got a Titan X to run TCC on Windows 7 x64 with two GPUs, where one outputs video and the other is the Titan X. I don't know if posting the link counts as spam, so his command was:

C:\Program Files\NVIDIA Corporation\NVSMI

and type

nvidia-smi -g 1 -dm 1

Posted on 2016-08-10 19:53:25
'Ainoa Manuia

Also, on the AMD developer website there is a hard coded option called GPU_DEVICE_ORDINAL which is an environment parameter to block GPU_0 from OpenCL calculations so GPU_1 does all processing

"As AMD Programming User guide (section "Masking Visible Devices" ) says:

'By default, OpenCL applications are exposed to all GPUs installed in the system; this allows applications to use multiple GPUs to run the compute task. In some cases, the user might want to mask the visibility of the GPUs seen by the OpenCL application. One example is to dedicate one GPU for regular graphics operations and the other three (in a four-GPU system) for Compute. To do that, set the GPU_DEVICE_ORDINAL environment parameter, which is a comma-separated list variable:

⁃ Under Windows: set GPU_DEVICE_ORDINAL=1,2,3

⁃ Under Linux: export GPU_DEVICE_ORDINAL=1,2,3'"

Posted on 2016-08-16 06:48:27
HaimanAskari

Great article. However, I still do miss a couple of details when it comes to these random tests made by random forums and sites. And that is setup details. Details as in size of videos, duration, filters and grading details. Also, how did you setup your Premiere pro? What presets? How did you manage to make Pr to run on your gpu?
I have the https://www.techpowerup.com... on https://www.asus.com/ROG-Re..., and my Pr won't even recognize the GPU. My CPU http://ark.intel.com/produc....

This is my complete setup on a completely new PC:

EVGA GeForce GTX 1070 Founders Edition

Intel Core i7-6800K Prosessor

ASUS ROG Strix X99 Gaming, S-2011-3

Crucial Ballistix Sport DDR4 64GB

Corsair Hydro Series H100i v2

Samsung 950 PRO 512GB M.2 PCIe SSD

EVGA GQ 850W Hybrid Modular 80+ PSU

Acer 28" LED CB281HKbmjdprx

Now one would think that with this setup, rendering would be like a walk in a park. But nnnnnnnnnoooooooooooooooooooooooo. Jeeezeeeee no!

Actually not much faster than my 12 year old stock PC.

So, when I read all these articles about how fantastic these cards are and how inhumanly fast they would render, I just get frustrated.
Reasons why I get so frustrated:
1; It's not true! Not in my case at least.
2; It works for everybody else, and they are super duper excited about it. Yes, almost giggly.
3; Nobody is willing to "reveal" the system setup. They just tell us, "hey- we did a test and it all looks great! Here look at these pretty numbers"

Well that's just swell. I'm sure it's all true and that everything works just fine for you. But please, do tell us how you came to these results.
When you put all of these components together, started up your new lightning-fast PC, installed Pr and ran it; what did you do next?
What menus did you go to and what settings did you use, before importing your clip?
What settings did you use for exporting your video?
You get the idea as to where I'm going...

Best regards
Frustrated fella.

Posted on 2016-08-25 08:45:59

Matt (the author of most of these articles) could probably go into more detail, but we do provide a lot of info in the Test Setup section near the start of each one. That talks about where we get our sample files and what sorts of effects we apply. We also show what format the raw video is in and what we are exporting to in each test.

As far as I know there isn't anything "special" we do to Premiere Pro before testing. Back with CS5 and CS6, you had to add your video card to a list of supported cards manually (if it wasn't one of the few already on the list) so that GPU acceleration would work. That is no longer needed with CC, or at least that is my understanding and experience. You are right that your system should be pretty fast, given the specs you've listed. Have you looked at your CPU usage when rendering? That might help. Also, you mentioned that "Pr won't even recognize the GPU" - can you explain what you mean by that, and where you are seeing it?

Posted on 2016-08-25 16:47:55
HaimanAskari

I'm on Pr. CS6 Version 6.0.5.

GTX 1070 is not on the supported GPU list. Hence "won't recognize the GPU". So yes, I did have to add my card manually. Although that opens up the option to render with "Mercury Playback Engine GPU Acceleration (CUDA)", I'm not sure that it actually does just that.

The reason why I think that is because when I render, I monitor my GPU load, and it barely hits 4% load. It's almost like the GPU is not even there or in any way slightly affected by the fact that there's something going on on the PC.

25 minute render time for a 3 minute 1920x822 clip, straight out of the camera. No filters, no color grading. Only change made to the clip is the crop factor. That's it.

I'm sure you understand why I get so frustrated when I read articles like this and see videos on youtube about how people render full 4k videos in the blink of an eye.

Best regards
Haiman

Posted on 2016-08-26 07:50:38

Ah, yes - you are on CS6, and as I mentioned that version had some special hoops you had to jump through in order to use GPUs that Adobe (at the time) had not officially certified / tested. It has been a while since that version was common, though, and I'm not sure if any of the old guides on how to add GPUs to the certified list are still around. Let me look...

Okay, I found it! Look at the "Correct Answer" on this thread, near the top and highlighted in green:

https://forums.adobe.com/th...

That should do the trick... though to be honest, I only know it worked for video cards that came out around the time CS6 was mainstream. I have no idea how well such an old version of Premiere will do with the brand-new GeForce cards. But that is your best bet. If it doesn't work, you could also upgrade to CC.

Posted on 2016-08-26 07:54:09
HaimanAskari

Thank you very much for taking time to answer me.

I have done all that. I think my issue is the old CS6.

Maybe CC will do the trick! :)

Thank you!

Posted on 2016-08-26 09:35:14
HaimanAskari

HOLY MOLY!! CC did the trick! Jeezaloo! That's what I'm talking about. Now I understand those "blink of an eye" render times everybody is talking about.
This thing is so ridiculously fast, it's almost no fun at all :D

Thank you again SOOO MUCH for your help on this!

If you ever need anything, holla at me. I got your back!

Best regards
Haiman

Posted on 2016-08-26 10:25:24

I'm glad that worked out for you! It is also good to know that the older CS6 (and presumably CS5 as well) doesn't play nicely with modern video cards - even using the old tricks to enable non-certified cards.

Posted on 2016-08-26 16:16:33
HaimanAskari

Thank you so much.

You are right. CS6 will not work with newer GPUs. The old trick of adding the GPU manually will only open up the option for "Mercury Playback Engine GPU Acceleration (CUDA)". Other than that, it makes absolutely no difference whatsoever.

Posted on 2016-08-27 13:42:35
SoNic67

So Adobe Premiere doesn't use the GPU at all (only for some effects that usually represent 1% of the time budget) and also probably cannot scale past 12 cores. A CPU usage graph and a GPU-Z one would reveal exactly how much of the hardware is actually used.
Ah, but that would hurt the sales... Yeap.

Posted on 2016-11-21 23:20:42

You are exactly right on the CPU - we pretty much never recommend a CPU with more than 10 cores (the i7 6950X) for Premiere Pro and even then only for those working with 4K+ footage. For the GPU, however, Premiere Pro benefits greatly from using a card for a ton of different tasks. Not only are a ton of major effects accelerated (which usually results in 5-10x better performance compared to using the CPU alone), but even things like down-scaling from 4K to 1080p utilizes the GPU. If you don't use any of these effects https://helpx.adobe.com/pre... then you probably don't need much of a video card, but even using one or two will allow a decent GPU to really improve performance when scrubbing and exporting.

Our newer article: https://www.pugetsystems.co... has a bit more in-depth testing that is better than this article - this one was more about getting benchmark numbers up when the GTX 1070 and 1080 first launched.

As for CPU and GPU utilization graphs, we've considered doing that but it turns out that they are not very good at providing relevant data. Especially in a program like Premiere Pro, you might only see the GPU being loaded to ~20% across all the different models we tested, but still see a huge drop in export times with a more powerful card. Usage graphs can be useful when one component is a major bottleneck, but they actually are not very reliable when trying to determine the difference in performance across different models of video cards or CPUs. For that, straight-up benchmarking is really the only reliable method we have found.

Posted on 2016-11-21 23:43:10
Matthew Temblor

I am trying to choose between an NVIDIA GTX 1080 Founders Edition or a Zotac 1070 8GB
(using with a 6700K processor)
I am not building for gaming. Using for:
  • 2-3 monitors: (1) 4K, (2) 1080p
  • Blue Iris video surveillance system with 6-8 cameras (4 HD)
  • 1080p video editing and occasional 4K video editing
    (YouTube or travel vids using CyberLink PowerDirector 15 Ultra)

Could I run BOTH the surveillance and editing programs at the same time?
I would also like to use a 50" plus TV to watch those edited videos. (although I do not have a 4K big screen yet)
I am a newb on editing 4k and multi monitor
setups.
I will probably run 1 of the 1080p monitors from the motherboard?
Should I consider 2 cards? If I used the Zotac 1070, what would I want to use for the 2nd?

One more question: Is an LG 27EA83R-D 27" LED IPS monitor (2560 x 1440) a good monitor for my use? Or should I be looking for a 4K monitor soon?
Thanks for any advice.

Posted on 2016-11-22 18:56:08
Matthew Temblor

anyone?

Posted on 2016-12-11 21:37:26

I like 2560x1440 personally, but if you want to edit in 4K then you might want to eventually have a 4K capable screen.

The GTX 1070 and 1080 will both handle the three monitor configuration you described. The 1080 is a little faster, if you are using software that will benefit from it, but I don't think Blue Iris does... and I haven't seen any benchmarks of Power Director. Personally, I would scale back on the video card (to the 1070) and spend the money you save on a CPU with more cores. That will help with video editing, and also running both applications at the same time. If you've already committed to the 6700K and/or a motherboard for it, though, you'll be locked in there.

Posted on 2016-12-12 06:38:08
Rostislav Alexandrovich

How about the 1060?
For a laptop i7-6700HQ, wouldn't it perform as well as a 1070? (given that the CPU is the bottleneck in Premiere Pro)

Posted on 2017-01-05 21:47:16

We actually tested the GTX 1060 in our updated article (the link is at the start of the article, or just go to https://www.pugetsystems.co.... The two places you are going to see a performance drop with the GTX 1060 is rendering previews (and live playback) using 1080p footage and exporting with 6K+ media. If live playback isn't a problem with your compositions, the GTX 1060 is probably OK as long as you get a card with 6GB of VRAM. Some GTX 1060 models only have 3GB of VRAM which really isn't enough for anything beyond very basic editing in Premiere Pro.

Something to keep in mind is that with laptops you have to be careful of whether the GPU is a full desktop version stuck in a laptop or if it is actually a mobile version (like the GTX 1060m). The mobile versions of video cards are really different than the desktop models and should really be thought of as one model below their desktop counterparts. So a GTX 1060m will perform more like a desktop GTX 1050.

Posted on 2017-01-05 22:05:36

I wish the Quadro was on here :/

Posted on 2017-01-24 00:42:08
Janis Ventaskrasts

Hello everyone!

I'm about to build my first PC and was just wondering: as I will be building the PC primarily for Photoshop/Lightroom use and am just getting into/want to start learning video editing, is it worth it for me to purchase a GTX 1080, or should I just go with the 1070, as the price difference is 230 Euros?

Posted on 2017-02-09 11:00:37

I think that depends on how deeply you think you will be getting into video editing. Photoshop and Lightroom really won't see a difference between the 1070 and 1080, and Premiere will really only see a significant difference if you are doing more complex projects (4K, lots of effects, etc.). If you are just starting and are concerned about your budget, I would go with the GTX 1070.

Posted on 2017-02-09 21:12:56
Janis Ventaskrasts

Thank you for your suggestion! :)

Posted on 2017-02-19 17:22:49
Mike

I might be missing something here so if someone could please fill me in, when exporting in H.264, why is the export time for 4k>1080 significantly shorter than 1080>1080 in every configuration shown? I find that very odd.

Posted on 2017-03-24 02:25:35

That is weird... I double-checked our test logs, and those results are accurate. I'm not sure why that would be; it must be some weird quirk of Premiere.

Posted on 2017-03-27 17:35:27
J. Peterman

Hi. I'm having some unexpectedly low performance with the MSI GTX 1070 in Premiere 2017. Here's the catch: I have it installed in my old PC with a quad-core Q9550. Could that be it? Playback goes fine, but with a simple effect applied, like reverse speed, it stalls or goes into choppy playback, and multi-camera editing is impossible (and even with no effects applied, playback starts after a second). Do I just have to say goodbye to my PC and get some updates there? A couple of versions of Premiere back, I was able to edit multi-camera with a 660 Ti; that's the weird thing. Thanks everyone.

Posted on 2017-04-01 15:20:22
David Focardi

Hello,

I am looking to upgrade and was looking at some of your workstations. Most of my work is in AE and often requires 3D, utilizing Ray-Tracing features and what not. I've been reading online that the 1060-70-80 series GTX GPUs are not supported yet with all the CUDA etc? Is that correct?

I'd like to get a system with 1 or 2 1070-80s, but am concerned I will just get a black screen and it will not work. Did you run any tests in AE with these cards? Do they work?

Thank you so much.

Posted on 2017-04-07 17:47:37

Hey David, if you are talking about the older "Ray-Traced 3D", you have read correctly that newer video cards are not supported (we have confirmed this ourselves as well). We have some information on that on our After Effects workstation page https://www.pugetsystems.co... but the short of it is that Adobe has pretty much discontinued development of that ray-tracing engine and replaced it with the newer integration with Cinema 4D. Any NVIDIA GPU that is a 700 series or newer probably won't work, and I wouldn't be surprised if Adobe completely removes the old ray-tracer in the near future. So I expect that you (and everyone else) will be forced to move to the C4D ray-tracer at some point.

Luckily, using the new Cinema 4D ray-tracer should simply be a matter of selecting it in the composition settings: https://helpx.adobe.com/aft... . One thing to note is that this is a CPU-based rendering engine so a more powerful video card won't do anything for performance.

If you've been looking at our systems, I really encourage you to give us a call at (425) 458-0273 . Our consultants are excellent, not on commission (so no pressure) and should be able to answer any other concerns or questions you might have.

Posted on 2017-04-07 18:10:26

Piggy-backing on Matt's answer, I wanted to add some more info about video card selection - not specifically related to 3D work, but other aspects of After Effects.

Only a few things in After Effects are GPU accelerated yet, but the effects which are see a huge benefit from having a dedicated video card in the 1060-1080 range. There is very little difference between those card models, though, in terms of After Effects performance. Here is an article on the topic:

https://www.pugetsystems.co...

Posted on 2017-04-07 18:23:16
Luiz Silva

I'm rendering 1080p videos but I see no difference when I turn on the CUDA option. I'm running Premiere Pro CC 2017 and my rig has a 1070.
I'm not adding effects or anything, just doing ordinary stuff like cutting out unwanted parts and transcoding to H.264.
Both the GPU drivers and Premiere Pro are up to date, but GPU usage won't go above 2% (which means the program is not using it at all while rendering). Please help me; I don't know what else to try. I want to use my GPU to speed up the process. Thanks!

Posted on 2017-04-15 08:51:59
Alexander Varga

best vid card is hd4600 1min in 30sec.

Posted on 2017-10-23 18:39:23
KC Blake

I'd love to see an update to this article especially since the release of the new 1070ti. For Premiere/After Effects, is it better to buy two 1070ti's or one 1080ti?

Posted on 2017-11-10 19:52:21

We have a number of newer articles available, but the latest is this one: https://www.pugetsystems.co.... It doesn't include the 1070ti, but you can get a pretty good idea since the performance delta between the 1070 and 1080 is fairly small. We are due for another update which we will likely include the 1070ti on, but I'm not sure when that will be.

As far as multiple video cards, we haven't seen any benefit once you get above a GTX 1060 or so, and even with low-end cards the benefit is very small. So it should be much better all around to get a single GTX 1080 Ti instead of multiple GTX 1070 Tis.

Posted on 2017-11-10 20:07:39