Read this article at https://www.pugetsystems.com/guides/1211

Video: H.264 Hardware Acceleration in Adobe Media Encoder - Good or Bad?

Written on August 2, 2018 by Matt Bach

Video Summary

In the update notes for Media Encoder and Premiere Pro CC 2018, Adobe lists support for "hardware accelerated" encoding when exporting to H.264 and H.265. This feature utilizes the Quicksync functionality built into some Intel CPUs to dramatically decrease the time it takes to export to these codecs. At first glance, this feature provides a huge boost in performance for many users, but unfortunately it is not perfect.

The first issue is that it requires a CPU that supports Quicksync, and you also have to make sure the onboard graphics on your motherboard is enabled. You do not need a monitor plugged into the onboard graphics; it just needs to be on so that Quicksync can be active. The problem is that most of the best CPUs for Premiere Pro and Media Encoder (the Intel X-series) do not support Quicksync, so many high-end professionals simply won't be able to take advantage of it.
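If you want to see the same software-versus-hardware trade-off outside of Adobe's applications, here is a minimal sketch using ffmpeg's libx264 (software) and h264_qsv (Quicksync) encoders. It assumes an ffmpeg build with QSV support and a Quicksync-capable CPU; the file names and 10 Mbps target are hypothetical stand-ins, not Adobe's actual presets or pipeline.

```python
# A sketch, not Adobe's pipeline: encode the same source with the software
# (libx264) and Quicksync (h264_qsv) H.264 encoders at the same target bitrate.
import subprocess

SOURCE = "input.mov"  # hypothetical source clip
TARGET = "10M"        # hypothetical target bitrate

def encode(encoder: str, output: str) -> None:
    """Encode SOURCE to H.264 with the given encoder at the TARGET bitrate."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE,
         "-c:v", encoder, "-b:v", TARGET,
         "-an", output],
        check=True,
    )

encode("libx264", "out_software.mp4")   # "Software Only" equivalent
encode("h264_qsv", "out_hardware.mp4")  # Quicksync hardware encode
```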

The second issue with hardware accelerated encoding is that it does not match the quality of the normal "Software Only" mode. In fact, in our testing we found that the target bitrate was not always being met when using hardware acceleration, and even when it was, the quality was still lower than in "Software Only" mode. The odd thing was that when the bitrate was matched, the time to export was identical in both modes - so we only saw a performance gain when the final exported file was at a lower actual bitrate.
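Whether the target bitrate was actually met is easy to verify after the fact. As a rough sketch (assuming ffprobe is on your PATH; the file name is hypothetical), the container-reported average bitrate can be compared against the preset's target:

```python
# A sketch for checking whether an export actually hit its target bitrate.
import json
import subprocess

def actual_bitrate_mbps(path: str) -> float:
    """Return the container-reported average bitrate of a file, in Mbps."""
    result = subprocess.run(
        ["ffprobe", "-v", "error",
         "-show_entries", "format=bit_rate",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    )
    return int(json.loads(result.stdout)["format"]["bit_rate"]) / 1e6

# Hypothetical file name; compare the result against the preset's target.
print(actual_bitrate_mbps("out_hardware.mp4"))
```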

If you are simply looking for the fastest encoding times (proxy generation, etc.) for H.264/H.265, hardware acceleration is a pretty neat and useful feature. If quality is a concern, however, the time savings may not be worth the drop in quality unless you are on a tight deadline.

H.264 Match Source - High Bitrate

 

Software Only
33 Sec export - 23.8MB exported file

Hardware Accelerated
33 Sec export - 23.9MB exported file

H.264 High Quality 2160p 4K

Software Only
58 Sec export - 183.4MB exported file

Hardware Accelerated
31 Sec export - 34.4MB exported file


Tags: Media Encoder, Quicksync, Hardware Acceleration, H.264

I cannot believe Adobe never warned about this. So ridiculously unprofessional. Nice findings, Puget - guess I'll go and disable this.

Posted on 2018-08-02 18:23:18

I wouldn't jump to conclusions so quickly; encoding hardware acceleration is an extremely valuable feature that Adobe UNDER-played at launch. These results are not comprehensive, though they are a helpful start - and besides the time factor (are you in a rush or not?), another important factor is simply whether or not you're pixel peeping down to that level. Whether anybody would ever notice is an unavoidable question.

And this focuses in on codec encoding acceleration, even though similar nominal degradation from hardware acceleration has long loomed over the usual kind of GPU hardware acceleration that Puget Systems never criticizes, yet always incorporates into its tests. Put another way, there's widely discussed prior evidence -- again, of tiny differences -- showing that GPUs encode worse than software-only.

If there's any work to be done first, it's the target bitrate issue that appears to be a bug in the Adobe encoding hardware acceleration design. By getting the target bitrate right, these apples-to-oranges comparisons would become more apples-to-apples: comparing hardware versus software codec encoding with the same RESULTANT bitrate.

More here: www.focuspulling.com/cc2018

Posted on 2018-08-02 18:34:49

Sure, I can see this feature being useful. I'm mostly just disappointed that Adobe never acknowledged this in the patch notes or in Media Encoder's UI itself. Without prior knowledge of this, it might appear as if you should expect the same quality from both options. Personally, I don't mind waiting a bit longer for better quality at a similar bitrate and file size. I mostly render After Effects projects, so I don't think I'm affected by faster H.264 encoding as much as Premiere users are.

Posted on 2018-08-02 20:13:37

I just want to point out that the "H.264 Match Source - High Bitrate" preset did actually use the correct bitrate in both "Software Only" and "Hardware Accelerated" mode. So that test is definitely an apples-to-apples comparison, and the quality difference is pretty easy to see. Interestingly, that is also the one where we didn't see any performance gain at all. It makes me wonder how much of the performance gain people talk about is due to Media Encoder not setting the bitrate correctly, so they are unknowingly comparing two completely different export settings.

Posted on 2018-08-02 20:18:19

Good point; I'm getting different results than yours, though: a high target bitrate that's equivalent between hardware and software codec encoding, but vastly faster encoding in hardware mode. Pixel peeping down to magna-zoom levels can vary greatly depending on the footage, but that's where I'm headed next. I don't think it's fair to be conclusive at this point -- yet Adobe itself doesn't seem to care either, having totally buried this feature launch (purposefully?).

Posted on 2018-08-02 20:41:42
DeicideHD

Quicksync also doesn't stand on its own two feet next to AME or NVENC offerings in hardware encoding. So the entire concept of hardware support is still amazing, as you suggested.

Posted on 2019-03-05 11:56:22
npcomplete

It is something that people who are familiar with encoding have known for a very long time. Have you ever looked at the doom9.org forums?
Anyone who uses x264 / x265 / ffmpeg knows the hardware encoders are crap. Quicksync is better than NVENC and AMD's encoder, but all three hardware encoders are total crap, and they get worse at lower bitrates, especially for difficult scenes. Unfortunately, very few professional videographers and editors realize this.

Posted on 2018-08-19 16:38:59

You're being unreasonably (and transparently) dramatic, proclaiming that few of us know this and that everyone in the world would compare the differences and conclude that hardware-encoded exports "are total crap." Even the flawed pixel-peeping referenced in this article is hardly enough reason to forgo the dramatic speed improvements in high-productivity environments. The differences narrow significantly when you use common codecs like XAVC or other typical compression formats that already compromise image quality -- normal folks live in a world of trade-offs.

Posted on 2018-08-19 19:56:47
pongobyte

One can only speculate as to why Adobe released this now while at the same time keeping it under wraps. This seems like a significant change indeed, and the performance gain cannot really go unnoticed, bar the bitrate settings.
Talking about bitrate: while I can understand the quality loss due to hardware encoding, be it QuickSync or NVENC, I'm not sure about the results for the Match Source test targeting 10Mbps - that is usually an HD quality preset. These being 4K exports, isn't the target supposed to be (a lot) higher?
There is a video from the guys at GamersNexus testing a 1080p video, and same as here, HW encoding time is almost identical to SW-only encoding - the thing is, that was a simple timeline with no effects. You are testing with RED footage rather than 1080p; does this point to QuickSync being more efficient at higher bitrates, regardless of resolution?

GN also makes the point that the digital artifacts / loss of quality may not be so important for some people, and this feature can potentially make a significant impact on the decision to purchase a certain hardware configuration - for example, the 8700K gets a lot closer to the Intel X-series and Threadripper, if only in terms of exports.

On another point: with Nvidia, CUDA of course still plays a role here, and not just for encoding. Have you noticed any changes in dGPU utilization when the iGPU is enabled as well, either during exports or regular usage in Premiere? They may seem like two resources competing for the same task, so it'd be interesting to gauge their impact on each other.

Lastly, I wonder if the likes of FCP, Vegas, or other NLEs have this QuickSync acceleration feature enabled already.

Thank you again.

Posted on 2018-08-03 03:23:20

GN? No idea what that is.

Posted on 2018-08-03 04:56:22
maratropa

As far as I know, FCP uses Quicksync for decode & encode on Macs to achieve max speed on lower-end CPUs.

Posted on 2018-08-06 08:40:04
npcomplete

For Nvidia, the default and preferred method is to use NVENC, which is the fixed-function encoder block and does not use CUDA at all. The fully accelerated encoders from Nvidia, Intel, and AMD do not use the GPU's shaders; encoding is relegated to the fixed-function encoding block.

Posted on 2018-08-19 16:41:18
RandomBanana

A couple considerations:
1) Did you compare more than just the first frame of the video to see if the quality delta exists throughout the clip? In my own testing, I found that comparing a single frame can be misleading with these Long GOP codecs. Frame to frame, the winner can flip back and forth throughout the clip (see the sketch after this list).

2) For your second test, you seem to have encountered a bug with QuickSync encoding at high bitrates. If you choose a preset such as YouTube 4K, which targets 40Mbps, you'll see that the exported file is much larger and better in quality. After some quick googling, there might be a bug/artificial limit of ~65Mbps with QS, but I haven't tested that myself.
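On point 1, a per-frame comparison is straightforward to script. Here is a minimal sketch (assuming ffmpeg is on your PATH; file names are hypothetical) that uses ffmpeg's ssim filter to score every frame of an export against the source, so the quality delta can be tracked across the whole clip rather than at a single frame:

```python
# A sketch: per-frame SSIM of an export against its source via ffmpeg's
# ssim filter, so quality can be compared across the clip, not one frame.
import subprocess

def per_frame_ssim(distorted: str, reference: str, log: str) -> list[float]:
    """Run ffmpeg's ssim filter and return the per-frame 'All' SSIM scores."""
    subprocess.run(
        ["ffmpeg", "-i", distorted, "-i", reference,
         "-lavfi", f"ssim=stats_file={log}", "-f", "null", "-"],
        check=True, capture_output=True,
    )
    scores = []
    with open(log) as f:
        # Each line looks like: n:1 Y:0.98 U:0.97 V:0.97 All:0.98 (17.2)
        for line in f:
            field = next(p for p in line.split() if p.startswith("All:"))
            scores.append(float(field.split(":")[1]))
    return scores

# Hypothetical exports made from the same source clip.
sw = per_frame_ssim("out_software.mp4", "input.mov", "sw.log")
hw = per_frame_ssim("out_hardware.mp4", "input.mov", "hw.log")
print(sum(s > h for s, h in zip(sw, hw)), "of", len(sw), "frames favor software")
```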

Posted on 2018-08-05 20:52:52
mclaren777

I hope someone from Puget Systems will reply to your points.

They both might have merit.

Posted on 2018-08-16 05:19:03
Lars Emanuelsson

Still no answer. Maybe a bitrate under 65Mbps did the trick?
I'm about to build a 4K video editing PC and am trying to find more information about this, since it would save a lot of money on the CPU.
I can't find any other report about quality loss when using iGPU hardware encoding, so I guess the example in this article was just bad luck.

Posted on 2018-10-31 21:17:30

Personally I'm resolved to completely ignore these results and keep on the lookout for any other evidence. So far, nothing besides microscopic pixel-peeping.

Posted on 2018-10-31 21:19:43
Amalkumar

Ryzen or Intel - which is best for Premiere Pro CC?

Please recommend the best CPU for editing; my PC rig with a Ryzen 1700 is lagging with HD footage.
Please help me!

Posted on 2018-08-10 14:41:46
Nate Porter

I'd love to see an analysis of DaVinci Resolve's hardware acceleration for encoding, now that it supports it as of 15.2.

Posted on 2018-12-04 16:04:03
amanieux

Does it make any difference when you export for YouTube, since YouTube re-encodes videos at even lower bitrates? Has anyone downloaded YouTube videos to check bitrates?

Posted on 2019-01-07 10:48:44

I'm actually working on an update to this right now, since the latest versions of Premiere Pro and Media Encoder are supposed to have improvements for H.264 Quicksync hardware acceleration. As a part of it, I was going to do a Youtube upload and grab screenshots directly from Youtube to compare. As long as I don't run into a complication (or unless the updates made the quality loss smaller than it was in this testing), hopefully that will work.

Posted on 2019-01-08 21:06:25
Ian Green

I have Handbrake, which is open source, and it can encode anything on hand fairly well.

Posted on 2019-02-28 03:27:30
J Dor

In light of the new Adobe 2019 NAB update, has this issue been resolved, or is hardware acceleration still much poorer, not respecting the target bitrate, etc.?

Posted on 2019-04-04 07:31:43

Hardware acceleration when rendering H.265 and H.264 has never been "much poorer," and this flawed comparison is indeed quite outdated now.

Posted on 2019-04-04 16:52:56

We have found that much of the quality issue with Hardware Accelerated encoding is due to it being limited to ~60Mbps. If you try to set it higher than that (which the "H.264 High Quality 2160p 4K" preset does), Premiere Pro and Media Encoder don't follow the target bitrate and instead revert to a very low bitrate. This explains both the low quality and the faster performance found in our testing.

From everything I've tried, it looks like at least in Premiere Pro and AME, hardware accelerated encoding isn't that much faster if you have a relatively high-end CPU like a Core i9. Lower-end CPUs should still see a decent performance benefit, however.

Posted on 2019-04-04 17:03:22

Appreciate this updated information, if it is not tentative and is proven by lab results -- but 60 Mbps is a very high threshold for the alleged problem, one that implicitly defies the whole point of using H.264 and especially H.265, codecs designed for portability and efficiency. Almost all of the UHD presets offered by Adobe (e.g., Vimeo) do not come near 60 Mbps (e.g., 48 Mbps), and that is the real-world use case, where hardware acceleration is warmly welcomed and proven to work well.

Posted on 2019-04-04 17:12:01
Novgorod

This issue is a well-known QuickSync "bug" found at least 5 years ago: https://software.intel.com/...
The bit rate is limited to ~65MBit/s because some SDK dev had the foresight to store the bit rate (in kbps) as a U16 variable - I guess 2 extra bytes would have blown the budget in 2011. So naturally the bit rate is limited to 65536kbit/s, and a higher value will simply overflow it (mod 2^16). There seems to be a workaround using some multiplier, but I doubt Adobe has figured this out by now, as it's only been 5 years or so. It also seems to depend on the actual CPU (Haswell, Skylake, Kaby Lake, etc.), so the implementation has to take care of that. QuickSync is intended for real-time encoding on the cheap, especially for live streamers, so it's a bit of an abuse to use it in professional exports.
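To illustrate the suspected wrap-around, here is a toy example (the 80,000 kbps figure is a hypothetical high target, not a confirmed Adobe preset value):

```python
# Toy illustration: a kbps target stored in an unsigned 16-bit field wraps
# at 2^16 = 65536, so anything above ~65.5 Mbit/s silently becomes tiny.
requested_kbps = 80_000               # hypothetical high-bitrate 4K target
stored_kbps = requested_kbps % 2**16  # what a U16 field would actually hold
print(stored_kbps)                    # 14464 -> ~14.5 Mbit/s, far below target
```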

Nonetheless, I actually wouldn't mind some loss in image quality at the same bit rate if it's significantly faster. On my "ancient" 6700K I get at least a 4x speed increase over software encoding, so I actually would want to use it. But I also like to offset the loss in quality by using a higher bit rate and the 65MBit/s limit is a pain in the dong, especially since it's purely artificial.

Posted on 2019-06-18 13:28:48

In nearly every use case -- and in practical reality, subject to A/B tests of moving video -- there is no real "loss in image quality" via QuickSync. Puget strangely got motivated to discourage using QuickSync rather than to encourage it (hard to explain unless it's in their financial interests to do so -- when it comes to 4x speed as you mentioned, I guess it damages those small fringe performance gains at the tops of product lines where profit margins are best).

Posted on 2019-06-18 16:02:07
Novgorod

Then why use 4K if 1080p looks just fine? Everybody knows the pros and cons and should be mature enough to decide on the best trade-off between quality and practicality. Also, everyone is aware of the diminishing returns at the "ultra-high-end", i.e. of course you don't gain 4x the quality for 4x the encoding time - but that's completely beside the point of this discussion. We're talking about an actual bug, or at least a lazy implementation (thanks, Adobe & Intel!), which takes away this choice from us.

Apparently QuickSync was never meant for "professional"-grade encoding (though it can very well be sufficient with a high enough bit rate), so the manufacturers don't really care and the "support" for it was just to throw us a bone. Your rage should be with Intel/Adobe, not with Puget, buddy :)...

Posted on 2019-06-18 16:34:39

There is no conceivable/hypothetical (let alone realistic) scenario where that mystic "professional" tier of quality is even perceptible in moving images - and as these comments bore out, the subject test was flawed, and limited in its relevance because of the chosen acquisition codec, etc. That said, I agree that there's improvement in store in lifting the bitrate ceiling; I'm still stuck on a 6700K like you, since Intel (essentially out of greed) has stalled at that architecture ever since. Sadly, this will all become irrelevant to me as of 7/7, when I abandon Intel (and this QuickSync option) for the first time ever and move along to AMD's Zen 2. Puget would do well to consider shifting its priorities to the emergent best CPU becoming available...and it ain't Intel.

Posted on 2019-06-18 16:41:43
Novgorod

If you mean this video, of course it's flawed and borderline unscientific, because it not only doesn't mention the bit rate bug but compares quality between different bit rates. The low bit rate comparison (10MBit/s) was correct, though, and the hardware encoder is noticeably worse - but that's not a realistic use case, since even Youtube uses 40MBit/s for 4K. At such a low bit rate (for the resolution), the compression artefacts become very nasty. The difference in quality is likely less severe at a bit rate suitable for 4K, and they didn't show that.

As for "professional" grade quality, there is always an argument and use case to go for more. If you just go by mainstream perception as you seem to suggest, then noone needs anything more than 1080p @ 10MBit/s or so, as I already mentioned. Modern cameras (even "advanced" hobby grade) record 4K 30fps at 100MBit/s and more for a reason and you don't want to degrade it in the editing process more than necessary. I like to keep a "master" copy exported at the bit rate of the source just in case (e.g. if it's something less important, I tend to get rid of the source footage after a while).

Posted on 2019-06-18 17:14:11

These are good points, Novgorod, agreed. And I am only beating back against the conceit expressed here that the hypothetical "professional" would not use QuickSync acceleration (and that there's any credible quality difference), while the specification examples you offer are radically more dispositive and accountable.

Posted on 2019-06-18 18:20:34

I debated leaving this comment thread alone since these types of discussions often devolve quickly, but I do want to set a few things straight:

First, it is definitely not in our financial best interest to discourage hardware accelerated encoding. The only way that could be true would be if our customers had unlimited budgets, but I can assure you that that is rarely the case. If a customer has a budget of X, it is in our best interest to get them the best possible system within that budget. Moving them from something like an i9 9900K to an X-series CPU simply means they have less of their budget left for more RAM, storage, or something else. So even ignoring the fact that our #1 goal is to give our customers the best product and experience possible, there is really no financial incentive to push someone toward a higher-end CPU that isn't going to do anything for them.

That is one of the reasons I'm very excited about AMD's new CPUs that are coming out. We do a lot of testing and validation to determine what brands and models perform best in the real world for each of the software packages we target and it really doesn't matter to us if it is an Intel CPU or an AMD one. We're partners with both companies, so it truly just comes down to which gives the end user the best performance for their budget.

This post really wasn't intended to be a massive, all-inclusive analysis of software vs hardware encoding - it was just to demonstrate that there is a quality difference between the two. Whether it is enough of a difference to matter is 100% up to you. Everyone is going to have different opinions and priorities, and that is completely fine. I do want to re-do this testing at some point, since we are now aware of the bitrate limit, and I want to include a video with more motion to see if that affects anything.

Posted on 2019-06-18 17:17:55
Myke Scaffidi

I would be very interested to read your updated findings. I am trying to decide between the 9900K and the 3900X for a system build that I would like to begin as soon as more of the facts are in regarding the AMD offering. I frequently ingest and export H.264 and H.265 footage; however, I am concerned about the potential quality degradation using QuickSync. Hardware acceleration for H.264 was one of the reasons cited for choosing Intel in the 2019 Premiere Pro CPU comparison that was published yesterday. Have QuickSync quality concerns become less significant with the release of the 9900K?

Posted on 2019-07-23 17:10:25

For playback, I don't think the quality loss is important (honestly not even sure if there is a quality loss at all for that). So if you are working directly with H.264 media, that alone makes Intel a very strong option. For exporting, it really depends on where it is going IMO. Everyone's level of what is good enough is going to be different, but if your footage doesn't have a lot of movement (interview style or similar) and is going on Youtube, I think using hardware acceleration is going to be OK.

I also really want to do a bit more in-depth analysis of this as well. Ideally, I would love to take a clip exported using hardware acceleration and compare it to a range of bitrates without hardware acceleration to figure out what the "equivalent" quality is. I suspect that will change based on how complex the footage is (movement and patterns will probably make hardware accelerated quality worse), however, which means it will probably explode into a much larger project than it should be.
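As a rough sketch of what that "equivalent bitrate" search could look like (assuming ffmpeg on the PATH; the file names and bitrate ladder are hypothetical, not Puget's actual methodology): encode the source at several software bitrates, score each against the source, and see where the hardware export's score lands.

```python
# Sketch: find the software bitrate whose SSIM matches a hardware export.
import re
import subprocess

def overall_ssim(distorted: str, reference: str) -> float:
    """Return the overall SSIM that ffmpeg's ssim filter reports on stderr."""
    result = subprocess.run(
        ["ffmpeg", "-i", distorted, "-i", reference,
         "-lavfi", "ssim", "-f", "null", "-"],
        capture_output=True, text=True, check=True,
    )
    # Summary line looks like: "SSIM Y:... U:... V:... All:0.987654 (19.2)"
    return float(re.search(r"All:(\d+\.\d+)", result.stderr).group(1))

SOURCE = "input.mov"  # hypothetical source clip
hw_score = overall_ssim("out_hardware.mp4", SOURCE)

for mbps in (10, 20, 30, 40, 50):  # hypothetical software bitrate ladder
    out = f"sw_{mbps}mbps.mp4"
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE, "-c:v", "libx264",
         "-b:v", f"{mbps}M", "-an", out],
        check=True, capture_output=True,
    )
    print(f"{mbps} Mbps software SSIM: {overall_ssim(out, SOURCE):.4f} "
          f"(hardware: {hw_score:.4f})")
```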

It's on the to-do list, but I honestly couldn't give you an accurate ETA for when I might be able to get around to it.

Posted on 2019-07-23 17:21:27