
What H.264/H.265 Hardware Decoding is Supported in DaVinci Resolve Studio?

Written on November 24, 2021 by Matt Bach

Introduction

The Studio version of DaVinci Resolve has long been able to utilize hardware decoding of H.264 and H.265 (HEVC) media to greatly improve performance, but unfortunately, not all types of H.264 and H.265 media are supported. Beyond the codec itself, the bit depth (8-bit, 10-bit, etc.) and chroma subsampling (4:2:0, 4:2:2, 4:4:4) of your media, as well as the hardware capabilities of your system, determine whether you will be able to utilize hardware decoding.

We were unable to find complete documentation on which H.264/5 variants have hardware decoding support in DaVinci Resolve Studio, so we decided to do our own testing to find out. If you want to test your own system, we also provide instructions and a media download in the Run this Test on your System (Windows) section.

If you are looking for similar information for Premiere Pro, check out our What H.264/H.265 Hardware Decoding is Supported in Premiere Pro? article.

Hardware Decoding Support in DaVinci Resolve Studio

[Chart: H.264 hardware decoding support. Rows: 8-bit 4:2:0, 8-bit 4:2:2, 8-bit 4:4:4, 10-bit 4:2:0, 10-bit 4:2:2, 10-bit 4:4:4. Columns: AMD Radeon 5000, NVIDIA GTX 1000, NVIDIA RTX 2000, NVIDIA RTX 3000, Intel Quick Sync 10th Gen, Intel Quick Sync 11th Gen, Intel Quick Sync 12th Gen.]

[Chart: H.265 (HEVC) hardware decoding support. Rows: 8-bit, 10-bit, and 12-bit, each at 4:2:0, 4:2:2, and 4:4:4. Columns: same hardware as above.]

Determining Your H.264/5 Media Type

If you are not sure what bit depth or chroma subsampling your media uses, the easiest and most accurate way to find out is to install a program called MediaInfo. Note: you will typically need to switch to the "Tree" or another detailed view to see this information.

[Screenshot: MediaInfo's detailed view showing bit depth and chroma subsampling]
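If you prefer the command line, the MediaInfo CLI can report just these two fields. A minimal sketch, assuming the mediainfo command-line tool is installed and your clip is named clip.mp4 (an illustrative filename):

    :: Prints something like "10-bit 4:2:2" for the clip's video stream
    mediainfo --Inform="Video;%BitDepth%-bit %ChromaSubsampling%" clip.mp4

The output maps directly onto the rows of the charts above.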

Run this Test on your System (Windows)

If you want to test your own system to see which flavors of H.264/H.265 it is able to use hardware decoding for, you can download our test assets:

Instructions:

  1. Download and unzip the test assets using the link above
  2. Run "Transcode.bat" to generate the various flavors of H.264 and H.265 (a sketch of the kind of commands it runs follows this list)
  3. Launch DaVinci Resolve and create a temporary new project
  4. Ensure hardware decoding is enabled in the preferences through "DaVinci Resolve->Preferences->Decode Options". If you change this setting, be sure to restart DaVinci Resolve
  5. Import the test clips into DaVinci Resolve
  6. Open Task Manager by right-clicking on the Windows task bar and selecting "Task Manager"
  7. Expand Task Manager by clicking "More Details", then switch to the "Performance" tab
  8. Open a clip in the "Media" tab for preview and hit play. You will want to have looping enabled since the clips are relatively short
  9. Check Task Manager to see whether the GPU/Quick Sync is being used for decoding. Note that AMD GPUs do not report this data directly, so you have to infer whether hardware decoding is being used from the CPU load
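For reference, all the batch file does is transcode the source clip into each bit depth and chroma subsampling combination. A minimal sketch of two such steps, assuming ffmpeg is on your PATH and the source clip is named source.mp4 (both filenames are illustrative, not necessarily the ones in the download):

    :: H.264 8-bit 4:2:0 via libx264
    ffmpeg -i source.mp4 -c:v libx264 -pix_fmt yuv420p -c:a copy h264_8-bit_420.mp4
    :: H.265 10-bit 4:2:2 via libx265 (requires a 10-bit-capable ffmpeg build)
    ffmpeg -i source.mp4 -c:v libx265 -pix_fmt yuv422p10le -c:a copy h265_10-bit_422.mp4

The -pix_fmt flag is what selects the bit depth and chroma subsampling, so the full set of test clips is just this command repeated with different pixel formats.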

Tags: hardware decoding, GPU decoding, Quick Sync, DaVinci Resolve
Asaf Blasberg

This is an incredibly important article; thank you so much, Matt. What about on the Mac M1 side? Do you know?

Posted on 2021-04-16 20:21:51
Asaf Blasberg

Also, it is incredibly disappointing that a super expensive RTX 3000 series card cannot decode 4:2:2 with acceleration but Intel's integrated graphics can :(

Posted on 2021-04-16 20:23:05
disqus_qQoIjmB06Y

Any RTX 2000 or 3000 series Nvidia cards should be able to do that just fine. See https://en.wikipedia.org/wi...

Posted on 2021-07-04 20:24:19

Unfortunately, no NVIDIA cards support H.264/5 4:2:2. The Wiki page you linked doesn't include 4:2:2 at all, and the official NVIDIA NVDEC matrix also doesn't include it, but we have checked directly with NVIDIA and 4:2:2 is not supported at this time.

I certainly wish they would add support since 4:2:2 tends to be the biggest source of performance issues in my experience. Well, that and variable bit rate footage.
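If you want to sanity-check this outside of Resolve, one option is to force ffmpeg's NVDEC-backed decoder, which errors out rather than silently falling back to software when a format is unsupported. A minimal sketch, assuming an ffmpeg build with NVDEC (cuvid) support and a test clip named clip.mp4 (an illustrative name):

    :: Force the NVDEC HEVC decoder and discard the output; a 4:2:2 clip should fail here
    ffmpeg -c:v hevc_cuvid -i clip.mp4 -f null -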

Posted on 2021-07-06 16:45:36
disqus_qQoIjmB06Y

Thanks for the reply. I'll take your word for it that you contacted Nvidia, but it seems really odd to me that 4th and 5th generation cards can decode 8-bit, 10-bit, and 12-bit 4:2:0, and 4:4:4, but not 4:2:2. Surely, if it can decode H.265 12-bit 4:4:4, then it can decode 4:2:2? Right? Here are my references: https://developer.nvidia.co... AND https://videocardz.com/newz...

I have a GTX 1070, and DaVinci Resolve Studio 17 is handling my footage OK, but outside of DaVinci, every other playback app chokes on my 10-bit H.265 4:2:2 footage, even VLC. I am running an 8-core i7-5960X at 4 GHz, with 64GB RAM and a fast M.2 Samsung SSD on Ubuntu 20.04 LTS. So I am looking to upgrade to a video card that has the correct support, and NVIDIA cards have rock solid drivers for Linux.

How do AMD cards fare in this regard, do you know? Thanks.

Posted on 2021-07-06 19:43:37
Arex Li

If you edit 4:2:2 HEVC footage from an EOS R5 or A7S III directly, you will see how terrible it is without native 4:2:2 hardware decoding support. Still, you may use proxies as a traditional workaround.

Here, the M1 Macs and even the iPhone 8 are way better. Surprisingly, the iPhone 8 supports native 4:2:2 HEVC hardware decoding. For simple editing with just trimming, LumaFusion on an iPhone 8 can be much more responsive and render much faster than my Xeon E5-2699 v4 with an RTX 3080 Ti. Without heavy GPU effects, any M1 Mac is faster than most PCs in DaVinci Resolve.

Posted on 2021-12-08 02:08:17
The F

RDNA2?

Posted on 2021-04-18 17:07:38
Casey

Does anybody know what the AMD 6000 series can do?

Posted on 2021-09-01 01:34:39
NASSOST PHOTO

If someone has a Ryzen 5900x and a 1660Ti 6GB is it better to disable GPU decoding in Resolve?

Posted on 2021-04-19 11:50:55

In most cases, probably not. The exception may be if you use multicam timelines, since sometimes that is too much for the GPU and you will get better playback using software decoding through the CPU. It seems to depend on the footage quite a bit, however, so it would require a bit of experimenting to find out.

The big thing with hardware decoding isn't so much the normal playback, but things like reverse playback, 2x+ playback, and scrubbing. Those are generally terrible with software decoding, but much better with hardware decoding.

Posted on 2021-04-19 17:37:00
1meter60

Thank you for the test. Just a quick note:
Your ZIP file with the test file and the Transcode.bat has an error, because in the Transcode.bat you refer to a video file called "h265_..." but in your zip file the video file is called "h264_...".
So you can simply rename the video file to match and the transcoding works.
Now I will test my system.

Posted on 2021-04-23 18:38:51
1meter60

My results on my PC with an Intel i7-8700K and an AMD Radeon VII GPU:
Intel Quick Sync 8th generation has the same results as the 10th generation.
AMD Radeon VII has the same results as the Radeon 5000.

Scrubbing on my PC is smoother with Quick Sync. Also, the Radeon VII sometimes produces blocky artifacts in playback or scrubbing.

Side notes: In the Resolve menu I can activate both GPUs for decoding at the same time; in this case Resolve always uses the AMD GPU.
And my AMD Radeon VII does show the video decoding work in Task Manager (I use the Pro drivers, if that matters).
For hardware encoding, the Intel GPU gives much better results on detailed images than the AMD GPU, and the Intel GPU also offers H.265 10-bit, which the AMD GPU lacks.

So I prefer the Intel internal GPU for hardware decoding and encoding over the AMD Radeon VII GPU.

Now it would be interesting to see which hardware is better at decoding for playback and scrubbing: the new 11th Gen Intel CPUs with fewer CPU cores but with Quick Sync, or the AMD CPUs which have many more CPU cores but no GPU hardware decoding. If you can test this, it would be awesome!

Posted on 2021-04-23 19:53:42
1meter60

Resolutions: We also have to consider the maximum resolutions the GPUs support.
My 8th Gen Intel integrated GPU and the AMD Radeon VII decode 10-bit 4:2:0 material only at resolutions up to UHD and C4K.
The new 5.9K resolution from my Panasonic S1 camera with H.265 in 10-bit 4:2:0 cannot be hardware decoded.

Does anybody know if 11th Gen Intel Quick Sync supports resolutions up to 8K for H.265?

Posted on 2021-05-07 16:45:09
Far Middle

So 10-bit 4:2:0 is it for the fastest render times?

Posted on 2021-04-29 18:00:20
Giulio Dallatorre

It results in the best scrubbing performance.

Posted on 2021-07-07 14:21:53
Ampere

https://twitter.com/Blackma...
https://forum.blackmagicdes...

Support for decoding AV1 clips on Windows.
Accelerated AV1 decodes on supported Intel, NVIDIA and AMD platforms.

Finally, AV1 codec support in DaVinci Resolve.

Posted on 2021-05-12 17:44:50

I saw that too! Nice to see, but I do wonder how many people will actually use it. As far as I know, there are no cameras available that record natively to AV1, although I suppose some people might use it as an intermediate codec for some reason? Or people downloading clips from YouTube to re-edit?

Hopefully this is a chicken-and-egg kind of problem, where now that hardware decoding support is showing up in software, camera manufacturers will start adding options for AV1 recording. I really like how Resolve seems to be very proactive about adding hardware decoding support whenever it is possible.

Posted on 2021-05-12 19:02:19
Joel Hazel

Tested with the new 17.2 on an AMD 3900X CPU and RTX 2060 GPU. It appears the 10 & 12-bit 4:4:4 clips are no longer hardware decoded; the CPU spiked on both. Unless I read this test wrong.

Posted on 2021-05-15 22:24:29
phanter II

Thank you for this

Posted on 2021-05-21 12:55:29
Geoff C. Bassett

Thank you for these incredibly useful charts! Please do this for the M1 Macs as well.

Posted on 2021-05-27 12:30:41
Ampere

https://nvidianews.nvidia.c...

Matt, can you please benchmark RTX 3080 Ti and RTX 3070 Ti for content creation when you get them in your office? Thanks

Posted on 2021-06-01 05:44:50

Yep, once those cards are launched we will have articles for them (they were announced, but are not actually launched for reviews/purchase yet). Hard to say if we will have the testing done right at launch or if it will take a bit - depends on whether we are able to get our hands on pre-launch samples or not.

Posted on 2021-06-01 15:42:43
Darren George

looking forward to it

Posted on 2021-06-14 17:52:31
Darren George

Thank you thank you thank you! I searched so long to find the answer to this and I should have known you all would have been on top of things

Posted on 2021-06-03 02:06:12
Ampere

https://www.nvidia.com/down...

NVIDIA Studio driver 462.65 adds RTX 3080 Ti support, but the Supported Products tab only lists the RTX 30 series; not sure if the older RTX 20 series is supported or not.

RTX 3070 Ti driver comes on June 10th.

Posted on 2021-06-03 12:41:04
Darren George

So now that we know Canon R5 footage can't be hardware decoded on any NVIDIA card, are we proposing a Mac solution or transcoding?

Posted on 2021-06-14 17:53:15

If you mean H.264 4:2:2, I don't believe any Macs have hardware decoding support for that either (even the M1). The solution for that codec if it is being problematic is unfortunately either proxies or complete transcoding prior to editing.
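For anyone wondering what complete transcoding might look like in practice, a minimal sketch using ffmpeg to convert a problem clip to an edit-friendly intraframe codec (assuming ffmpeg is installed; filenames are illustrative):

    :: Transcode 4:2:2 LongGOP footage to DNxHR LB, which decodes easily on the CPU
    ffmpeg -i clip.mp4 -c:v dnxhd -profile:v dnxhr_lb -pix_fmt yuv422p -c:a pcm_s16le clip_dnxhr.mov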

Posted on 2021-06-14 19:00:45
Darren George

Tears! Yes that's what I am currently doing. I am also exploring using an Atomos to avoid having to transcode.

Posted on 2021-06-14 19:33:02
Lost Traveler Nick

You can do it on a Rocket Lake desktop, a Tiger Lake laptop, or even the previous generation of laptop whose name I can't remember.

The M1s do have hardware decoding, but the generation of Quick Sync on Tiger Lake and Rocket Lake supports a few extra things.

Posted on 2021-06-20 11:16:06
Darren George

So here's the thing: I have to prioritize, and to me, getting an Atomos is less costly than getting a new motherboard, CPU, and RAM. Plus, with the Atomos I get a better production workflow on set. So I am thinking of going with the Atomos.

Posted on 2021-06-21 14:36:28
Asaf Blasberg

Thank you Matt.

Posted on 2021-07-04 21:40:08
Giulio Dallatorre

Did somebody test whether it is better to have an 8-core Intel 11th Gen or a 16-core AMD CPU for editing H.265 4:2:2 (price is no object)?

Posted on 2021-07-07 14:23:33

H.265 4:2:2 is going to be significantly better with an Intel 11th Gen CPU since you can utilize Quick Sync to do hardware decoding of that footage. If you went with AMD (or even a higher-end Intel), you would have to brute-force the decoding, which doesn't work well with Long-GOP media. Especially for things like reverse playback and scrubbing, hardware decoding makes a massive difference.
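Similar to the NVDEC check earlier in the thread, you can verify Quick Sync decode support outside of Resolve by forcing ffmpeg's QSV decoder. A minimal sketch, assuming an ffmpeg build with QSV support and a test clip named clip.mp4 (an illustrative name):

    :: Force the Quick Sync HEVC decoder and discard the output; errors out if unsupported
    ffmpeg -c:v hevc_qsv -i clip.mp4 -f null -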

Posted on 2021-07-07 16:43:19
Anish Rajah

Was wondering if Intel's UHD 750 graphics on 11th gen Rocket Lake will be able to decode H.265 4:2:2? Or is it only Iris Xe graphics on 11th gen Tiger Lake?

Posted on 2021-09-17 03:56:19

This is covering desktop hardware, so the UHD 750 should support H.265 4:2:2. I don't think there is a difference with the mobile parts, but that isn't something we have tested.

Posted on 2021-09-17 15:30:17
georgekg

Thanks for this amazing article

Posted on 2021-10-24 16:07:20
AlderLake

https://www.intel.com/conte...

Matt, can you please update this article with Alder Lake's integrated GPU decoding capabilities?

Posted on 2021-11-04 21:06:32

We have some other testing we have to get through first, then we can spend some time verifying hardware decoding functionality. It should end up being the same as 11th Gen, but sometimes there isn't full software support in place, so we want to check it first before updating these charts.

Posted on 2021-11-04 21:31:43
AlderLake

https://www.intel.com/conte...

Note: The driver version numbering has rolled over from 100.9999 to 101.1069. This requires using all 7 digits instead of 4 for the driver build number. For more information, see Understanding the Intel Graphics Driver Version Number.

https://www.intel.com/conte...

Posted on 2021-11-11 20:59:34