VDPAU hardware decoding with AMD

Started by betters, January 19, 2023, 05:12:10 PM



I have been working on this topic for some time, first on a Ryzen 3200G and now on a 5600G (both with integrated Vega GPUs). I went down many dead ends, mostly because the official word was that getting it working would be a lot of work and there was little developer interest. Anyway, you will be pleased to know that I have been able to get hardware decoding working (crucially, for me at least) both for display and for accelerated decoding during the encoding (save) process. This is on Linux Mint 21.0; I have no access to Windows.
I am kind of disappointed by the official line on AMD GPUs, because it proves to be very simple in the end - at least for the two APUs that I have tried. I am very pleased with the results. On the 3200G, with software decoding and encoding, I was lucky to get 2X encoding speeds, often less. With the acceleration, I got around 3X. You can see it happening in the fps figures that Avidemux shows. On top of that (again, for me at least), the picture quality is improved significantly.
Obviously, I can't speak for anyone with a discrete AMD GPU, but I don't see why they would not work in theory, because what I have done is not specific to the two APUs I used. In fact, I was more wondering 'will these drivers support these relatively recent models?'.

So, what do you do? First, check that you have the Mesa drivers installed. Mine installed to /usr/lib/x86_64-linux-gnu/vdpau (you get other drivers too, but it is the VDPAU drivers we want, as they are as specific to the purpose as the NVIDIA VDPAU drivers are). Have a look in the vdpau folder and see if there is a driver for your GPU. For my purposes the radeonsi driver is the one (as it is for most recent GPUs). That was not too hard, was it? You will be pleased to know that all you then have to do is add one line to your environment. I did this in my .bashrc script, as I am the sole user of the PC. The line is (minus the quotes) 'export VDPAU_DRIVER=radeonsi'. Then the variable will be set every time you start Avidemux (which itself starts via a bash script).
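The steps above can be sketched as follows (the path is the Mint/Ubuntu multiarch location; other distros may use /usr/lib64/vdpau or similar):

```shell
# 1. See which Mesa VDPAU drivers are installed
#    (expect entries such as libvdpau_radeonsi.so.1)
ls /usr/lib/x86_64-linux-gnu/vdpau/ 2>/dev/null || true

# 2. Tell libvdpau which driver to load, for this user, on every login
echo 'export VDPAU_DRIVER=radeonsi' >> "$HOME/.bashrc"
```

Open a new terminal (or run `. ~/.bashrc`) and start Avidemux from it, so the variable is present in its environment.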

That should be all you need to do. All it took was finding out how VDPAU works.

One very important thing, though: this may not work with an AppImage (which seem to be far more popular than they should be). That is because an AppImage is a self-contained environment of its own, with libraries not in the usual places. It might work, but I just don't know. Your environment variable has to resolve to radeonsi in /usr/lib/x86_64-linux-gnu/vdpau, because that is where VDPAU looks. So, you have to download the sources and build Avidemux locally. It sounds tough, but it really is not.
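As a quick sanity check, libvdpau derives the library filename from the variable as libvdpau_&lt;driver&gt;.so.1, so you can confirm the file exists where it will look. A minimal sketch, assuming the Mint/Ubuntu multiarch path (other distros differ):

```shell
# Check that the driver libvdpau will try to load actually exists
DRV=radeonsi    # whichever driver name you put in VDPAU_DRIVER
LIB="/usr/lib/x86_64-linux-gnu/vdpau/libvdpau_${DRV}.so.1"
if [ -e "$LIB" ]; then
    echo "found $LIB"
else
    echo "missing $LIB - an AppImage or non-standard prefix will not see it"
fi
```

The vdpauinfo tool (package `vdpauinfo` on Mint/Ubuntu) also reports which driver was actually loaded and which decode profiles it supports, which is a good way to confirm the acceleration is really available.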

I hope this helps a few thousand of you!



Just a small (but important) addendum to my previous post. The decoding had always worked with the 3200g, but with the 5600g it was hit and miss. I finally found the reason, and I could kick myself. I noticed in some notes in my .bashrc file that I had originally used the rx600 VDPAU driver. Like a fool, I had then tried the radeonsi driver as being more likely to support the newer GPU. I switched back to rx600 two days ago and have done over 10 accelerated encodes with zero problems. In fact, I have done 4 today of around 7 hours each at a 6.7X rate.
I switched to using Quicksync for the display, as I thought that might eliminate VDPAU acceleration from everything but encoding. I do not know if this matters, as I don't feel like changing it back now that it is working.
A final point (which may be related to using Quicksync) is that I needed to put the VDPAU_DRIVER=rx600 line in the /etc/environment file to make it globally available.
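For reference, /etc/environment is read by pam_env rather than by a shell, so it takes plain KEY=value lines with no 'export' keyword. A minimal sketch, using the driver name from my setup (yours may be radeonsi):

```
VDPAU_DRIVER=rx600
```

You will need to log out and back in (or reboot) for a change to /etc/environment to take effect.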