Author Topic: How to use gpu for decode and encode in avidemux  (Read 924 times)

Mohammad

  • Newbie
  • *
  • Posts: 8
How to use gpu for decode and encode in avidemux
« on: July 23, 2019, 07:41:28 PM »
Hello,
I have an AMD RX 580.
I need full instructions on how to use my GPU as the video decoder/encoder in Avidemux.
Thank you, guys.
It really has made me curious.
I have a very weak CPU, so if I learn how to use my GPU for encoding it will help me a lot.

eumagga0x2a

  • Moderator
  • Hero Member
  • *****
  • Posts: 3379
Re: How to use gpu for decode and encode in avidemux
« Reply #1 on: July 23, 2019, 08:40:58 PM »
Which operating system? On Windows, HW-accelerated decoding (for H.264 and HEVC only) might be available via the DXVA2 interface; just enable it in the Avidemux settings and see whether it works or not. Please try the latest build right away instead of starting with the ageing release – either a MinGW one (currently up to date) or a VC++ one (which should be updated tomorrow to the current git master).

HW-accelerated encoding using an AMD GPU is not implemented in Avidemux and is unlikely to arrive soon, as none of the Avidemux developers and contributors possess the necessary hardware.

To use HW-accelerated decoding and encoding right away, swap the Radeon for a fairly recent NVIDIA card.

Mohammad

  • Newbie
  • *
  • Posts: 8
Re: How to use gpu for decode and encode in avidemux
« Reply #2 on: July 23, 2019, 09:40:46 PM »
Quote
Which operating system? On Windows, HW-accelerated decoding (for H.264 and HEVC only) might be available via the DXVA2 interface; just enable it in the Avidemux settings and see whether it works or not. Please try the latest build right away instead of starting with the ageing release – either a MinGW one (currently up to date) or a VC++ one (which should be updated tomorrow to the current git master).

HW-accelerated encoding using an AMD GPU is not implemented in Avidemux and is unlikely to arrive soon, as none of the Avidemux developers and contributors possess the necessary hardware.

To use HW-accelerated decoding and encoding right away, swap the Radeon for a fairly recent NVIDIA card.
My operating system is Windows 10, it is up to date, and I'm using the latest version of Avidemux.
You mean there is no way to use an AMD GPU for encoding/decoding in Avidemux?
There is a checkbox under HW Accel in the preferences – "Decode video using XVBA (AMD)" – but it is greyed out and cannot be checked.
How can I use that XVBA entry in HW Accel?
Does it require some sort of driver to turn it on?
Thank you for your help.

eumagga0x2a

  • Moderator
  • Hero Member
  • *****
  • Posts: 3379
Re: How to use gpu for decode and encode in avidemux
« Reply #3 on: July 23, 2019, 11:00:52 PM »
1. Please use the latest nightly as pointed out above; the last release is pretty outdated. The confusing dead HW-accel entries have been removed from the preferences since then.

2. You should be able to use HW-accelerated decoding. Enable DXVA2 and check whether DXVA2 is displayed as the decoder in the Avidemux GUI when you play an H.264 video. I don't have an AMD graphics card to try it myself, so please try and report back.

3. HW-accelerated encoding on AMD GPUs should in principle be feasible, as FFmpeg now supports it, if someone with AMD hardware and the appropriate skills contributes code to implement it.

Mohammad

  • Newbie
  • *
  • Posts: 8
Re: How to use gpu for decode and encode in avidemux
« Reply #4 on: July 23, 2019, 11:33:40 PM »
Quote
1. Please use the latest nightly as pointed out above; the last release is pretty outdated. The confusing dead HW-accel entries have been removed from the preferences since then.

2. You should be able to use HW-accelerated decoding. Enable DXVA2 and check whether DXVA2 is displayed as the decoder in the Avidemux GUI when you play an H.264 video. I don't have an AMD graphics card to try it myself, so please try and report back.

3. HW-accelerated encoding on AMD GPUs should in principle be feasible, as FFmpeg now supports it, if someone with AMD hardware and the appropriate skills contributes code to implement it.
1 - You mean there is no way to use that HW Accel entry?
2 - I checked DXVA2, and when I opened an HEVC video it was shown under the video decoder in the GUI, but I cannot configure it, and when I decode a video it still uses my goddamn CPU.
« Last Edit: July 23, 2019, 11:36:38 PM by Mohammad »

eumagga0x2a

  • Moderator
  • Hero Member
  • *****
  • Posts: 3379
Re: How to use gpu for decode and encode in avidemux
« Reply #5 on: July 23, 2019, 11:45:50 PM »
Quote
1 - You mean there is no way to use that HW Accel entry?

Which one? XVBA? As already explained, it is a) dead, b) Linux-only, and c) removed from the current Avidemux GUI (and also not compiled on Linux anyway).

Quote
2 - I checked DXVA2 and it is shown under the video decoder in the GUI, but I cannot configure it, and when I decode a video it still uses my goddamn CPU.

What are you going to configure? If you turned it on in the preferences and it is shown at the leftmost position in the main window (and please enable the DXVA2 display too!), it is fine. CPU load will not drop to zero, as Avidemux has to download all the decoded video data from the decoder unit on the GPU and then re-upload them to the display unit. A fully contained HW-accel path, where decoded data don't have to leave the graphics card, is not yet implemented for DXVA2, but it is high on the todo list.

Mohammad

  • Newbie
  • *
  • Posts: 8
Re: How to use gpu for decode and encode in avidemux
« Reply #6 on: July 24, 2019, 12:06:57 AM »
Something is wrong with it.
The GPU video-decode usage (as shown in the Windows 10 Task Manager) is only 2 percent during decoding, but my CPU usage is 99 percent.
The important thing is that my GPU can decode 4K x265 30 fps video at about 3 megs per second of bitrate during 4K gaming without any lag, which means it is strong at encoding/decoding.
But when I started decoding a video (with exactly the settings you described, and with DXVA2 being shown where you said, too), only 5 frames per second were being decoded. It was very, very slow, and as I said, CPU usage was 99 percent whereas GPU decode usage was only 2.
« Last Edit: July 24, 2019, 12:12:55 AM by Mohammad »

eumagga0x2a

  • Moderator
  • Hero Member
  • *****
  • Posts: 3379
Re: How to use gpu for decode and encode in avidemux
« Reply #7 on: July 24, 2019, 12:23:18 AM »
Please restart Avidemux, load a video, play it for a second or two, close Avidemux, then compress (zip or 7zip) and attach admlog.txt from %localappdata%\avidemux\

(%localappdata% is a so-called environment variable; enter it into the location bar of Windows Explorer to go to the directory it points at.)
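If scripting is easier than clicking around, the log can also be located and zipped with a short Python sketch. This is only an illustration: the `zip_log` helper is made up for this post, and the log path is simply the default location named above.

```python
import os
import zipfile

def zip_log(log_path: str, archive_path: str) -> str:
    """Compress a single file into a zip archive suitable for a forum attachment."""
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        # Store only the file name inside the archive, not the full directory path.
        zf.write(log_path, arcname=os.path.basename(log_path))
    return archive_path

if __name__ == "__main__":
    # Default Avidemux log location on Windows, as given in the post above.
    log = os.path.join(os.environ.get("LOCALAPPDATA", ""), "avidemux", "admlog.txt")
    if os.path.isfile(log):
        print(zip_log(log, "admlog.zip"))
```

Run it after closing Avidemux; the resulting admlog.zip is what you would attach to the forum post.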

dosdan

  • Full Member
  • ***
  • Posts: 157
Re: How to use gpu for decode and encode in avidemux
« Reply #8 on: July 24, 2019, 12:24:01 AM »
I've got an old i5 and a fairly humble GT640 video card. Here's DXVA2 in operation as I play (Copy/Copy) an MKV (not much CPU use) with VP9+Opus using ADM 190630 VC++:



When I transcode to AVC+AAC, CPU goes to 99% and GPU usage for ADM is empty.

I also tried to encode to Nvidia H264 (I know the quality of this is lower than x264), but it told me that the bitrate was possibly too low (even with a high bitrate selected).

Dan.
« Last Edit: July 24, 2019, 12:32:15 AM by dosdan »

eumagga0x2a

  • Moderator
  • Hero Member
  • *****
  • Posts: 3379
Re: How to use gpu for decode and encode in avidemux
« Reply #9 on: July 24, 2019, 12:26:35 AM »
DXVA2 is enabled only for H.264 and HEVC (and requires a very recent version of the Intel graphics driver when using an Intel GPU; it should be fine on Nvidia).

Mohammad

  • Newbie
  • *
  • Posts: 8
Re: How to use gpu for decode and encode in avidemux
« Reply #10 on: July 24, 2019, 12:29:58 AM »
Thank you for your help.
By the way,
if you find a way to use AMD for decoding, please inform me.
It'll be really helpful.
Thanks again.

Mohammad

  • Newbie
  • *
  • Posts: 8
Re: How to use gpu for decode and encode in avidemux
« Reply #11 on: July 24, 2019, 12:32:58 AM »
Quote
DXVA2 is enabled only for H.264 and HEVC (and requires a very recent version of the Intel graphics driver when using an Intel GPU; it should be fine on Nvidia).

Yes,
I decoded both HEVC and x264 video, but nothing considerable happened.

eumagga0x2a

  • Moderator
  • Hero Member
  • *****
  • Posts: 3379
Re: How to use gpu for decode and encode in avidemux
« Reply #12 on: July 24, 2019, 06:36:53 PM »
My comment was related to dosdan's reply.

Quote
By the way, if you find a way to use AMD for decoding, please inform me.

This is very unlikely to happen on my part, as I don't plan to install an AMD graphics card into my refreshed PC, due when support for Windows 7 ends. Thus the only recommendation I can give you, if you want to be able both to decode and encode H.264 and HEVC in hardware, is to replace AMD with NVIDIA (but NOT a GeForce GT 1030, as the GP108 chip doesn't include an encoder unit).

In any case, you should be aware of the very poor compression level of HW encoders in comparison to the x264 and x265 software encoders. The speed is great, but decent quality requires very high bitrates.
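To put the bitrate point into rough numbers, here is a back-of-the-envelope calculation. The bitrates below are illustrative assumptions, not measurements; the gap between software and hardware encoders varies by encoder, generation, and content.

```python
def size_mb(bitrate_kbps: float, duration_s: float) -> float:
    """File size in megabytes for a given video bitrate and duration."""
    return bitrate_kbps * duration_s / 8 / 1000

# Hypothetical example: a 10-minute 1080p clip.
# Suppose x264 looks good at ~4000 kb/s while a HW encoder needs ~8000 kb/s
# for comparable quality (illustrative figures only).
sw = size_mb(4000, 600)   # software encode: 300.0 MB
hw = size_mb(8000, 600)   # hardware encode: 600.0 MB
print(sw, hw)
```

The point is simply that at the bitrates where HW encoders start to look acceptable, the files come out considerably larger than a software encode of similar quality.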

Mohammad

  • Newbie
  • *
  • Posts: 8
Re: How to use gpu for decode and encode in avidemux
« Reply #13 on: July 24, 2019, 07:54:05 PM »
Quote
My comment was related to dosdan's reply.

Quote
By the way, if you find a way to use AMD for decoding, please inform me.

This is very unlikely to happen on my part, as I don't plan to install an AMD graphics card into my refreshed PC, due when support for Windows 7 ends. Thus the only recommendation I can give you, if you want to be able both to decode and encode H.264 and HEVC in hardware, is to replace AMD with NVIDIA (but NOT a GeForce GT 1030, as the GP108 chip doesn't include an encoder unit).

In any case, you should be aware of the very poor compression level of HW encoders in comparison to the x264 and x265 software encoders. The speed is great, but decent quality requires very high bitrates.
As someone who has used both NVIDIA and AMD GPUs,
I tell you AMD is better.
Really better.
It gives you a flexible driver with which you can overclock your GPU to its best – and not only new graphics cards, but old ones too.
You will understand the variety of its settings if you see the AMD driver yourself.
You can control your graphics card as much as possible.
You can even fully control your graphics card fan (as you know, GPU fans otherwise never reach more than 85 percent of their speed).
In the end, with the AMD driver I reached 13.5 percent more frames per second in FurMark.
The RX 580 is not a high-end graphics card, but it can record 4K video smoothly with no problem, and its 4K gaming is fantastic too.
The only problem I have with this graphics card is that I don't know how to use its decode/encode feature properly.
It has to be mentioned that it is the best graphics card at its price; it is better than the 1060 6GB.
I wrote all of the above just to inform you that AMD is not as bad as you think.
Thank you for your help again. It helps me a lot.