Avidemux Forum

Avidemux => Main version 2.6 => Topic started by: Quaternions on August 16, 2019, 11:14:02 AM

Title: Multiple frame blending
Post by: Quaternions on August 16, 2019, 11:14:02 AM
Hello, I would like to combine 100 sequential frames at a time of a slow-motion video of a video game into 1 frame to create a high-quality motion blur effect.  I was surprised that I couldn't find an ffmpeg filter to do the job.  Can Avidemux be used to do this?  I spent all day yesterday and achieved this effect in this video https://www.youtube.com/watch?v=6BItW_bPSkI by exporting all the frames as images and adding them together with Mathematica, which is supposed to be math software...  I know the process, but I can't find software that can do it.  I use Avidemux to step through video frames and clip videos, but I didn't find a filter that could do this.

Thanks!
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 16, 2019, 04:22:13 PM
I would like to combine 100 sequential frames at a time of a slow-motion video of a video game into 1 frame to create a high-quality motion blur effect.  I was surprised that I couldn't find an ffmpeg filter to do the job.  Can Avidemux be used to do this?

If someone writes a video filter plugin to accomplish this task and the PC has quite a lot of memory (about 12 MiB per frame at 2160p in 4:2:0 with 8-bit color depth, times 100, just to hold the uncompressed picture data), Avidemux should be able to do this.

If C++ is not a complete stranger to you, you could implement a filter as a subclass of ADM_coreVideoFilterCached, similar to the FadeTo filter (https://github.com/mean00/avidemux2/blob/master/avidemux_plugins/ADM_videoFilters6/fadeTo/ADM_vidFadeTo.cpp), though the largest cache currently requested by a video filter constructor in Avidemux contains just 11 pictures. The job is done by the respective implementation of the getNextFrame() method.
Title: Re: Multiple frame blending
Post by: Quaternions on August 17, 2019, 07:13:25 AM
I'm not terribly familiar with C++, but I wrote some code using yours and other plugins as reference:
https://github.com/krakow10/Avidemux-FrameBlend/blob/master/ADM_vidBlendFrames.cpp

The code is designed to use a high-bit-depth buffer frame and sum color data into it for each input frame, which would avoid storing all the frames in memory, but it requires a uint32_t-type ADMImage which I didn't know how to go about creating.  I haven't the faintest clue how to compile it or add it to my own build.  I should probably also write a version that just fills memory with an array of ADMImage and sums them at the end, but I don't remember how to make a custom-sized list of pointers, or how to specify that type of value in the AVDM_BlendFrames class.

Also I did not figure out what this part means:
ADM_coreVideoFilterCached(3,in,setup)
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 17, 2019, 10:14:14 AM
I'm not terribly familiar with C++, but I wrote some code using yours and other plugins as reference:
https://github.com/krakow10/Avidemux-FrameBlend/blob/master/ADM_vidBlendFrames.cpp

Great start!

Quote
I haven't the faintest clue how to compile it or add it to my own version.

Do you already have a development environment for Avidemux, i.e. preferably a Linux installation (Fedora or Ubuntu; a VM would suffice)? Any other platform adds a moderate (macOS) or huge (Windows 10) burden of additional complexity.

Quote
Also I did not figure out what this part means:
ADM_coreVideoFilterCached(3,in,setup)

https://github.com/mean00/avidemux2/blob/d48b5004a1e20ad653e4562de783761158c63192/avidemux_core/ADM_coreVideoFilter/include/ADM_coreVideoFilter.h#L89

= create a video filter with a cache for 3 pictures, with the parent filter (where the source image comes from) pointed to by in and the configuration pointed to by setup.
Title: Re: Multiple frame blending
Post by: Quaternions on August 17, 2019, 02:38:02 PM
I realized that I haven't done any PTS editing for the frame times; I'm going to try to figure that out now.  I've got an Ubuntu server which I've compiled some golang code on before, and which I mainly use for Plex.  Which implementation do you think is best for Avidemux: accumulating color in a high-bit-depth buffer, or loading the frames into memory?  In the first case I don't know how to edit the ADMImage class to support uint32_t (or whether it can do higher bit depths already), and for the latter I don't know how to create a spot to hold an arbitrary number of frames.  I think that if this were to become a polished plugin, the frame blending and time scaling should be controllable separately, and perhaps also be extended to fractional blending to widen the applications.  I just want to make the high-quality motion blur effect that I imagined easier for me to apply in practice.

Edit: Perhaps I could hardcode my specific 100 frame need?  Specify ADM_coreVideoFilterCached(100,in,setup) and then use vidCache->getImage on fn%100==100 frames?  Maybe that's what you were thinking originally and I just now understood
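One detail worth noting about the modulo idea: fn % 100 can never equal 100 (a remainder modulo 100 ranges over 0..99), so the emit condition would have to be something like (fn + 1) % 100 == 0. A tiny sketch of that counting pattern (function name is illustrative, not Avidemux API):

```cpp
#include <cstdint>

// While stepping through input frames with a counter `fn` (0-based),
// a blended output frame should be emitted after every N-th input,
// i.e. on frames N-1, 2N-1, 3N-1, ... Note that `fn % N == N` is
// never true, since fn % N is always strictly less than N.
bool emitAfterInputFrame(uint32_t fn, uint32_t N)
{
    return (fn + 1) % N == 0;
}
```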
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 17, 2019, 03:08:06 PM
I guess a simple solution without any PTS / FPS modifications, based on a filter cache to hold pictures, would be more useful.  The drawback: we have to request the full size of the cache at filter creation.

Addition of luminance and chrominance values could be done using a lookup table like in the fadeTo filter.
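As an illustration of the lookup-table idea (a sketch of the general technique, not the actual fadeTo code): precompute the weighted result once per possible 8-bit value, then process whole planes with table lookups instead of per-pixel multiplications:

```cpp
#include <cstdint>
#include <cstddef>

// A fadeTo-style lookup table: 256 precomputed entries replace a
// floating-point multiply per pixel with a single array lookup.
struct WeightLut {
    uint16_t lut[256];
    explicit WeightLut(double weight)
    {
        for (int v = 0; v < 256; v++)
            lut[v] = (uint16_t)(v * weight + 0.5); // rounded
    }
    // Add the weighted source samples into a wide accumulator plane.
    void accumulate(const uint8_t *src, uint32_t *acc, size_t n) const
    {
        for (size_t i = 0; i < n; i++)
            acc[i] += lut[src[i]];
    }
};
```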

To reduce FPS, the changeFps filter can be added to the chain. If you prefer an all-in-one solution, that filter could serve as reference for PTS handling.
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 17, 2019, 03:12:51 PM
Edit: Perhaps I could hardcode my specific 100 frame need?  Specify ADM_coreVideoFilterCached(100,in,setup) and then use vidCache->getImage on fn%100==100 frames?  Maybe that's what you were thinking originally and I just now understood

Yes, sort of. You have to  :)

As we already have that cache with all frames in the memory, let's use them. For a true high quality motion blur effect, the filter should be able to detect scene changes (no motion blur on these!), so don't aim too high right now.

edit: This scene-change stuff should probably be of no concern at this phase. Making the filter partializable would allow the user to specify when exactly a scene change happens (as long as there are not too many of them). But for this, any timing modifications would make it incredibly hard, or rather impossible, to find out the exact range where the filter should start and where it should end.
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 17, 2019, 03:15:49 PM
The last remark for now: currently you don't output any picture at all (ADMImage *image remains unset).

Quote
In the first case I don't know how to edit the ADMImage class to support uint32_t (or if it can do higher bit depths already)

I don't think this would be viable. Maybe just create a three-dimensional array of uint32_t elements, as you probably need just a buffer without all the methods ADMImage provides.
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 17, 2019, 04:12:28 PM
Making the filter partializable would allow the user to specify when exactly the scene change happens

My mistake, video filters using cache can't be made partial, I'm sorry.
Title: Re: Multiple frame blending
Post by: Quaternions on August 17, 2019, 04:51:39 PM
I wrote a uint32_t buffer but I don't think it will get deleted nicely in the ~AVDM_BlendFrames function.  The way I've implemented it probably doesn't need to extend ADM_coreVideoFilterCached anymore does it?  What should my first line in the getNextFrame function be?

Edit: I think I get it... pushing attempted non-cached code

This is how I think scene changes should work:
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 17, 2019, 07:22:40 PM
Edit: I think I get it... pushing attempted non-cached code

Yes, sure. I had a different approach in mind, that is why I recommended caching.

You would probably need to call previousFilter->getNextFrame() in a loop until you have accumulated N pictures or the call has returned false in which case it might make sense to output whatever is in the buffer.

Instead of a 3D array, it seems to me now that it would be much handier to allocate just one chunk of memory, stride*height*3 in size, and then delete [] it in the dtor.
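A sketch of that single-allocation approach (the class name and layout are illustrative, and treating all three planes as equal-sized is a simplification; real 4:2:0 chroma planes are a quarter the size of luma):

```cpp
#include <cstdint>
#include <cstring>
#include <cstddef>

// One flat allocation covering all three planes (stride*height*3 as
// suggested above), released with delete[] in the destructor.
class BlendBuffer {
    uint32_t *acc;        // accumulated per-sample sums
    size_t planeSize;     // samples per plane (simplified)
    size_t frames;        // number of frames accumulated so far
public:
    BlendBuffer(size_t stride, size_t height)
        : acc(new uint32_t[stride * height * 3]),
          planeSize(stride * height), frames(0)
    {
        memset(acc, 0, stride * height * 3 * sizeof(uint32_t));
    }
    ~BlendBuffer() { delete[] acc; }
    void addFrame(const uint8_t *y, const uint8_t *u, const uint8_t *v)
    {
        const uint8_t *planes[3] = {y, u, v};
        for (int p = 0; p < 3; p++)
            for (size_t i = 0; i < planeSize; i++)
                acc[p * planeSize + i] += planes[p][i];
        frames++;
    }
    // Write the averaged luma plane into `out`.
    void averageY(uint8_t *out) const
    {
        for (size_t i = 0; i < planeSize; i++)
            out[i] = (uint8_t)(acc[i] / frames);
    }
    size_t count() const { return frames; }
};
```

The getNextFrame() implementation would call addFrame() for each picture pulled from the parent filter, then average into the output image once enough frames have accumulated (or the parent reports no more frames).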
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 17, 2019, 07:35:21 PM
I've got a ubuntu server which I've compiled some golang code on before that I mainly use for Plex.

Avidemux won't run headless, so you had better install Fedora Workstation or the regular desktop Ubuntu 19.04, either on bare metal (which will greatly improve performance and comfort unless you've got some really nasty, poorly supported hardware) or in a VM. Avidemux uses CMake, so your filter should be added to https://github.com/mean00/avidemux2/blob/master/avidemux_plugins/ADM_videoFilters6/CMakeLists.txt, and a CMakeLists.txt should be created in the top directory of the new filter. You should also run https://github.com/mean00/avidemux2/blob/master/cmake/admSerialization.py from within the filter directory to generate blend.h and blend_desc.cpp, which are included in the filter source.
Title: Re: Multiple frame blending
Post by: Quaternions on August 17, 2019, 09:04:14 PM
Ubuntu server meaning Ubuntu desktop that I use as a Plex server :P

I cloned the repo and put my code into
Code: [Select]
/home/quat/Documents/avidemux2/avidemux_plugins/ADM_videoFilters6/blend
and ran the python script on blend.conf.  A friend of mine helped me with the first cmake error, for which I had to install nasm, but he's not here to help me with this one: https://hastebin.com/uxogerilop.txt

I also changed the buffer to use three flat arrays.

Does returning false from getNextFrame signal that no more frames are available?  Looping would make sense if that's the case.

Also made another short clip with Mathematica and ffmpeg https://www.youtube.com/watch?v=iLiWxcv_Q3w
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 17, 2019, 09:06:01 PM
CleanTalk going wild, as usual. Please try from a private browser window or from a different IP address, when possible.
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 17, 2019, 09:34:55 PM
I cloned the repo and put my code into
Code: [Select]
/home/quat/Documents/avidemux2/avidemux_plugins/ADM_videoFilters6/blend
and ran the python script on blend.conf.  A friend of mine helped me with the first cmake error, for which I had to install nasm, but he's not here to help me with this one: https://hastebin.com/uxogerilop.txt

Have you installed all deps by running the following?

Code: [Select]
bash createDebFromSourceUbuntu.bash --deps-only

You must also build (just build, not install to the prefix) Avidemux prior to trying to build your filter.

Quote
Does returning false from getNextFrame signal that no more frames are available?

Yes.
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 17, 2019, 09:42:08 PM
Also made another short clip with Mathematica and ffmpeg https://www.youtube.com/watch?v=iLiWxcv_Q3w

It would be interesting to compare this clip with the original video simply downsampled to e.g. 60 fps without motion blur.
Title: Re: Multiple frame blending
Post by: Quaternions on August 17, 2019, 10:35:16 PM
I had not installed the dependencies, but the error is still the same.  Running cmake on avidemux_core yet again, farther up in the console it says
Code: [Select]
-- Checking for sqlite
-- *******************
-- FATAL_ERRORCould not find SQLite

I tried sudo apt-get install sqlite, which installed something, but no change
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 17, 2019, 10:47:48 PM
In the avidemux2 directory:

Code: [Select]
bash createDebFromSourceUbuntu.bash --deps-only
bash bootStrap.bash
Title: Re: Multiple frame blending
Post by: Quaternions on August 17, 2019, 11:00:20 PM
The bootstrap command is running

The poor little quad-core 3rd-gen i5 is pinned...
In the meantime, here are some older comparison clips:
1:1
https://www.youtube.com/watch?v=lpvh4nTshes
1:4
https://www.youtube.com/watch?v=hZCXNVUnMFE
1:10
https://www.youtube.com/watch?v=wYKrt1U9BB8

And the one you requested, processed in time to post it too:
https://www.youtube.com/watch?v=g87Mw6Ure7I

Hmm I put FrameBlend into the CMakeLists when the file is BlendFrames... time to run it again
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 17, 2019, 11:05:27 PM
Once Avidemux (especially the core) has been built, you can use

Code: [Select]
bash bootStrap.bash --rebuild --without-core

(optionally also "--without-qt --without-cli") to speed up the build and to skip already installed (into avidemux2/install) components.
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 17, 2019, 11:13:20 PM
In the meantime, here are some older comparison clips

The 1x4 Sevaii Nougat clip looks the best for me.
Title: Re: Multiple frame blending
Post by: Quaternions on August 17, 2019, 11:18:50 PM
Yess!!!!! I am getting errors for my code!  Now I can start debugging

I thought the x10 version looked slick, but I was underwhelmed when I made the first 1:100 version that I linked in my original post.  I am hoping that it was just because the player was not particularly fast, and that when I use the effect on other runs they will look better.
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 17, 2019, 11:29:22 PM
I wonder if giving the oldest pictures less weight (i.e. not just adding the values, but adding values multiplied by a factor) would result in a more natural motion blur effect.
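A sketch of what such weighting could look like (illustrative code, not part of the plugin; a decay factor of 1.0 degenerates to the plain equal-weight average):

```cpp
#include <cstdint>
#include <vector>
#include <cmath>

// Weighted variant of the blend: older frames contribute less.
// `decay` is in (0, 1]; the newest (last) frame always has weight 1,
// a frame that is `age` steps older has weight decay^age.
std::vector<uint8_t> weightedBlend(const std::vector<std::vector<uint8_t>> &frames,
                                   double decay)
{
    const size_t n = frames.size(), size = frames[0].size();
    std::vector<double> acc(size, 0.0);
    double wsum = 0.0;
    for (size_t f = 0; f < n; f++) {
        double w = std::pow(decay, (double)(n - 1 - f));
        wsum += w;
        for (size_t i = 0; i < size; i++)
            acc[i] += w * frames[f][i];
    }
    std::vector<uint8_t> out(size);
    for (size_t i = 0; i < size; i++)
        out[i] = (uint8_t)(acc[i] / wsum + 0.5); // normalize and round
    return out;
}
```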
Title: Re: Multiple frame blending
Post by: Quaternions on August 17, 2019, 11:51:25 PM
Wow, it compiled.  I didn't expect to reach this point.  Can't wait to see what sort of runtime errors will pop up!

Equal weighting is more or less how objects emitting or reflecting light behave in real life, so equal weights should give the most natural look.  Custom weights might make a cool stylistic effect, though.

How do I run Avidemux on Ubuntu?

Edit: did ./install/usr/bin/avidemux3_qt5 and it says
Code: [Select]
/install/usr/bin/avidemux3_qt5: error while loading shared libraries: libADM_coreVideoCodec6.so: cannot open shared object file: No such file or directory
Edit: GOT IT!!! Had to move to the install directory.  Now to test...
Title: Re: Multiple frame blending
Post by: Quaternions on August 18, 2019, 12:19:05 AM
THIS IS AMAZING!!!! I HAVE NEVER DONE ANYTHING LIKE THIS!!!!
Title: Re: Multiple frame blending
Post by: Quaternions on August 18, 2019, 01:20:50 AM
I made it not green and fixed the reported duration; this is a fully fledged working plugin!
I would like to use this on Windows, but I doubt I would be able to compile it as easily.  What are the chances of this plugin making it into a Windows build near me?

I recorded this in 120fps a long time ago, perfect test material:
https://www.youtube.com/watch?v=oD-BWiBQV3Q (https://www.youtube.com/watch?v=oD-BWiBQV3Q)
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 18, 2019, 07:11:41 AM
Congratulations!

How do I run Avidemux on Ubuntu?

If you don't want to install it (for a system-wide install without proper packaging, I'd recommend specifying /usr/local as --prefix during the bootStrap.bash run; this would require a full build, however), you can place run_avidemux_template.sh in a directory included in $PATH, rename it for convenience, edit it to match the location of the avidemux2 folder, and make the script executable.

For instructions on how to cross-compile Avidemux for Windows, please see cross-compiling.txt in the avidemux2 directory.  It could take a couple of hours for all the components of MXE to compile before you are able to run bootStrapCrossMingwQt5_mxe.sh.
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 18, 2019, 07:31:28 AM
fixed the reported duration

You should take care of all members of the FilterInfo struct you modify, in this case frameIncrement too.
Title: Re: Multiple frame blending
Post by: Quaternions on August 18, 2019, 12:43:11 PM
The FrameIncrement remains the same.  I am going to install Ubuntu onto my 7700K projector PC to attempt the Windows compile without it taking a day.  Even if I don't manage to compile it for Windows, I can still use that machine to run the plugin.  Do you think that I would be able to send the plugin dll to people after it's compiled for Windows?  Or is it all hopelessly intertwined?
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 18, 2019, 01:05:09 PM
The FrameIncrement remains the same.

Oh, my bad, I see it now. I still unconsciously fall back to my original idea of a motion blur filter which doesn't alter the perceived speed of the video.

Quote
I am going to install ubuntu onto my 7700k projector PC to attempt to compile for windows and not take a day.

Just the initial setup of the environment is time-consuming; the cross-build itself is not, or not significantly, slower than a native one.

Quote
Even if I don't manage to compile it for windows, I can still use it for using the plugin.  Do you think that I would be able to send the plugin dll to people after it's compiled for windows?

For those who run MinGW-compiled Avidemux builds, provided e.g. via https://avidemux.org/nightly/win64/, and as long as the source remains available or is included with the dll, why not?

But why not submit the new filter upstream?
Title: Re: Multiple frame blending
Post by: Quaternions on August 18, 2019, 02:25:20 PM
nvEncodeAPI.h is no longer available in ffmpeg at the location pointed to by cross-compiling.txt.  Do I need to find an updated one, or is the one that I downloaded from ffmpeg version 3.4 fine to use?

But why not submit the new filter upstream?
What does this mean?  If it means submitting the filter to the main project, I don't know how to do that!
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 18, 2019, 02:52:14 PM
nvEncodeAPI.h is no longer available in ffmpeg at the location pointed to by cross-compiling.txt

I forgot to update the instructions; this and other headers are now provided by https://github.com/FFmpeg/nv-codec-headers

Quote
do I need to find an updated one or is the one that I downloaded from ffmpeg version 3.4 fine to use?

If you have a NVIDIA graphics card capable of encoding H.264 and maybe even HEVC and want to be able to use this capability from Avidemux, then you need a recent version for use with recent NVIDIA graphics drivers. Otherwise you don't need the header at all.

Quote
But why not submit the new filter upstream?
What does this mean?  If it means submitting the filter to the main project, I don't know how to do that!

Actually, fork mean00's Avidemux repository to your GitHub account

https://help.github.com/en/articles/fork-a-repo

then clone this fork, configure git to know about you (name, email address, etc.) and about the upstream repository, create a new branch, add the files belonging to the filter and any other files you have modified, write a commit message, push the branch to your GitHub account, and submit a pull request from there.
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 18, 2019, 03:15:40 PM
Regarding nv-codec-headers: installation of these headers in MXE amounts to just removing @@PREFIX@@ from ffnvcodec.pc.in (as the prefix is empty), stripping ".in" from the filename, copying the edited file into the usr/x86_64-w64-mingw32.shared/lib/pkgconfig subdirectory of MXE for 64-bit builds, and copying the folder "ffnvcodec" to usr/x86_64-w64-mingw32.shared/include/.
Title: Re: Multiple frame blending
Post by: Quaternions on August 18, 2019, 04:40:32 PM
I can't seem to get NVENC to be included.  I added the updated file to both the i686 and x86_64 .shared include directories, and did mkdir out and chmod 775 out in both .shared directories.  I would like to use this feature.  Thanks for your explanation of nv-codec-headers, I've got it working now!

I wasn't able to copy just the libADM_vf_blend.dll to the installed version, but I didn't expect that to work anyways

Also, I modified the max bitrate the GUI allows in https://github.com/mean00/avidemux2/blob/master/avidemux_plugins/ADM_videoEncoder/ffNvEncHEVC/ADM_ffNvEnc.cpp#L205 from 50000 to 400000; 50000 is too small :D
I use the slow preset for NVENC on ffmpeg, I would love to attempt to add some more options to the config

Also also I forked the repo and will pull request my changes when I'm happy with my plugin.
Title: Re: Multiple frame blending
Post by: eumagga0x2a on August 18, 2019, 05:01:25 PM
(edit: Misunderstanding removed)

You can't mix output of different compilers, but if the installed version was built with MinGW, I would expect the binary of your plugin to work. Obviously, it can't work with a VC++ build like the official release version.

Quote
I use the slow preset for NVENC on ffmpeg, I would love to attempt to add some more options to the config

Also also I forked the repo and will pull request my changes when I'm happy with my plugin.

Great to hear that!
Title: Re: Multiple frame blending
Post by: Quaternions on August 18, 2019, 10:54:46 PM
I've uploaded several clips, which I've put together in a playlist:
https://www.youtube.com/playlist?list=PLEQnPdZwMzNdguqJfLl9bUaRdPiQKK8CL

Do I submit the pull request to master?

Very cool, this is the first project that I have contributed to on GitHub!