Replies: 3 comments
-
Try passing the env variable
-
Nearly all FFmpeg decoders that support multi-threading turn it on by default and detect the correct number of threads. That environment variable was added in 2008, back when decoder threading was first being added and had to be opted into. Most image filters include slice-based multi-threading, whether in FFmpeg libavfilter, in frei0r plugins (as used by MLT), or in most MLT-native filters. Sometimes, however, there are some that need to be opted into. It really is a case of "it depends," which pushes some burden onto the app developer. Also, a lot of these (most?) lack SIMD processor instruction acceleration (parallelized data instructions). There is also frame threading in MLT through the consumer (see the sketch below).

This was moved out of Issues because I do not accept general, never-ending things like "make everything faster" or "fix all of the bugs" as bugs. This is an enhancement request, but my personal interest right now is not in improving CPU-based speed because most CPU-based filters in MLT are very limited: 8-bit and not color-managed. Another approach is to lean on FFmpeg libavfilter-based effects, many of which do support 10-bit and have decent performance. Finding those is a pain, though. For the next version of Shotcut I added the keyword "#10bit" to those filter descriptions so you can search for them. This can be combined with the improved support for …
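For the frame threading mentioned above, here is a minimal sketch of turning it on from the melt command line, assuming an avformat consumer and a hypothetical project file name. As I understand it, a negative real_time value requests that many rendering threads without frame dropping:

```sh
# Hypothetical project/output names. real_time=-8 requests 8 parallel
# frame-rendering threads with no frame dropping; 0 or 1 keep rendering
# on a single thread, and positive values allow frame dropping.
melt project.mlt -consumer avformat:out.mp4 real_time=-8
```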
-
FYI, the "threads" property on the input file is redundant with the … I know Kdenlive uses …
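For illustration, a hedged sketch of where that property lands when set on the melt command line (clip name hypothetical): a property written after a producer is set on that producer, so threads here only affects the decoder, which modern FFmpeg decoders already auto-thread by default:

```sh
# threads=8 after the clip is a producer (decoder) property; with FFmpeg
# decoders auto-detecting the thread count, it is largely redundant.
melt clip.mp4 threads=8 -consumer avformat:out.mp4
```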
-
I am currently rendering a long video that uses a lot of visual effects such as drop shadows. I created it in Kdenlive, but I also tried the command line (`melt`) with equivalent results. I am rendering on a PC with an 8-core CPU but no GPU. According to Task Manager and Process Hacker, `melt` is only using 12% CPU, which indicates that it uses only one of my 8 cores, even though the input file specifies `threads="8"`. It should be possible to speed up the process a lot by parallelizing the rendering of visual effects in each frame.

Note that I'm not talking about the video encoding done by ffmpeg. I fully understand that it needs to process the frames in sequential order to encode the video file. I'm talking about the visual effects rendering `melt` does before it sends the rendered frames to ffmpeg.
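To illustrate that distinction, a hedged sketch (file names hypothetical): on the avformat consumer, threads controls only the libavcodec encoder, while parallel rendering of the effects before encoding is requested separately through the consumer's real_time property, as described in the replies above:

```sh
# Encoder threads only: filters/effects still render on one thread.
melt project.mlt -consumer avformat:out.mp4 threads=8

# Also parallelize effect rendering: real_time=-8 requests 8 render
# threads without dropping frames, on top of the encoder's own threads.
melt project.mlt -consumer avformat:out.mp4 threads=8 real_time=-8
```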