When using standalone G'MIC from the command line, the preview (and final processing) of colorization works much faster than when I work on the same image in Krita (integrated G'MIC). For standalone G'MIC I'm using the command:

gmic input.png -x_colorize 1,1024,1 -split c,2 -o[-1] output.png

input.png is here: http://sta.sh/03evnbtvq9p

I'm aware that Krita can do a lot more processing around the actual colorization. However, I thought it would be a good idea to let you guys know about this just in case.

Reproducible: Always

Steps to Reproduce:
1. Colorize the image in Krita (G'MIC interactive colorization)
2. Colorize the same image in standalone G'MIC
3. Compare performance and times

Actual Results:
On my hardware, colorizing an image with G'MIC integrated in Krita runs 2-3 times slower.

Expected Results:
Not sure if it's possible to achieve the same time. Please consider this only as an info.

CPU: Intel i5-3210M @ 2.50 GHz
Memory: 8 GB
System: Windows 7 64-bit
GPU: NVIDIA GeForce GT 630M
Tablet: Wacom Bamboo
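For what it's worth, one way to get a comparable wall-clock figure for the standalone run is shown below (a sketch only: Measure-Command is a standard PowerShell cmdlet, and since -x_colorize is interactive, the measured time includes time spent placing color strokes):

    # Time the standalone G'MIC invocation from PowerShell; the result
    # includes interactive time spent inside the -x_colorize window.
    Measure-Command {
      gmic input.png -x_colorize 1,1024,1 -split c,2 -o[-1] output.png
    }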
Hi,

Thanks for your report. I think that this is related to https://bugs.kde.org/show_bug.cgi?id=359030, so I'll add this bug report to that one.

*** This bug has been marked as a duplicate of bug 359030 ***
As a side note, let me say that the main colorization algorithm is not multi-threaded, so the difference cannot come from OpenMP being enabled in the CLI version of G'MIC. I suspect it has to do with the different compilers used (the G'MIC CLI is built with g++, while Krita is built with MSVC). I have observed the same kind of problem when compiling the G'MIC CLI with MSVC: compilation is much slower, and the resulting binary also executes filters more slowly.
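One way to isolate the compiler effect, independent of G'MIC itself, is a sketch like the following (bench.cpp is hypothetical, standing in for any compute-heavy per-pixel float loop; the g++ -O3 and MSVC cl /O2 flags are the usual optimized-build options for each toolchain):

    # Build the same micro-benchmark with both compilers and compare
    # wall-clock times of the two binaries (bench.cpp is a hypothetical
    # stand-in for a tight pixel-processing loop, not part of G'MIC):
    g++ -O3 -o bench_gcc.exe bench.cpp
    cl /O2 /Febench_msvc.exe bench.cpp
    .\bench_gcc.exe
    .\bench_msvc.exe

If the g++ binary is consistently faster on the same loop, that would support the compiler hypothesis rather than anything specific to the Krita integration.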
Okay, that explains the issue.