Version: 1.5.0 (using KDE 4.5.1)
When I open a folder with very large files (>600 MB) and digiKam starts producing the thumbnails, the program eats up all memory resources in no time, forcing everything else to swap and thus rendering the system unusable. Ultimately the thumbnails are generated, but it takes forever and, as I said, the system is totally unusable in the meantime. I am no programmer, but I think there is something wrong with the thumbnail generation routine. Thanks!
Steps to Reproduce:
Put large files in a folder (>600 MB; mine are actually 800 MB and 1 GB) and open that folder so the program thumbnails them.
Digikam eats all memory resources, rendering the system unusable.
digiKam should manage the available memory efficiently, neither causing everything else to hang nor choking on the data itself (as I said above, the process takes forever).
The same happens with current SVN.
Can you run digiKam under valgrind and post the console trace here?
The command line to use is this one:
valgrind --tool=memcheck --leak-check=full --error-limit=no digikam
As we don't support tiles, the only solution would be to refuse thumbnailing such a large image. Assuming the file is compressed and it's not a JPG or PGF, you will need a multiple of the file size in memory to load the image.
OK, so there is the answer, thanks; it is not a memory leak then. Why did you decide against tiles? The files in question are indeed compressed TIFFs, so thumbnailing one of them needs multiple GB of memory. I think that in such situations it is critical to disable thumbnailing, although this is hard to get right: what about somebody who has 1 GB of RAM (I have 4) and files of, say, 250 MB? That person will be in the same trouble as I am.
So if you do not plan to use tiles, would it be possible to use the embedded thumbnail (the one GIMP generates, for example)? I haven't tried in a long time, but digiKam has always had problems with embedded previews in TIFF files.
digiKam 1.6.0 is out:
Please update and check whether this entry is still valid.
Thanks in advance
I tried the SVN version yesterday and nothing changed: I threw the biggest files I have at it (two 1 GB 16-bit files) and it took 20+ minutes to render one thumbnail, with the system totally unusable; digiKam itself took up to 2.8 GB of memory, and the whole process consumed all my swap space (4 GB) and all my RAM (4 GB)...
There is a class KMemoryInfo which we can use to provide information about system memory (Linux and Windows at the moment):
It was not taken into kdelibs, so we will need to copy the source.
With this code we could check, before loading a huge file, whether enough memory is available (and, by the way, also adjust our cache size, which has become small for the megapixel images of today's cameras).
SVN commit 1217348 by mwiesweg:
Check available memory before attempting large allocations.
M +2 -1 NEWS
M +35 -0 libs/dimg/loaders/dimgloader.cpp
M +5 -3 libs/dimg/loaders/dimgloader.h
WebSVN link: http://websvn.kde.org/?view=rev&revision=1217348
SVN commit 1217349 by mwiesweg:
Missing commit part
M +3 -7 dimg.cpp
M +0 -1 imagehistory/dimagehistory.cpp
WebSVN link: http://websvn.kde.org/?view=rev&revision=1217349