| Summary: | When face tag scanning, digikam exhausts all memory on the computer. | | |
|---|---|---|---|
| Product: | [Applications] digikam | Reporter: | Kristian Karl <kristian.hermann.karl> |
| Component: | Faces-Engine | Assignee: | Digikam Developers <digikam-bugs-null> |
| Status: | CLOSED FIXED | | |
| Severity: | normal | CC: | caulier.gilles, maheshmhegade, rf-kde |
| Priority: | NOR | | |
| Version: | 3.4.0 | | |
| Target Milestone: | --- | | |
| Platform: | Compiled Sources | | |
| OS: | Linux | | |
| Latest Commit: | | Version Fixed In: | 7.2.0 |
| Sentry Crash Report: | | | |
Description

Kristian Karl 2013-08-07 06:57:47 UTC
Actually, it's better to comment out line core/utilities/facemanagement/facepipeline.cpp:594 instead. Commenting out extra/libkface/libkface/recognition-opencv-lbph/lbphfacemodel.cpp:139 crashes digikam when saving face tags.

```diff
diff --git a/utilities/facemanagement/facepipeline.cpp b/utilities/facemanagement/facepipeline.cpp
index e9ad434..19d7af7 100644
--- a/utilities/facemanagement/facepipeline.cpp
+++ b/utilities/facemanagement/facepipeline.cpp
@@ -591,7 +591,7 @@ void RecognitionWorker::process(FacePipelineExtendedPackage::Ptr package)
         images = imageRetriever.getThumbnails(package->filePath, package->databaseFaces.toDatabaseFaceList());
     }
-    package->recognitionResults = database.recognizeFaces(images);
+    //package->recognitionResults = database.recognizeFaces(images);
     package->processFlags |= FacePipelinePackage::ProcessedByRecognizer;
     emit processed(package);
```

Strange. Here with OpenCV 2.4.5 under OSX, I pass the scan without any problem (8 GB of RAM). Same under Linux with 16 GB of RAM (the collection is 250 GB of pictures). It takes a while, but I cannot see any memory leak or any dysfunction. I suspect a problem with your OpenCV...

Gilles Caulier

Mahesh, I would like to know your viewpoint here...

Gilles Caulier

lbphfacemodel.cpp:139 is a possible problem: the OpenCV API does not allow an efficient way of loading parameters here, yet the problem should appear only if a very large number of faces is loaded, not with 20. An OpenCV problem? As for commenting out facepipeline.cpp:594: skipping the recognition will certainly avoid any problem, but that's not the root of it.

Some more data: I uncommented the kDebug at extra/libkface/libkface/database/trainingdb.cpp:297, and it prints 10680 histograms. Is this a realistic number of histograms to store?
```
digikam(9826)/KFACE KFaceIface::TrainingDB::lbphFaceModel: Adding histogram 10676 identity 19 size 65536
digikam(9826)/KFACE KFaceIface::TrainingDB::lbphFaceModel: Adding histogram 10677 identity 19 size 65536
digikam(9826)/KFACE KFaceIface::TrainingDB::lbphFaceModel: Adding histogram 10678 identity 19 size 65536
digikam(9826)/KFACE KFaceIface::TrainingDB::lbphFaceModel: Adding histogram 10679 identity 19 size 65536
digikam(9826)/KFACE KFaceIface::TrainingDB::lbphFaceModel: Adding histogram 10680 identity 19 size 65536
```

The memory gets exhausted later, when converting the histograms to a [yaml] cv::String, at extra/libkface/libkface/recognition-opencv-lbph/lbphfacemodel.cpp:153. I compiled a debug version of the latest OpenCV, and the number of bytes allocated for the string ends up at some 1691131415. I got the value from the variable size_t len at opencv2/core/cvstd.hpp:565. I found that the memory gets exhausted in the while loop in opencv2/core/cvstd.hpp between lines 567 and 571:

```
#0  cv::String::String<std::_Deque_iterator<char, char&, char*> >() at /home/krikar/dev/opencv/modules/core/include/opencv2/core/cvstd.hpp:566
#1  icvClose() at /home/krikar/dev/opencv/modules/core/src/persistence.cpp:545
#2  cv::FileStorage::releaseAndGetString() at /home/krikar/dev/opencv/modules/core/src/persistence.cpp:5173
#3  KFaceIface::LBPHFaceModel::setHistograms() at /home/krikar/dev/kde/digikam/extra/libkface/libkface/recognition-opencv-lbph/lbphfacemodel.cpp:153
#4  KFaceIface::TrainingDB::lbphFaceModel() at /home/krikar/dev/kde/digikam/extra/libkface/libkface/database/trainingdb.cpp:303
#5  lbph() at /home/krikar/dev/kde/digikam/extra/libkface/libkface/recognition-opencv-lbph/opencvlbphfacerecognizer.cpp:67
#6  KFaceIface::OpenCVLBPHFaceRecognizer::recognize() at /home/krikar/dev/kde/digikam/extra/libkface/libkface/recognition-opencv-lbph/opencvlbphfacerecognizer.cpp:153
#7  KFaceIface::RecognitionDatabase::recognizeFaces() at /home/krikar/dev/kde/digikam/extra/libkface/libkface/recognitiondatabase.cpp:620
#8  KFaceIface::RecognitionDatabase::recognizeFaces() at /home/krikar/dev/kde/digikam/extra/libkface/libkface/recognitiondatabase.cpp:594
#9  Digikam::RecognitionWorker::process() at /home/krikar/dev/kde/digikam/core/utilities/facemanagement/facepipeline.cpp:594
#10 Digikam::RecognitionWorker::qt_static_metacall() at /home/krikar/dev/kde/digikam/build/core/digikam/facepipeline_p.moc:406
#11 QObject::event(QEvent*)() at /lib64/libQtCore.so.4
#12 QApplicationPrivate::notify_helper(QObject*, QEvent*)() at /lib64/libQtGui.so.4
#13 QApplication::notify(QObject*, QEvent*)() at /lib64/libQtGui.so.4
#14 KApplication::notify(QObject*, QEvent*)() at /lib64/libkdeui.so.5
#15 QCoreApplication::notifyInternal(QObject*, QEvent*)() at /lib64/libQtCore.so.4
#16 QCoreApplicationPrivate::sendPostedEvents(QObject*, int, QThreadData*)() at /lib64/libQtCore.so.4
#17 postEventSourceDispatch(_GSource*, int (*)(void*), void*)() at /lib64/libQtCore.so.4
#18 g_main_context_dispatch() at /lib64/libglib-2.0.so.0
#19 g_main_context_iterate.isra.22() at /lib64/libglib-2.0.so.0
#20 g_main_context_iteration() at /lib64/libglib-2.0.so.0
```

There is usually one histogram per learned (named + confirmed) face. I wouldn't expect that number for a collection with 20 jpgs as you say - or are there 10000 faces in the rest of your collection? The building of a yaml file is dirty, but there is absolutely no other way to feed data into OpenCV for the face recognizer. In the long run, we'll need to add a hard limit on the number of learned faces.

Very strange. There are only 2 or 3 faces in each of these 20 jpgs. I started digikam with a new database, "digikam --database-directory=/home/user/digikam/tmp/", and added a single collection, the 20 jpgs. To make sure: will the command line above start digikam on a clean slate? Or is there still data from other databases around, like some training db for the faces?
Yes, the recognition db is a separate file (it could be shared with other apps in the future), usually at ~/.kde/share/apps/libkface/database/recognition.db.

When I removed my /home/$USER/.kde/share/apps/libkface/database/recognition.db, digikam worked as expected. Maybe the recognition.db (size ~104 MB) was corrupted in some way. I closed this ticket.

Just as a note: I had the same problem. Removing recognition.db solved it, and rebuilding the face db was also much faster afterwards.

Not reproducible with 7.2.0