I have a fairly large collection of photos: 32,000 pictures, taking up about 160 GB of disk space. Recently I tried to scan this database for faces, and discovered a couple of small bugs.
This specific bug seems to be related to the size of the dataset that I am scanning through.
When I use MySQL as a backend, I get the error message "MySQL server has gone away"; when I use the internal SQLite, I get the attached error message at roughly the same point in the dataset, at about picture 8,000 (25%), with just over 1,000 faces recognized.
From what I read online, this MySQL message means either that the database has timed out (not likely, since we are writing to it the whole time), or that the packet provided is too big.
From the error message in the SQLite failure, it does appear that the message being sent is quite large. I will now try a larger message size setting in MySQL and see if I can use that as a workaround. However, since my picture collection is not THAT massive compared to a professional photographer's, someone else will run into this issue sooner or later.
Created attachment 100005 [details]
Console output when running SQLite
In the console trace with SQLite, do you see this message:
Error messages: "Unable to fetch row" "disk I/O error" 10 1
It sounds like a hard disk problem or something similar. This message does not come from digiKam, but most likely from the SQL driver.
And I cannot confirm this dysfunction. The SQLite and MySQL backends work fine, as expected, with a large set of images to scan for faces.
I'll try again once I am on different hardware, but I doubt that this is hardware related: I use that hard drive heavily all the time and it handles the load fine, with no error reports in the system messages or dmesg.
With max_allowed_packet set to 16 MB in MySQL, the database seems to go away after about 1,600 faces have been detected. When I set it to 64 MB, I was able to scan through my entire collection, and it detected about 4,000 faces.
The default max_allowed_packet value is 16 MB. For larger packets, the server issues an ER_NET_PACKET_TOO_LARGE error and closes the connection. You may also get a "Lost connection to MySQL server during query" error.
(SELECT @@global.max_allowed_packet; will show the current value.)
You could start the MySQL server with "mysqld --max_allowed_packet=128M", or set max_allowed_packet=128M in my.ini or ~/.my.cnf.
It is safe to increase the value of this variable because the extra memory is allocated only when needed.
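For reference, the configuration-file form of the setting described above would look something like this (128M matches the command-line example; the exact value to use is a judgment call for your collection size):

```ini
# my.ini (Windows) or ~/.my.cnf (Linux/macOS)
# The option must be in the [mysqld] group so the server reads it.
[mysqld]
max_allowed_packet=128M
```

After restarting the server, SELECT @@global.max_allowed_packet; should report 134217728 (128 * 1024 * 1024 bytes).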
I'm unable to reproduce this with MySQL or SQLite. "Scan collection for faces" works fine here.
I think the issue occurs because you have a fairly large collection and the max_allowed_packet variable (on the MySQL server) is too small for it.
I'm concerned that increasing the max packet size is only hiding the problem and not curing the real problem. A similar error is also reported for SQLite so that would need an equivalent "fix".
Has anyone worked out what the underlying cause of the problem is? I think it would be hasty to apply a fix that papers over the issue because it will become a harder problem to solve later when even bigger datasets are involved.
The fact that it occurs later with bigger buffer sizes makes me think that the code is trying to add everything in one big statement rather than adding in smaller chunks. I'm sure there is a performance benefit in adding a number of items at a time, but it would be good to limit that to a reasonable value (which will require a bit of experimentation) to ensure that it does not break size limits, etc.
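As a sketch of the chunking idea suggested above (this is not digiKam's actual code; the table name, column names, and chunk size here are illustrative assumptions), inserting rows in fixed-size batches keeps any single statement, and any single transaction, bounded regardless of how large the collection grows:

```python
import sqlite3

def insert_in_chunks(conn, rows, chunk_size=1000):
    """Insert (image_id, tag_id) pairs in batches of chunk_size
    so that no single statement or transaction grows with the dataset."""
    cur = conn.cursor()
    for start in range(0, len(rows), chunk_size):
        chunk = rows[start:start + chunk_size]
        cur.executemany(
            "INSERT INTO image_tags (image_id, tag_id) VALUES (?, ?)",
            chunk,
        )
        conn.commit()  # commit once per chunk, not once for everything

# Demo with an in-memory database and a hypothetical image_tags table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE image_tags (image_id INTEGER, tag_id INTEGER)")
rows = [(i, 1) for i in range(4500)]
insert_in_chunks(conn, rows, chunk_size=1000)
count = conn.execute("SELECT COUNT(*) FROM image_tags").fetchone()[0]
print(count)  # 4500
```

The same pattern would apply to a MySQL backend: with batches capped at a fixed size, the packet sent per statement stays well below max_allowed_packet no matter how many faces are detected.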
I can only agree here.
A professional photographer would conceivably have a much larger collection than mine.
The warning message informs you that something is being done improperly; the workaround makes it work, but you have to use an external MySQL server to be able to change the variables.
I also have the impression that the chunk size is too large. This could be because I have 1,600+ images with the "Unknown" tag as it scans through. Maybe we should set a limit of 1,000 images per tag and then start an "Unknown (1)" tag to stop the tag selection getting too big; or fetch the images being updated in chunks of 1,000; or fetch just what would be displayed on screen plus about two screens either way (no more than 100 thumbnail images).
I have noticed slowdowns in the UI when working on this (and a whole host of other annoyances).
The question is where the MySQL request chunk size is defined. I don't remember any place in the digiKam database interface where something like that is hardcoded.
Or perhaps it's in the private implementation of the Qt SQL driver?
Marcel, do you have an idea for this point ?
See the solution found with the MySQL settings to prevent the crash during long face scans (and not only with face management) in bug #329091.
Please adjust the MySQL settings as explained in this file and try again.
What about this file when using the digiKam AppImage bundle 5.4.0 pre-release, given at this URL:
This crash was fixed in bug 375317.