Summary: | MYSQL : Application crash on scanning for faces in large picture set. | |
---|---|---|---
Product: | [Applications] digikam | Reporter: | Evert Vorster <evorster>
Component: | Database-Mysql | Assignee: | Digikam Developers <digikam-bugs-null>
Status: | RESOLVED FIXED | |
Severity: | crash | CC: | caulier.gilles, marcel.wiesweg, metzpinguin, richm+kde, swatilodha27
Priority: | NOR | |
Version: | 5.0.0 | |
Target Milestone: | --- | |
Platform: | Compiled Sources | |
OS: | Linux | |
Latest Commit: | | Version Fixed In: | 7.0.0
Sentry Crash Report: | | |
Attachments: | Console output when running sqlite | |
Description
Evert Vorster
2016-07-11 12:45:32 UTC
Created attachment 100005 [details]
Console output when running sqlite
Do you see this message in the console trace with SQLite: Error messages: "Unable to fetch row" "disk I/O error" 10 1? It sounds like a hard disk problem or something like that. This message does not come from digiKam, but certainly from the SQL driver.

Gilles Caulier

And I cannot confirm this dysfunction. The SQLite and MySQL backends work fine as expected with a large set of images to scan for faces.

Gilles Caulier

I'll try again once I am on some different hardware, but I doubt that this is hardware related, as I use that hard drive heavily all the time and it seems to handle it fine, with no error reports in the system messages or dmesg. If I have max_allowed_packet set to 16MB in MySQL, the database seems to go away after 1600 faces have been detected. When I set it to 64MB, I was able to scan through my entire collection and it detected about 4000 faces.

The default max_allowed_packet value is 16MB. For larger packets, the server issues an ER_NET_PACKET_TOO_LARGE error and closes the connection. You may also get a "Lost connection to MySQL server during query" error. (SELECT @@global.max_allowed_packet; will show the current value.) You could start the MySQL server with "mysqld --max_allowed_packet=128M", or change the following option in my.ini or ~/.my.cnf:

[mysqld]
max_allowed_packet=128M

It is safe to increase the value of this variable because the extra memory is allocated only when needed. I'm unable to reproduce this with MySQL or SQLite; "Scan collection for faces" works fine here. I think the issue exists because you have a fairly large collection and the max_allowed_packet variable in your MySQL server is not large enough for it.

I'm concerned that increasing the max packet size is only hiding the problem and not curing the real problem. A similar error is also reported for SQLite, so that would need an equivalent "fix". Has anyone worked out what the underlying cause of the problem is? I think it would be hasty to apply a fix that papers over the issue, because it will become a harder problem to solve later when even bigger datasets are involved. The fact that it occurs later with bigger buffer sizes makes me think that the code is trying to add everything in one big statement rather than adding it in smaller chunks. I'm sure there will be a performance benefit in adding a number of items at a time, but it would be good to limit that to a reasonable value (this will require a bit of experimentation) to ensure that it does not break size limits etc.

I can only agree here. A professional photographer would conceivably have a much larger collection than mine.
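As a concrete illustration of the diagnostic suggested above, here is a minimal Qt/C++ sketch that reads the server-side limit and warns when it looks too small for a long face-scan run. The connection parameters and the 64MB threshold are illustrative assumptions, not digiKam's actual code.

```cpp
// Minimal sketch (not digiKam code): connect to a MySQL backend and check
// whether max_allowed_packet is large enough before a long face-scan run.
// Host, schema, and credentials below are placeholders.
#include <QCoreApplication>
#include <QSqlDatabase>
#include <QSqlError>
#include <QSqlQuery>
#include <QVariant>
#include <QDebug>

int main(int argc, char** argv)
{
    QCoreApplication app(argc, argv);

    QSqlDatabase db = QSqlDatabase::addDatabase(QStringLiteral("QMYSQL"));
    db.setHostName(QStringLiteral("localhost"));    // placeholder
    db.setDatabaseName(QStringLiteral("digikam"));  // placeholder
    db.setUserName(QStringLiteral("digikam"));      // placeholder
    db.setPassword(QStringLiteral("secret"));       // placeholder

    if (!db.open())
    {
        qWarning() << "Cannot connect:" << db.lastError().text();
        return 1;
    }

    QSqlQuery q(db);

    if (q.exec(QStringLiteral("SELECT @@global.max_allowed_packet")) && q.next())
    {
        const qlonglong bytes = q.value(0).toLongLong();
        qDebug() << "max_allowed_packet =" << bytes << "bytes";

        // 64MB is the value that let the reporter finish a full scan;
        // treat anything smaller as suspect.
        if (bytes < qlonglong(64) * 1024 * 1024)
        {
            qWarning() << "max_allowed_packet below 64MB; large statements may fail with ER_NET_PACKET_TOO_LARGE.";
        }
    }

    return 0;
}
```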
The warning message is there to inform you that something is done improperly, and the workaround is to make it work, but you have to use an external MySQL server to be able to change the variables.

I also have the impression that the chunk size is too large. This could be because I have 1600+ images with the "Unknown" tag as it scans through. Maybe we should set a limit of 1000 images per tag and then start an "Unknown(1)" tag to stop the tag selection from getting too big, or fetch the images being updated in chunks of 1000, or just what would be displayed on the screen plus about two screens either way (no more than 100 thumbnail images). I have noticed slowdowns in the UI when working on this (and a whole host of other annoyances).

The question is where the MySQL database request chunk size is defined. I don't remember a place in the digiKam database interface where something like that is hardcoded. Or perhaps it's in the private implementation of the Qt SQL driver? Marcel, do you have an idea on this point?

Evert, see the solution found with the MySQL settings to prevent the crash during long face scanning (and not only with faces management) in bug #329091. Please adjust the MySQL settings as explained there and try again.

Gilles Caulier

What about this issue using the digiKam AppImage bundle 5.4.0 pre-release given at this URL: https://drive.google.com/drive/folders/0BzeiVr-byqt5Y0tIRWVWelRJenM

Gilles Caulier

This crash was fixed in bug 375317.

Maik
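For reference, the chunked-insert idea raised in the thread can be illustrated with a short sketch. This is hypothetical code, not digiKam's implementation or the fix from bug 375317; the ImageTags(imageid, tagid) layout, the helper name, and the 1000-row cap are assumptions for illustration.

```cpp
// Hypothetical sketch of bounded batch insertion: write face/tag assignments
// in fixed-size chunks so that no single statement can grow with the size of
// the collection and exceed max_allowed_packet.
#include <QList>
#include <QSqlDatabase>
#include <QSqlQuery>
#include <QString>
#include <QStringList>
#include <QtGlobal>

static const int kChunkSize = 1000; // rows per INSERT; tune experimentally

bool insertFaceTagsChunked(QSqlDatabase& db, const QList<qlonglong>& imageIds, int tagId)
{
    for (int offset = 0; offset < imageIds.size(); offset += kChunkSize)
    {
        const int count = qMin(kChunkSize, int(imageIds.size()) - offset);

        // Build one multi-row INSERT whose size is bounded by kChunkSize,
        // so the statement stays well under the packet limit no matter how
        // many faces were detected. Ids are numeric, so no quoting issues.
        QStringList rows;

        for (int i = 0; i < count; ++i)
        {
            rows << QStringLiteral("(%1, %2)").arg(imageIds.at(offset + i)).arg(tagId);
        }

        QSqlQuery q(db);

        if (!q.exec(QStringLiteral("INSERT INTO ImageTags (imageid, tagid) VALUES ")
                    + rows.join(QStringLiteral(", "))))
        {
            return false;
        }
    }

    return true;
}
```

Whether such a cap belongs in digiKam's database interface or in the Qt SQL driver is exactly the open question raised above; the sketch only shows why a bounded chunk size sidesteps the packet limit.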