Bug 365354 - MYSQL : Application crash on scanning for faces in large picture set.
Summary: MYSQL : Application crash on scanning for faces in large picture set.
Status: RESOLVED FIXED
Alias: None
Product: digikam
Classification: Applications
Component: Database-Mysql
Version: 5.0.0
Platform: Compiled Sources, Linux
Importance: NOR crash
Target Milestone: ---
Assignee: Digikam Developers
URL:
Keywords:
Depends on:
Blocks:
 
Reported: 2016-07-11 12:45 UTC by Evert Vorster
Modified: 2020-08-12 08:17 UTC
CC: 5 users

See Also:
Latest Commit:
Version Fixed In: 7.0.0


Attachments
Console output when running sqlite (127.82 KB, text/plain)
2016-07-11 12:46 UTC, Evert Vorster

Description Evert Vorster 2016-07-11 12:45:32 UTC
I have a fairly large collection of photos: in this case, about 32,000 pictures taking up roughly 160 GB of disk space. Recently I tried to scan this collection for faces and discovered a couple of small bugs.

This specific bug seems to be related to the size of the dataset being scanned.
When I use MySQL as the backend, I get the error message "MySQL server has gone away"; when I use the internal SQLite, I get the attached error message at roughly the same point in the dataset, at about picture 8,000 (25%), with just over 1,000 faces recognized.

From what I have read online, this MySQL message means either that the connection has timed out (not likely, since we are writing to the database the whole time) or that a packet sent to the server is too big.

From the error message in the SQLite failure, it appears that the statement being sent is indeed quite large. I will now try a larger maximum packet size in MySQL and see whether that works as a workaround. However, since my picture collection is not that large compared to a professional photographer's, someone else will run into this issue sooner rather than later.

Reproducible: Always
Comment 1 Evert Vorster 2016-07-11 12:46:26 UTC
Created attachment 100005 [details]
Console output when running sqlite
Comment 2 caulier.gilles 2016-07-12 04:12:08 UTC
Do you see this message in the console trace with SQLite:

Error messages: "Unable to fetch row" "disk I/O error" 10 1 

This sounds like a hard disk problem or something similar. The message does not come from digiKam, but most likely from the SQL driver.

Gilles Caulier
Comment 3 caulier.gilles 2016-07-12 04:13:17 UTC
And I cannot confirm this dysfunction. The SQLite and MySQL backends work as expected here with a large set of images to scan for faces.

Gilles Caulier
Comment 4 Evert Vorster 2016-07-13 19:35:30 UTC
I'll try again once I am on different hardware, but I doubt this is hardware related: I use that hard drive heavily all the time, it seems to handle the load fine, and there are no error reports in the system messages or dmesg.

With max_allowed_packet set to 16 MB in MySQL, the server seems to go away after about 1,600 faces have been detected. When I set it to 64 MB, I was able to scan my entire collection, and about 4,000 faces were detected.
Comment 5 swatilodha27 2016-07-14 10:47:46 UTC
The default value of the max_allowed_packet variable is 16 MB. For larger packets, the server issues an ER_NET_PACKET_TOO_LARGE error and closes the connection; you may also see a "Lost connection to MySQL server during query" error.
(SELECT @@global.max_allowed_packet; will show the current value.)

You could start the MySQL server with mysqld --max_allowed_packet=128M, or change the following option in my.ini or ~/.my.cnf:
[mysqld]
max_allowed_packet=128M

It is safe to increase the value of this variable because the extra memory is allocated only when needed.
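
For completeness, here is a minimal, untested Qt/C++ sketch (not digiKam code; the connection parameters below are made up) of how an application could read the effective limit over its own connection:

#include <QCoreApplication>
#include <QSqlDatabase>
#include <QSqlQuery>
#include <QVariant>
#include <QDebug>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    // Hypothetical connection parameters; digiKam manages its own.
    QSqlDatabase db = QSqlDatabase::addDatabase("QMYSQL");
    db.setHostName("localhost");
    db.setDatabaseName("digikam");
    db.setUserName("digikam");
    db.setPassword("password");
    if (!db.open())
        return 1;

    // max_allowed_packet is reported in bytes.
    QSqlQuery query(db);
    if (query.exec("SELECT @@global.max_allowed_packet") && query.next())
        qDebug() << "max_allowed_packet:" << query.value(0).toLongLong() << "bytes";

    return 0;
}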
Comment 6 swatilodha27 2016-07-14 11:15:27 UTC
I'm unable to reproduce this with MySQL or SQLite; "Scan collection for faces" works fine here.

I think the issue occurs because you have a fairly large collection and the max_allowed_packet variable on your MySQL server is not large enough for it.
Comment 7 Richard Mortimer 2016-07-14 11:20:02 UTC
I'm concerned that increasing the maximum packet size only hides the problem rather than curing the real problem. A similar error is also reported for SQLite, so that would need an equivalent "fix".

Has anyone worked out the underlying cause of the problem? I think it would be hasty to apply a fix that papers over the issue, because it will become a harder problem to solve later, when even bigger datasets are involved.

The fact that it occurs later with bigger buffer sizes makes me think the code is trying to add everything in one big statement rather than adding in smaller chunks. I'm sure there is a performance benefit in adding a number of items at a time, but it would be good to limit that to a reasonable value (this will require a bit of experimentation) to ensure it does not break size limits; a rough sketch follows.
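
To make the idea concrete, chunked insertion with Qt's SQL batch API could look roughly like the following untested sketch (the table and column names are invented and do not match digiKam's actual schema):

#include <QSqlDatabase>
#include <QSqlQuery>
#include <QVariantList>

// Insert face regions in fixed-size chunks so that no single statement
// (and hence no single MySQL packet) grows without bound.
static const int CHUNK_SIZE = 500; // a reasonable value, to be tuned by experiment

bool insertFacesChunked(QSqlDatabase& db, const QVariantList& imageIds,
                        const QVariantList& regions)
{
    QSqlQuery query(db);
    query.prepare("INSERT INTO ImageFaces (imageid, region) VALUES (?, ?)");

    for (int start = 0; start < imageIds.size(); start += CHUNK_SIZE)
    {
        const int count = qMin(CHUNK_SIZE, int(imageIds.size()) - start);

        // Bind one QVariantList per placeholder; execBatch() then runs the
        // prepared statement once per row in the lists, keeping each round
        // trip bounded by CHUNK_SIZE rows.
        query.bindValue(0, imageIds.mid(start, count));
        query.bindValue(1, regions.mid(start, count));

        if (!query.execBatch())
            return false;
    }
    return true;
}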
Comment 9 Evert Vorster 2016-07-14 11:39:20 UTC
I can only agree here. 

A professional photographer would conceivably have a much larger collection than mine.
The warning message tells you that something is being done improperly, and the workaround merely makes it work; you also have to run an external MySQL server to be able to change the variables.

I also have the impression that the chunk size is too large. This could be because more than 1,600 of my images carry the "Unknown" tag as the scan proceeds. Maybe we should cap a tag at 1,000 images and then start an "Unknown (1)" tag to stop the tag selection getting too big, or fetch the images being updated in chunks of 1,000, or only fetch what would be displayed on screen plus about two screens either way (no more than 100 thumbnails); a rough sketch of the paging idea follows below.
I have noticed slowdowns in the UI when working on this (and a whole host of other annoyances).
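
As an illustration of that paging idea, here is an untested sketch (invented table and column names, not digiKam's schema) that walks a tag's images in pages of 1,000 using keyset pagination, so no single result set or UI update has to cover the whole tag:

#include <QSqlDatabase>
#include <QSqlQuery>
#include <QVariant>

// Process the images carrying a given tag in pages of 1000, keeping
// each query result (and each UI refresh) small and bounded.
void processTagInPages(QSqlDatabase& db, int tagId)
{
    const int pageSize = 1000;
    qlonglong lastId   = 0;

    for (;;)
    {
        QSqlQuery query(db);
        query.prepare("SELECT imageid FROM ImageTags "
                      "WHERE tagid = ? AND imageid > ? "
                      "ORDER BY imageid LIMIT 1000");
        query.addBindValue(tagId);
        query.addBindValue(lastId);

        if (!query.exec())
            return;

        int rows = 0;
        while (query.next())
        {
            lastId = query.value(0).toLongLong();
            // ... update this image's face tag / thumbnail here ...
            ++rows;
        }

        if (rows < pageSize)
            break; // last page reached
    }
}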
Comment 10 caulier.gilles 2016-07-19 06:38:19 UTC
The question is: where is the MySQL database request chunk size defined? I don't remember any place in the digiKam database interface where something like that is hardcoded.

Or perhaps it's in the private implementation of the Qt SQL driver?

Marcel, do you have an idea on this point?
Comment 11 caulier.gilles 2016-08-06 13:48:26 UTC
Evert,

See the solution found with the MySQL settings to prevent crashes during long face scans (and not only for face management) in bug #329091.

Please adjust the MySQL settings as explained there and try again.

Gilles Caulier
Comment 12 caulier.gilles 2016-11-25 20:11:11 UTC
What about this issue when using the digiKam AppImage bundle 5.4.0 pre-release available at this URL:

https://drive.google.com/drive/folders/0BzeiVr-byqt5Y0tIRWVWelRJenM

Gilles Caulier
Comment 13 Maik Qualmann 2019-12-14 15:29:58 UTC
This crash was fixed in bug 375317.

Maik