Version: (using Devel)
OS: Linux
Installed from: Compiled sources

The information in the digiKam database may represent a lot of hard work - tagging, commenting and rating photos. There is always the possibility that some problem, not necessarily within digiKam - hardware, software elsewhere, inadvertent user action - can corrupt the database and wipe out all of this work. For example, copying the database file (e.g. backup/restore, moving home directories) without the appropriate options to handle a sparse file can make it unusable, and bug 222774 describes tags being lost for some unknown reason.

There needs to be an option to dump at least the user-entered data into some readable and archivable format - plain text or XML - and to restore it again. (Image metadata is not so important, because it can always be re-read from the images.) There also needs to be an option to reference images by pathname/filename rather than by their hash, so that changes in the album root or relative path can be accommodated if necessary.
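For illustration only, here is a minimal sketch of what such an export could look like against the SQLite backend. The table and column names (Albums, Images, ImageTags, Tags, ImageInformation, ImageComments) are assumptions based on the digiKam 4 schema and may differ between versions; verify them against your own digikam4.db before relying on anything like this.

```python
#!/usr/bin/env python3
# Sketch: dump user-entered digiKam data (tags, ratings, comments) to XML,
# keyed by album-relative path instead of by image id or hash.
# Table/column names are assumptions from the digiKam 4 SQLite schema.
import sqlite3
import xml.etree.ElementTree as ET

db = sqlite3.connect("digikam4.db")
root = ET.Element("digikam-export")

query = """
    SELECT a.relativePath, i.name, i.id
    FROM Images i JOIN Albums a ON i.album = a.id
"""
for relpath, name, image_id in db.execute(query):
    node = ET.SubElement(root, "image", path=f"{relpath}/{name}")

    row = db.execute(
        "SELECT rating FROM ImageInformation WHERE imageid = ?", (image_id,)
    ).fetchone()
    if row and row[0] is not None and row[0] >= 0:
        node.set("rating", str(row[0]))

    for (tag,) in db.execute(
        "SELECT t.name FROM ImageTags it JOIN Tags t ON it.tagid = t.id "
        "WHERE it.imageid = ?", (image_id,)
    ):
        ET.SubElement(node, "tag").text = tag

    for (comment,) in db.execute(
        "SELECT comment FROM ImageComments WHERE imageid = ?", (image_id,)
    ):
        ET.SubElement(node, "comment").text = comment

ET.ElementTree(root).write("digikam-export.xml", encoding="utf-8",
                           xml_declaration=True)
```

Because the output is keyed by path, it could be edited or post-processed by a script if album roots or relative paths change before a restore.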
This will be the next main component to do in digiKam for the future... But first, we must port digiKam from SQLite to MySQL. This is in progress and very well advanced in a dedicated devel branch. Gilles Caulier
Do you know of any defined XML format for such purposes?
Why not use a binary format based on the DB backend (typically MySQL)? Gilles
Preferably not a binary format - the whole point of having a plain text or XML file is that, if necessary, it can be manipulated by editing or via a script in order to fix up any incompatibilities.
*** Bug 187055 has been marked as a duplicate of this bug. ***
It is good news that you plan to add such a feature to digiKam. Thank you, and keep up the great work :)
Hi, I am a recent user of digiKam, and I have been thinking about backups for a while. I would like to share my ideas about a couple of use cases, and other things that could be done related to backups. Is this the right place? Rafa Rios
Yes, it is of course... Gilles Caulier
Well, here I go (you were warned this is a long post, and English is not my mother language :) ).

Use cases

Case (1): In my case, I have 2 USB HDDs, one at home (HDD_A) and the other at my parents' home (HDD_B). HDD_B is the backup. On the HDDs I have the pictures and digiKam's databases. I always download pictures from the camera to HDD_A. I would like to have them synchronized; for now, from time to time, I connect both HDDs to my laptop and rsync from HDD_A to HDD_B. It would be nice if I could export the pictures and the information about the latest load of pictures in digiKam, for example packaging the picture files and the DB, and then import them into HDD_B. Summarizing: it would be nice to have a way of keeping different storages in sync along with their respective DBs.

Case (2): A friend of mine doesn't use digiKam; he uses a tree structure on disk and backup copies on DVDs. He makes the DVD copies with the files overlapping from DVD to DVD, so a picture is always on at least 2 or 3 (I can't remember now) discs. This gave me the idea of digiKam generating k3b project files, so a DVD could hold the pictures plus a "mini" database with the information about them. This way, you could retrieve the part of your collection that is corrupted, or use all the DVDs to restore the complete collection. I think that for this, digiKam could track where the backup(s) of a particular picture are and when they were made. digiKam could also alert the user to make a new backup following a backup policy. Summarizing: "export" pictures + DB to discs, and keep track of where and when the pictures and info were backed up.

From this, case (3) came to my mind: "pictures in historic mode". I know a professional photographer who has a really large number of pictures. You know, tons of shots to then choose only 4 or 5, but he would like to keep all the pictures. As you can guess, he uses tons of HDD space, but he really only needs the last month of pictures on his working HDD; the rest are best kept offline on other storage (e.g. NAS, USB HDDs, etc.). It would be nice to have digiKam manage this case. You could put part of your pictures in "historic mode"; pictures in this mode are not accessible to digiKam, but digiKam could keep the thumbnails and other info, so you could browse the collection even though part of it is not available to digiKam at that moment. In some ways cases 2 and 3 are similar, but "historic" is not a backup, only a part of the collection that is parked.

Another idea also came from this, but I think it is off-topic for this thread; it is related to the work done by the person in case 3. It would be nice if I could "link" a picture from one folder to another. I am new to digiKam, so I could be missing something. Suppose I have a folder "trip to Paris" with photos, and suppose I have a folder "panoramic" with panoramic views. I wish I could have the photos in "trip to Paris" and, in "panoramic", a "link" to the actual file that is in "trip to Paris", without having to duplicate the photos in both folders.

Sorry for the long post and my bad English. Rafa Rios
Rafael, thanks for your input. I will cover your points in reverse order:

4) We are currently working on a GSoC project to implement non-destructive editing. In this course, I plan to make it possible to record relations between images. From your panorama created with hugin, you could link to all source images. Work in progress this summer.

3) You can use CDs/DVDs as a collection. It should definitely work if you give the CD a unique label when burning. When you insert the DVD, it should be recognized and appear as a collection in the albums tab. The information remains in the database. Making offline collections and their thumbnails accessible while offline is an open task.

2) Sounds like an idea for a kipi plugin. I think there was a CD burning plugin once. Identifying which files were added can be based on the modification date or the creation date; both have their pros and cons. We don't store the date a particular picture was added in the database.

1) You need to identify which files were added or changed, see above. This can be done by a simple search (see the sketch below). All the rest sounds like an idea for scripting. There is also someone working on bringing scripting to digiKam this summer, let's wait for the results.
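As a rough illustration of points 1) and 2), the snippet below lists files changed since a given cutoff using modification times only. The collection path and cutoff date are placeholders, and as noted above mtime and ctime each have drawbacks, so treat this as a sketch rather than a complete solution.

```python
#!/usr/bin/env python3
# Sketch: list picture files modified since a given cutoff, e.g. the date
# of the last backup or sync. Path and cutoff are placeholders.
import os
from datetime import datetime

COLLECTION = "/mnt/HDD_A/Pictures"          # placeholder collection path
CUTOFF = datetime(2009, 6, 1).timestamp()   # placeholder: time of last backup

changed = []
for dirpath, _dirnames, filenames in os.walk(COLLECTION):
    for name in filenames:
        path = os.path.join(dirpath, name)
        if os.path.getmtime(path) > CUTOFF:
            changed.append(path)

for path in sorted(changed):
    print(path)
```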
(In reply to comment #4)
> Preferably not a binary format - the whole point of having a plain text or
> XML file is that, if necessary, it can be manipulated by editing or via a
> script in order to fix up any incompatibilities.

Once the migration to MySQL has been completed, wouldn't it be possible to just use mysqldump to export the database? It basically creates an SQL script that can be used to restore the data to another MySQL DB.
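For reference, a minimal sketch of driving mysqldump from a script; the database name, MySQL user and dump file name are placeholders, and the resulting SQL script can be replayed with the mysql client.

```python
#!/usr/bin/env python3
# Sketch: back up a digiKam MySQL database with mysqldump and restore it
# with the mysql client. Database name, user and file name are placeholders.
import subprocess

DB = "digikam"            # placeholder database name
USER = "digikam"          # placeholder MySQL user (prompts for password)
DUMP = "digikam-backup.sql"

# Export: mysqldump writes a plain SQL script that recreates the data.
with open(DUMP, "w") as out:
    subprocess.run(["mysqldump", "-u", USER, "-p", DB],
                   stdout=out, check=True)

# Restore into an existing (possibly empty) database:
# with open(DUMP) as dump:
#     subprocess.run(["mysql", "-u", USER, "-p", DB], stdin=dump, check=True)
```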
Also, with SQLite you can simply copy the SQLite database file for backup. It can be read with every SQLite tool.
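Copying the file is safest while digiKam is closed; if the database might be in use, SQLite's online backup API is a safer option. A minimal sketch using Python's sqlite3 module (the file paths are placeholders):

```python
#!/usr/bin/env python3
# Sketch: snapshot the digiKam SQLite database with SQLite's online backup
# API, which copies a consistent image of the database page by page even if
# another process holds it open. Paths are placeholders.
import sqlite3

src = sqlite3.connect("digikam4.db")
dst = sqlite3.connect("digikam4-backup.db")
src.backup(dst)   # requires Python 3.7+ (sqlite3.Connection.backup)
src.close()
dst.close()
```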
Hi, sorry for the very long delay. Thanks for your reply, Marcel; let's wait for the end of GSoC. I am now starting to work on a little application in Python + Qt4 to sync my hard disks (case 1 in my previous comment, #9). The main reason is to learn Python and Qt; from what I can see, the translation to C++ and Qt4 is pretty straightforward. So if I reach something usable (my free time is very limited) I will make it public, and someone else can translate it to C++/Qt4/KDE/... and even integrate it into digiKam.
*** Bug 113715 has been marked as a duplicate of this bug. ***
Thank you for the bug report. As this report hasn't seen any changes in 5 years or more, we ask if you can please confirm that the issue still persists. If this bug is no longer persisting or relevant please change the status to resolved.
> "...wouldn't it be possible to > just use mysqldump to export the database?" I can verify that mysqldump does, indeed, produce scripts that can be used for saving and restoring databases used with Digikam. After upgrading to 7.3.0 I lost my databases during setup. (I didn't lose them, I made a mistake in configuring Digikam). I missed the place where you set up the four individual databases and assumed you must use a single database for all of the functionality, so I used mysqldump (against MariaDB) to dump all four databases to scripts, and then used those scripts to load them all into a single new database which Digikam was configured to use. (It was at that time I found out where to set up the four separate databases and found that I didn't have to do this at all.)