| Summary: | Large file missing from backup | | |
|---|---|---|---|
| Product: | [Applications] kbackup | Reporter: | pallaswept <pallaswept> |
| Component: | general | Assignee: | Martin Koller <martin> |
| Status: | REPORTED --- | | |
| Severity: | critical | CC: | sirius_b |
| Priority: | NOR | | |
| Version First Reported In: | 23.08.4 | | |
| Target Milestone: | --- | | |
| Platform: | Other | | |
| OS: | Linux | | |
| Latest Commit: | | Version Fixed/Implemented In: | |
| Sentry Crash Report: | | | |
Description
pallaswept
2023-12-30 23:22:03 UTC
Thinking about this some more, it occurred to me that there are really two separate issues to address here:

1) The missing file
2) The log stating a success when it wasn't

This reminds me of that TV show "Air Crash Investigations", where there is never a single thing going wrong that causes a disaster; it has to be a chain of failures. This is that. If either one of those two things had happened without the other, it wouldn't have been so bad: I'd either have the file, or I'd know that something went wrong and that maybe I don't have the file. But as it is, I am led to believe I have the file when I don't, and that's a recipe for disaster.

Sorry to double the workload here, but I thought it was an important realisation that this bug is really two bugs and needs two fixes.

Martin Koller (comment #2):

Hi. Thanks for your report - your style is fun to read :-)

I tried to reproduce this issue here, but using a smaller file of ca. 65 GB and some other smaller files. The backup completed here after more than 5(!) hours... but the backup archive did correctly include all files, including the large one!

I also checked the source, and I can't see how a file could fail to be included in the tar file without an error being given.

Just to be sure: how did you check the content of the archive after the backup? Does "tar -tvf <backup.tar>" also not show the large file?

(In reply to Martin Koller from comment #2)
> Hi. Thanks for your report - your style is fun to read :-)
>
> I tried to reproduce this issue here, but using a smaller file with ca. 65GB
> and some other smaller files.
> The backup completed here after more than 5(!) hours...
> But the backup archive did correctly include all files, also the large one!
>
> Also checking the source and I can't see how a file would not be included in
> the tar file without an error given.
>
> Just to be sure: How did you check the content of the archive after the
> backup?
> Does "tar -tvf <backup.tar>" also not show the large file?
Hello again Martin, and thanks once again for looking into this for me. I really didn't expect a quick reply at this time of year; I hope I have not interrupted your celebrations, and that you are enjoying the festive season.

I looked for the file using Dolphin at first, and then I tried Ark. Trying it with tar was a better idea; thanks for finding the right arguments for me, you saved this dummy from having to read the manpage :D Unfortunately, I don't see the file listed there, either.

There's obviously something 'special' about my PC that is causing this, but I have no idea what it might be. The specifics of the operation that might be unique are: I am running KBackup with kdesu (so I get root access to files for backup); the file is stored on a BTRFS file system and has the C attribute set, in order to disable copy-on-write (which I read is a recommendation for VM images and databases); and I am backing up to an external USB 3.1 (aka 3.2 Gen2x1, the 10 Gbps variety) NVMe drive.

I might try a few experiments and see if I can get it to include the file, which should give us a hint. I will try: without compression; backing up to a different destination (the same internal NVMe disk, which has enough space, and a different USB-connected spinning-rust drive); without kdesu (this file doesn't need root access); and backing up only this file (rather than using the profile I am using presently). Hopefully one or more of those will work and give us a hint about why it failed. I will return with some more info after I do the experiments.

Dear Bug Submitter,

This bug has been in NEEDSINFO status with no change for at least 15 days. Please provide the requested information as soon as possible and set the bug status as REPORTED. Due to regular bug tracker maintenance, if the bug is still in NEEDSINFO status with no change in 30 days, the bug will be closed as RESOLVED > WORKSFORME due to lack of needed information.
For more information about our bug triaging procedures please read the wiki located here: https://community.kde.org/Guidelines_and_HOWTOs/Bug_triaging

If you have already provided the requested information, please mark the bug as REPORTED so that the KDE team knows that the bug is ready to be confirmed.

Thank you for helping us make KDE software even better for everyone!

> This bug has been in NEEDSINFO status with no change for at least
> 15 days.
I'm sorry for the delay here. I have several bugs open for my machine (I seem to have a talent for finding unusual broken stuff) and on top of that, I am rather ill and disabled, and it can seriously mess with my ability to meet deadlines.
I've had this tab open the entire time, and it is on my to-do list, unfortunately as I mentioned that's a long list, and I have trouble getting to it even if it's short.
As I am quite positive this is a real issue, in order to appease the bot, I am going to change the status on this case before providing the info, otherwise it'll end up getting closed and I'll just end up making a mess filing a new bug and referring back to this one, when I am able to get said info.
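The archive check Martin suggested (`tar -tvf <backup.tar>`) can be sketched end to end. This is an illustrative dry run with a stand-in file and throwaway paths, not the reporter's actual data:

```shell
# Sketch of verifying archive contents with `tar -tvf`; "big.img" and all
# paths are illustrative placeholders, not taken from the report.
tmp=$(mktemp -d)
dd if=/dev/zero of="$tmp/big.img" bs=1M count=1 status=none  # stand-in for the large file
tar -cf "$tmp/backup.tar" -C "$tmp" big.img                  # build the archive
tar -tvf "$tmp/backup.tar" | grep -c 'big.img'               # prints 1 if the file is listed
rm -rf "$tmp"
```

Against a real backup, `tar -tvf backup.tar | grep '<filename>'` printing nothing is what confirms the file is absent from the archive.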
I have exactly the same problem when compression is enabled. In this case, KBackup compresses the files to /tmp, and /tmp doesn't have enough space on my system for the large files. Archiver::compressFile then aborts, because filter.write returns 0 as soon as /tmp is full. In my case, the output also looks as if archiving the file was successful, but the "Warnings" section shows the "Could not write to temporary file" message generated in compressFile.

I would find it helpful to be able to optionally configure an alternative path for temporary files instead of /tmp (QTemporaryFile seems to offer the option to specify an arbitrary path).

Thanks, rtb2000

Martin Koller (comment #7):

Create a shell script and set the TMPDIR env var before starting kbackup. QTemporaryFile uses QDir::tempPath(), which in turn uses $TMPDIR if set, else /tmp.

(In reply to Martin Koller from comment #7)
> create a shell script and set the TMPDIR env var before starting kbackup.
> QTemporaryFile uses QDir::tempPath() which in turn uses $TMPDIR if set, else
> /tmp

Hello, thank you for the advice. Instead, I used the corresponding environment variable configuration for KBackup in the Plasma application launcher, and I consider this a workaround. However, a tmpfs /tmp that is too small for such operations is likely to affect most desktop PCs of average users with a default installation of the operating system. In my opinion, a more user-friendly solution would be the ability to make the appropriate configuration in the program itself, or at least a clearly documented reference to TMPDIR, possibly shown together with the "Could not write to temporary file" warning. Nevertheless, it works now, so thank you very much.
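Martin's TMPDIR workaround can be sketched as a tiny wrapper script. `/var/tmp` is an illustrative choice of a disk-backed (non-tmpfs) directory, not taken from the thread; the `mktemp` line only demonstrates that temp-file creation honours $TMPDIR, the same variable QDir::tempPath() consults:

```shell
# Sketch of the TMPDIR workaround; substitute any path with enough free space
# for /var/tmp.
TMPDIR=/var/tmp
export TMPDIR
# Demonstrate that temp files now land under $TMPDIR rather than /tmp:
t=$(mktemp)   # creates a file under /var/tmp/
rm -f "$t"
# With TMPDIR exported, launching kbackup from this script makes
# QTemporaryFile (via QDir::tempPath()) place its temp files there:
#   exec kbackup "$@"
```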