Summary: | [unarchiver] Crash while trying to open or extract big archives | ||
---|---|---|---|
Product: | [Applications] ark | Reporter: | Nikolay Brookstein <nikolay.brookstein> |
Component: | plugins | Assignee: | Ragnar Thomsen <rthomsen6> |
Status: | RESOLVED FIXED | ||
Severity: | crash | CC: | elvis.angelaccio, nikolay.brookstein, rthomsen6 |
Priority: | NOR | Keywords: | drkonqi |
Version First Reported In: | 16.08.2 | ||
Target Milestone: | --- | ||
Platform: | Fedora RPMs | ||
OS: | Linux | ||
URL: | https://bitbucket.org/WAHa_06x36/theunarchiver/issues/919/unarchiver-crash-while-trying-to-open-or | ||
Latest Commit: | http://commits.kde.org/ark/0b981c544e994175403504124a1e44ecfa144e33 | Version Fixed In: | 16.12.0 |
Sentry Crash Report: | |||
Attachments: | Catch bad_alloc patch |
Description
Nikolay Brookstein
2016-11-08 11:14:38 UTC
Does it crash every time? Can you attach the archive here?

Yes, it crashes every time. (Have just tried again.) I am using [unar.x86_64 1.10.1-1.fc25 @fedora] as the rar extractor. I cannot attach the archive here directly (227.7 MB > 4 MB), but I have uploaded it to my cloud instance; here is the link: https://datacloud.nextnetworks.eu/index.php/s/6xszdJX48JKT8oX (this link will be valid until 15-11-2016).

Confirmed, thanks for the archive. Ark cannot even open it; it simply runs out of memory while trying to allocate the JSON object with the data provided by lsar.

Thanks for your fast reaction! If I can help you to fix this bug, just say. I am not a super hacker, but I have some C++/Qt knowledge.

(In reply to Nikolay Brookstein from comment #4)
> Thanks for your fast reaction!
> If I can help you to fix this bug, just say.
> I am not a super hacker, but I have some C++/Qt knowledge.

That would be very appreciated. The problem is clear: the crash happens in plugins/cliunarchiverplugin/plugin.cpp at this line:

    m_jsonOutput += line + QLatin1Char('\n');

This is a QString that tries to allocate too much data; the QString is then supposed to be converted into a QJsonDocument. We could try saving the JSON output of lsar to a temporary file and getting rid of this QString, but I'm not sure Qt can handle JSON files that large. In the meantime, I honestly suggest you install unrar, as the unrar plugin can open this archive just fine...

(In reply to Elvis Angelaccio from comment #5)
> I'm not sure Qt can handle JSON files that large.

We are talking about a JSON file bigger than 3 GB, in fact. No way a parser can handle that much data :/ Even if we skip the "XADSolidObject" JSON vector (which is what makes the file so big), lsar would still take too long to finish (compared to unrar). So I'm tempted to just catch the bad_alloc exception and abort the operation...
What is the difference between the unrar and unar approaches? It is really strange to get an over-3 GB memory allocation for a ~240 MB file. It looks like a corner case or a totally inefficient implementation on the unar side. Probably Ark just needs better backend error detection, and the bug is in unar. Or am I wrong here? (These are just my thoughts; I am not an expert in rar/unar/unrar.)

P.S. I will look at the source file tomorrow.

(In reply to Nikolay Brookstein from comment #7)
> What is the difference between the unrar and unar approaches?
> It is really strange to get an over-3 GB memory allocation for a ~240 MB file.
> It looks like a corner case or a totally inefficient implementation on the unar side.
> Probably Ark just needs better backend error detection, and the bug is in unar.

Yes, ideally a bug should also be filed against unar.

Created attachment 102135 [details]
Catch bad_alloc patch
With this patch Ark doesn't crash, at least.
I have just created a bug report at the unarchiver bug tracker: https://bitbucket.org/WAHa_06x36/theunarchiver/issues/919/unarchiver-crash-while-trying-to-open-or

Elvis, "Archive too big" is probably somewhat confusing. Possibly "Archive cannot be extracted due to insufficient free memory"?

Oops, the same URL without the line break: https://bitbucket.org/WAHa_06x36/theunarchiver/issues/919/unarchiver-crash-while-trying-to-open-or

(In reply to Nikolay Brookstein from comment #10)
> I have just created a bug report at the unarchiver bug tracker:
> https://bitbucket.org/WAHa_06x36/theunarchiver/issues/919/unarchiver-crash-while-trying-to-open-or

Thanks for reporting it upstream!

> Elvis, "Archive too big" is probably somewhat confusing.
> Possibly "Archive cannot be extracted due to insufficient free memory"?

Yeah, this is just a temporary patch so that I won't forget it :p

No problem, without bug reports bugs will never be fixed ^_^ As far as I understand, KDE/Plasma/KF are all Qt-based, so why are you using 'i18n("Blabla")' instead of 'tr("Blabla")'?

Btw, are you the main maintainer of Ark right now? If yes, an off-topic question from my side: is it possible to add progress indication to Ark, or is the current architecture not suitable for that kind of functionality?

Our translation infrastructure is based on GNU 'gettext'; we do not use the Qt translation system.

(In reply to Christoph Feck from comment #14)
> Our translation infrastructure is based on GNU 'gettext'; we do not use the
> Qt translation system.

Thank you for the clarification. Is gettext technically better than the Qt translation system? What is the reason for using it?

https://techbase.kde.org/Development/Tutorials/Localization/i18n_Build_Systems does not list any technical differences, so it might be because of historical reasons. Switching the complete infrastructure was never considered.
(In reply to Nikolay Brookstein from comment #13)
> No problem, without bug reports bugs will never be fixed ^_^
> Btw, are you the main maintainer of Ark right now?
> If yes, an off-topic question from my side: is it possible to add progress
> indication to Ark, or is the current architecture not suitable for that
> kind of functionality?

Feel free to join the #kde-utils IRC channel and we can discuss everything you want :)

@Nikolay: patch up for review: https://phabricator.kde.org/D3350

Git commit 0b981c544e994175403504124a1e44ecfa144e33 by Elvis Angelaccio.
Committed on 15/11/2016 at 18:23.
Pushed by elvisangelaccio into branch 'master'.

Stop crashing when lsar's output is too big

lsar can generate huge JSON output when listing big solid rar archives. This will be fixed in lsar, but meanwhile we can catch a bad_alloc if we're using an affected version of lsar.

FIXED-IN: 16.12.0
Differential Revision: D3350

M  +3 -0  autotests/plugins/cliunarchiverplugin/CMakeLists.txt
M  +3 -0  plugins/cliunarchiverplugin/CMakeLists.txt
M  +9 -1  plugins/cliunarchiverplugin/cliplugin.cpp

http://commits.kde.org/ark/0b981c544e994175403504124a1e44ecfa144e33