Extraction of files whose filename length is too long for the chosen extraction directory fails with no warning. This is particularly dangerous when an archive contains several files and only one or a few of them fail to extract, since it can lead to data loss if the missed extraction is not promptly noticed and the source archive is deleted. Long filenames are uncommon in human-created files but are routinely produced by scripts following filename templates; for example, I observed this issue when downloading a zip file with several files from sciencedirect.com. The downloaded zip contained several files whose filenames were clipped at 155 characters, resulting in problematic extraction under common /home/foo/downladed..paths. The archive could be extracted in / and other short paths.

Reproducible: Always

Expected Results:
At least a warning stating 'Some of the files could not be extracted due to a "file name too long" error' should be shown.
Created attachment 91106 [details] An example zip file containing a file with a 150-character name; it fails to extract with Ark under /home/dario with no warning. (eCryptfs filesystem)
This is due to eCryptfs limits (143-character filenames), but the same could happen on other filesystems with longer limits (names over 255 characters fail on ext4).
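The underlying failure can be reproduced outside of Ark: creating a file whose name exceeds the destination filesystem's NAME_MAX fails with ENAMETOOLONG. A minimal standalone sketch (not Ark code), assuming a filesystem with the usual 255-byte limit such as ext4; on eCryptfs the effective limit is around 143 characters:

```python
import errno
import os
import tempfile

# Try to create a file whose name is longer than the typical 255-byte
# NAME_MAX. On ext4 (and most Linux filesystems) this fails with
# ENAMETOOLONG, which is exactly the error extraction hits.
err = None
with tempfile.TemporaryDirectory() as d:
    long_name = "a" * 300  # longer than ext4's 255-byte limit
    try:
        open(os.path.join(d, long_name), "w").close()
        print("created: this filesystem allows very long names")
    except OSError as e:
        err = e.errno
        print("ENAMETOOLONG: extracting such an entry here would fail")
```

If an extractor ignores this error for individual entries, the archive appears to extract successfully while silently dropping files, which is the data-loss scenario described above.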
Hi Dario, sorry for the delay. Honestly, I'm not sure how to handle this issue. The 255-character limit of ext4 is a de facto standard, shared by other filesystems as well; the problem here is specific to the 143-character limit of eCryptfs. The ideal (but hard) solution would be for every plugin to detect a failed extraction and report it to the user. A simpler solution could be a setting "warn me if a file has a name longer than eCryptfs's limit", disabled by default, or even "don't extract files whose name is longer than eCryptfs's limit". eCryptfs users would then be expected to enable it.
Renaming and marking this one as wish.