Version: unspecified (using KDE 4.8.0)
OS: Linux

This message appears when I try to open a file larger than 2GiB on a 64-bit system. Okteta 0.8.0.

$ file /usr/bin/okteta
/usr/bin/okteta: ELF 64-bit LSB executable, x86-64, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.9, stripped

Reproducible: Always

Steps to Reproduce:
Open a file larger than 2GiB.

Actual Results:
The file does not open.

Expected Results:
The file opens.

OS: Linux (x86_64) release 3.2.2-pf
Compiler: x86_64-pc-linux-gnu-gcc
Hi, thanks for your bug report, and sorry for the late response; it slipped my attention until now. I could reproduce the problem and found out that I had made a wrong assumption: int is also 32-bit on the usual 64-bit systems, meh. And as the internal data structures are currently integer-based (like QByteArray, see its int parameters), data sizes > std::numeric_limits<int>::max() are not possible yet. So the fix for now will be to adapt the error message; I am going to do that in the next days. In the long run the internal data structures of course have to be fixed. Alexander has already started to work on that, and I finally need to catch up with his work...
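For illustration only, here is a minimal sketch of the kind of size check the fix amounts to; it is not the actual Okteta patch, and the function name and message wording are my own assumptions. The idea is to reject files whose size exceeds what int-indexed containers such as QByteArray (in Qt 4) can address, and to report a clear error instead of failing with a misleading message.

// Sketch, not the real bytearrayrawfileloadthread.cpp code; canLoadAsByteArray
// is a hypothetical helper used here just to illustrate the limit check.
#include <QFile>
#include <QIODevice>
#include <QObject>
#include <QString>
#include <limits>

bool canLoadAsByteArray(const QString& filePath, QString* errorString)
{
    QFile file(filePath);
    if (!file.open(QIODevice::ReadOnly)) {
        *errorString = QObject::tr("Could not open file: %1").arg(file.errorString());
        return false;
    }

    // QByteArray in Qt 4 indexes with int, so anything above
    // std::numeric_limits<int>::max() (~2 GiB) cannot be held in one array yet.
    const qint64 fileSize = file.size();
    if (fileSize > std::numeric_limits<int>::max()) {
        *errorString = QObject::tr(
            "The file is %1 bytes large; files larger than 2 GiB are not supported yet.")
            .arg(fileSize);
        return false;
    }
    return true;
}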
SVN commit 1286586 by kossebau:

Fixes: max. size of byte arrays is qint32::max also on 64-bit systems still

 M  +2 -1  core/address.h
 M  +5 -6  kasten/core/io/filesystem/bytearrayrawfileloadthread.cpp

WebSVN link: http://websvn.kde.org/?view=rev&revision=1286586
SVN commit 1286588 by kossebau:

Fixes: max. size of byte arrays is qint32::max also on 64-bit systems still

 M  +4 -4  bytearrayrawfilereloadthread.cpp

WebSVN link: http://websvn.kde.org/?view=rev&revision=1286588
SVN commit 1286591 by kossebau:

Backport of 1286586 and 1286588:
Fixes: max. size of byte arrays is qint32::max also on 64-bit systems still
okay'ed by translators
FIXED-IN: 0.8.2 (KDE Apps 4.8.2)

 M  +4 -5  bytearrayrawfileloadthread.cpp
 M  +4 -5  bytearrayrawfilereloadthread.cpp

WebSVN link: http://websvn.kde.org/?view=rev&revision=1286591
The need to open large files (>2GB) is still tracked as bug 182577; this report was narrowed down to cover only the misleading error message. So if you are interested in any progress on large files, please subscribe to that other bug.