Summary: | "Support to load files larger than 2 GiB on 32-bit systems has not yet been implemented." on a 64bit system | ||
---|---|---|---|
Product: | [Applications] okteta | Reporter: | Piotr Mitas <yabolus> |
Component: | general | Assignee: | Friedrich W. H. Kossebau <kossebau> |
Status: | RESOLVED FIXED | ||
Severity: | normal | ||
Priority: | NOR | ||
Version: | unspecified | ||
Target Milestone: | --- | ||
Platform: | Gentoo Packages | ||
OS: | Linux | ||
Latest Commit: | Version Fixed In: | 0.8.2 | |
Sentry Crash Report: |
Description
Piotr Mitas
2012-02-03 15:06:59 UTC
Friedrich W. H. Kossebau:

Hi, thanks for your bug report, and sorry for the late response, it slipped my attention until now. I could reproduce this and found out that I had made a wrong assumption: int is also 32-bit on the usual 64-bit systems. And as integer-based data structures (like QByteArray, see the int parameters) are currently used internally, data sizes > std::numeric_limits<int>::max() are not possible yet. So the fix for now will be to adapt the error message; I am going to do that in the next days. In the long run the internal data structures of course have to be fixed. Alexander has already started to work on that, and I finally need to catch up with his work.

SVN commit 1286586 by kossebau:
Fixes: max. size of byte arrays is qint32::max also on 64-bit systems still
M +2 -1 core/address.h
M +5 -6 kasten/core/io/filesystem/bytearrayrawfileloadthread.cpp
WebSVN link: http://websvn.kde.org/?view=rev&revision=1286586

SVN commit 1286588 by kossebau:
Fixes: max. size of byte arrays is qint32::max also on 64-bit systems still
M +4 -4 bytearrayrawfilereloadthread.cpp
WebSVN link: http://websvn.kde.org/?view=rev&revision=1286588

SVN commit 1286591 by kossebau:
backport of 1286586 and 1286588:
Fixes: max. size of byte arrays is qint32::max also on 64-bit systems still
okay'ed by translators
FIXED-IN: 0.8.2 (KDE Apps 4.8.2)
M +4 -5 bytearrayrawfileloadthread.cpp
M +4 -5 bytearrayrawfilereloadthread.cpp
WebSVN link: http://websvn.kde.org/?view=rev&revision=1286591

Support for actually opening large files (> 2 GB) is still tracked as bug 182577; this report was resolved only as far as the misleading error message is concerned. So if you are interested in any progress on large files, please subscribe to that other bug.
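For context on the limit described above, here is a minimal sketch, not taken from Okteta's sources (the function name `loadWholeFile` and its signature are made up for illustration). It shows a Qt-style load routine that compares the 64-bit file size against std::numeric_limits<int>::max() before reading into a QByteArray, so that oversized files are rejected with an error message that does not wrongly blame 32-bit systems:

```cpp
#include <QByteArray>
#include <QFile>
#include <QString>
#include <limits>

// Illustrative sketch only: even on 64-bit Linux, int (and QByteArray's
// int-based size) is 32-bit, so a file larger than
// std::numeric_limits<int>::max() bytes (~2 GiB) cannot be loaded into a
// single QByteArray. A load routine therefore has to reject such files
// with a message that does not claim the limit is 32-bit-system-specific.
bool loadWholeFile(const QString& filePath, QByteArray* target, QString* errorString)
{
    QFile file(filePath);
    if (!file.open(QIODevice::ReadOnly)) {
        *errorString = file.errorString();
        return false;
    }

    const qint64 fileSize = file.size();  // qint64 can describe files > 2 GiB
    if (fileSize > static_cast<qint64>(std::numeric_limits<int>::max())) {
        // The situation this bug is about: the limit also applies on
        // 64-bit systems, so the message must not mention "32-bit systems".
        *errorString = QString::fromLatin1("Files larger than 2 GiB are not yet supported.");
        return false;
    }

    *target = file.readAll();  // safe: the size fits QByteArray's int-based size
    return true;
}
```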