Version: 3.5.8 (using KDE 3.5.8, Kubuntu (gutsy) 4:3.5.8-0ubuntu3.1)
Compiler: Target: i486-linux-gnu
OS: Linux (i686) release 2.6.22-14-generic

This page seems to crash Konqueror constantly on my machine: http://www.rpgamer.com/fanart/2007/fanart111607.html

The page starts to load, then Konqueror uses an excessive amount of memory (according to 'ps') and eventually crashes. The crash happens while the "thumbnail" images are being loaded/rendered, so that may be related. The rest of the site seems to be okay. I have the Macromedia Flash plugin installed as well.
Confirmed, 3.5.8 and 4.0 beta4.

==13818== Invalid write of size 4
==13818==    at 0x531384F: _XFlushGCCache (in /usr/lib/libX11.so.6.2.0)
==13818==    by 0x531111B: XChangeGC (in /usr/lib/libX11.so.6.2.0)
==13818==    by 0x4DF1BFB: bitBlt(QPaintDevice*, int, int, QPaintDevice const*, int, int, int, int, Qt::RasterOp, bool) (in /usr/lib/qt3/lib/libqt-mt.so.3.3.8)
==13818==    by 0x4EA5D6E: QPixmap::copy(bool) const (in /usr/lib/qt3/lib/libqt-mt.so.3.3.8)
==13818==    by 0x4EA5E79: QPixmap::setMask(QBitmap const&) (in /usr/lib/qt3/lib/libqt-mt.so.3.3.8)
==13818==    by 0x4EA5E8B: QPixmap::setMask(QBitmap const&) (in /usr/lib/qt3/lib/libqt-mt.so.3.3.8)
The "thumbnails" are actually 3 10080x3600 images, which the browser is then asked to shrink down to 150x50 for painting. That's about 435MB of data. No browser available on Linux can decode this w/o using tons of memory. Obviously the best way of addressing this would be for the webmaster to actually put in 150x50 thumbnails. I'll shortly put in some resource limits which may help partly (since these are JPEGs, they can be subsampled), but it's basically a page where one can't do anything nicely. All the browsers available on Linux will suck memory on this thing. 4.0 will likely do better on this than any other browser, but 3.5.x probably does worse. Still, we are using more memory than we should be, I am not sure why yet.
The reason consumption is 2x is that the images are progressive: the way progressive JPEGs work, libJPEG needs to keep the DCT coefficient arrays for the whole image until every scan has been decoded (an interlaced GIF or PNG would not have this issue). Once an image is decoded, that usage goes away and the consumption drops to the 400MiB or so the pixels actually need. There might be some tricks we can do for really bad websites such as this one, but I am not sure they're general enough to be useful. I suppose one could also pull data from libJPEG's buffers on demand, but that would be extremely tricky. At any rate, the throughput is shot once it hits the pagefile.
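To make the 2x concrete, here is a rough sketch of checking for progressive mode right after the header is read and estimating the coefficient memory libjpeg will hold until the final scan is decoded. progressive_mode is the field jpeg_read_header() fills in; the size estimate ignores chroma subsampling and MCU padding, so it is an upper bound rather than an exact figure, and the file name is again hypothetical:

#include <stdio.h>
#include <jpeglib.h>

int main(void)
{
    struct jpeg_decompress_struct cinfo;
    struct jpeg_error_mgr jerr;
    FILE *in = fopen("huge-fanart.jpg", "rb");  /* hypothetical input file */
    if (!in)
        return 1;

    cinfo.err = jpeg_std_error(&jerr);
    jpeg_create_decompress(&cinfo);
    jpeg_stdio_src(&cinfo, in);
    jpeg_read_header(&cinfo, TRUE);

    if (cinfo.progressive_mode) {
        /* A progressive decode keeps one 16-bit DCT coefficient per sample per
         * component for the whole image until the last scan has been processed. */
        unsigned long coef_bytes = (unsigned long)cinfo.image_width *
                                   cinfo.image_height *
                                   cinfo.num_components * sizeof(JCOEF);
        printf("progressive JPEG: up to ~%lu MiB of coefficients held during decode\n",
               coef_bytes >> 20);
    } else {
        printf("baseline JPEG: scanlines can be emitted as the data arrives\n");
    }

    jpeg_destroy_decompress(&cinfo);
    fclose(in);
    return 0;
}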
Works fine on 3.5.9 and 4.0.3. No crashes at all. Page looks fine.
If I understood Maksim right, this is a WONTFIX then; the page looks fine now, but it has probably changed.