Bug 152470 - specific page crashes browser (huge memory allocations)
Summary: specific page crashes browser (huge memory allocations)
Status: RESOLVED INTENTIONAL
Alias: None
Product: konqueror
Classification: Applications
Component: khtml (show other bugs)
Version: 3.5
Platform: unspecified Linux
Priority: NOR
Severity: crash
Target Milestone: ---
Assignee: Konqueror Developers
URL:
Keywords:
Depends on:
Blocks:
 
Reported: 2007-11-17 16:32 UTC by Jonathan Phénix
Modified: 2008-06-20 03:59 UTC (History)
0 users

See Also:
Latest Commit:
Version Fixed In:


Attachments

Description Jonathan Phénix 2007-11-17 16:32:25 UTC
Version:           3.5.8 (using KDE 3.5.8, Kubuntu (gutsy) 4:3.5.8-0ubuntu3.1)
Compiler:          Target: i486-linux-gnu
OS:                Linux (i686) release 2.6.22-14-generic

This page seems to crash konqueror constantly on my machine:

http://www.rpgamer.com/fanart/2007/fanart111607.html

The page starts to load, then konqueror uses an excessive amount of memory (according to 'ps') and eventually crashes. At the point where this happens, the "thumbnail" images are being loaded/rendered, so the two may be related. The rest of the site seems to be okay.

I have the Macromedia Flash plugin installed as well.
Comment 1 Tommi Tervo 2007-11-17 18:17:10 UTC
Confirmed, 3.5.8 and 4.0 beta4.

==13818== Invalid write of size 4
==13818==    at 0x531384F: _XFlushGCCache (in /usr/lib/libX11.so.6.2.0)
==13818==    by 0x531111B: XChangeGC (in /usr/lib/libX11.so.6.2.0)
==13818==    by 0x4DF1BFB: bitBlt(QPaintDevice*, int, int, QPaintDevice const*, int, int, int, int, Qt::RasterOp, bool) (in /usr/lib/qt3/lib/libqt-mt.so.3.3.8)
==13818==    by 0x4EA5D6E: QPixmap::copy(bool) const (in /usr/lib/qt3/lib/libqt-mt.so.3.3.8)
==13818==    by 0x4EA5E79: QPixmap::setMask(QBitmap const&) (in /usr/lib/qt3/lib/libqt-mt.so.3.3.8)
==13818==    by 0x4EA5E8B: QPixmap::setMask(QBitmap const&) (in /usr/lib/qt3/lib/libqt-mt.so.3.3.8)
Comment 2 Maksim Orlovich 2007-11-21 00:35:13 UTC
The "thumbnails" are actually 3 10080x3600 images, which the browser is then asked to shrink down to 150x50 for painting. That's about 435MB of data. No browser available on Linux can decode this w/o using tons of memory.  Obviously the best way of addressing this would be for the webmaster to actually put in 150x50 thumbnails.

I'll shortly put in some resource limits, which may partly help (since these are JPEGs, they can be subsampled during decoding; see the sketch below), but it's basically a page where one can't do anything nicely. All the browsers available on Linux will suck memory on this thing. 4.0 will likely do better on this than any other browser, but 3.5.x probably does worse. Still, we are using more memory than we should be; I am not sure why yet.
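
For reference, here is a minimal sketch of the kind of JPEG subsampling mentioned above, using libjpeg's IDCT scaling (scale_num/scale_denom). This only illustrates the general technique; it is not the actual khtml/kdelibs change:

/* Sketch only: decode a huge JPEG at 1/8 of its nominal size via
 * libjpeg IDCT scaling. Illustrates the subsampling idea; not the
 * actual khtml image loader. */
#include <stdio.h>
#include <stdlib.h>
#include <jpeglib.h>

int main(int argc, char **argv)
{
    struct jpeg_decompress_struct cinfo;
    struct jpeg_error_mgr jerr;
    FILE *in;

    if (argc < 2 || !(in = fopen(argv[1], "rb")))
        return 1;

    cinfo.err = jpeg_std_error(&jerr);
    jpeg_create_decompress(&cinfo);
    jpeg_stdio_src(&cinfo, in);
    jpeg_read_header(&cinfo, TRUE);

    /* Decode at 1/8 in each dimension: a 10080x3600 image comes out as
     * 1260x450, so the pixel buffers shrink to roughly 1/64 of the
     * ~435MB the three full-size images would otherwise need. */
    cinfo.scale_num = 1;
    cinfo.scale_denom = 8;

    jpeg_start_decompress(&cinfo);

    /* output_width/output_height now reflect the scaled size. */
    JSAMPARRAY row = (*cinfo.mem->alloc_sarray)((j_common_ptr)&cinfo, JPOOL_IMAGE,
        cinfo.output_width * cinfo.output_components, 1);

    while (cinfo.output_scanline < cinfo.output_height)
        jpeg_read_scanlines(&cinfo, row, 1);   /* rows would be handed to the scaler */

    jpeg_finish_decompress(&cinfo);
    jpeg_destroy_decompress(&cinfo);
    fclose(in);
    return 0;
}

Classic libjpeg only scales by 1/2, 1/4 or 1/8 in the IDCT, so the result would still have to be smooth-scaled down to 150x50, but the intermediate buffer is far smaller.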
Comment 3 Maksim Orlovich 2007-11-21 01:49:34 UTC
The reason consumption is 2x is that the image is progressive: with progressive JPEGs, libjpeg needs to store the full DCT coefficient table until everything is decoded (a progressive/interlaced GIF or PNG would not have this issue). Once the image is decoded, that usage goes away and the consumption drops to the actual 400MiB or so the pixels need. There might be some tricks we can do for really bad websites such as this one, but I am not sure they're general enough to be useful. I suppose one could also pull data from libjpeg's buffers on demand, but that would be extremely tricky. At any rate, the throughput is shot once it hits the pagefile.
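
To make the 2x figure concrete, here is a small sketch (again an assumption about the mechanism, not khtml code) that checks for the progressive case after jpeg_read_header and estimates the coefficient storage libjpeg holds until the last scan:

/* Sketch only: detect a progressive JPEG and estimate the transient
 * DCT-coefficient storage. Not the actual khtml loader. */
#include <stdio.h>
#include <jpeglib.h>

int main(int argc, char **argv)
{
    struct jpeg_decompress_struct cinfo;
    struct jpeg_error_mgr jerr;
    FILE *in;

    if (argc < 2 || !(in = fopen(argv[1], "rb")))
        return 1;

    cinfo.err = jpeg_std_error(&jerr);
    jpeg_create_decompress(&cinfo);
    jpeg_stdio_src(&cinfo, in);
    jpeg_read_header(&cinfo, TRUE);

    if (cinfo.progressive_mode) {
        /* Rough upper bound, ignoring chroma subsampling: one JCOEF
         * (a short) per sample per component is kept in a whole-image
         * buffer until the final scan has been decoded. */
        double coef_bytes = (double)cinfo.image_width * cinfo.image_height
                          * cinfo.num_components * sizeof(JCOEF);
        printf("progressive: up to ~%.0f MiB of DCT coefficients held during decode\n",
               coef_bytes / (1024.0 * 1024.0));
    } else {
        printf("baseline: rows can be decoded and handed off incrementally\n");
    }

    jpeg_destroy_decompress(&cinfo);
    fclose(in);
    return 0;
}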

Comment 4 Kevin Funk 2008-04-20 16:04:52 UTC
Works fine on 3.5.9 and 4.0.3. No crashes at all. Page looks fine.
Comment 5 mario tuling 2008-06-20 03:59:47 UTC
If I understood Maksim right, this is a WONTFIX then; the page looks fine now, but it has probably changed since the report.