Summary: proxy settings are not followed
Product: [Unmaintained] kio
Component: general
Reporter: Matthew Harker <thundercloud>
Assignee: Thiago Macieira <thiago>
Status: RESOLVED FIXED
Severity: critical
Priority: NOR
Version: unspecified
Target Milestone: ---
Platform: Debian testing
OS: Linux
Version Fixed In: 4.6.2
CC: adawit, ahartmetz, armindiaz, autoquote1000, azymut, berend.de.schouwer, bkn, bruno.mathieu, bugs.kde.org, chanika, daniel.moyne, dev, dizzy, dmoyne, edneymatias, fedsotto, fleischi, gentoo.integer, ggrabler, gleb, ileano.bonfa, info, janow49420, jasmin-kbugs, javier, joop_boonen, kde-maintainers, kdebug, kevin.kofler, luizluca, marcelovborro, marcosgdavid, martin.andersen, mcousin, msnkipa, muzenbaher, oyvinds, p.varet, patrick.noffke, patstew, possebaer, r78v10a07, rasasi78, rdieter, richih-kde, roa, romainguinot, sebastiankuzlak, seifert, seufert, smparrish, tim, tonesenna, victorjss, will, zahl
Attachments:
- Merger of the previous patches
- proxy support patch
- proxy support patch
- proxy support patch
- [debug] .xsession_errors https nok
- [debug] .xsession_errors https ok
- TCP trace of the https site I can load
- TCP trace of the https site I can't load
- packet traces from behind a strict proxy
- packet traces from behind a strict proxy (take 2)
- Packet trace of the 404
Description
Matthew Harker
2008-01-14 01:54:39 UTC
Same problem on Mandriva 2008.0 and KDE4 (final). Note: I need to supply credentials to get on the internet; NTLM auth is used.

Additional note: I don't need credentials, just for the record.

Unassigned? What does that mean?

*** Bug 155757 has been marked as a duplicate of this bug. ***

Reproducible on Kubuntu Hardy as well. I'm at work using an automatic proxy script that does not work anymore, and neither does specifying the proxy by hand.

Same problem in Kubuntu Gutsy within cambs. A few things to note:
* The proxy env variables dialogue doesn't even remember its own settings.
* I could never get cambs' auto-configuration script to work in KDE 3, so I always just specified by hand.

Doesn't work on KDE Four Live CD 1.0 either :(

Yes, proxies will not work at all in 4.0.0; yes, it'll be fixed asap. Anyways, I think we've had enough me-too posts now :)

Hehe, yeah. I'll give networking a rigorous testing on my macbook once this one's been fixed, as well as other parts of KDE4, to see how they hold up. Matt

*** Bug 156189 has been marked as a duplicate of this bug. ***
*** Bug 156402 has been marked as a duplicate of this bug. ***

Chani, is this planned for 4.0.1 or 4.1.0? For me, this is a showstopper. I can't get out to the internet through my corporate environment without setting up a proxy. So I'm sorry, but I have now reverted to KDE3...

*** Bug 157533 has been marked as a duplicate of this bug. ***

Problem still exists in KDE 4.0.1.

Yes, and why is it still unassigned?

Confirmed in 4.0.1. Although proxy settings exist in kioslaverc, all internet connections fail.

Doesn't work on Konqueror 4.0.1 (KDE 4.0.1 - opensuse 11 alpha 2) either.

Still broken in revision 778673 (24 Feb 2008). This is currently keeping me from using KDE4. It will also stop enterprise users. Also: may I suggest that KDE should honour the http_proxy, ftp_proxy, https_proxy etc. environment variables by default.
I know that most other desktop software is broken in this respect, but almost all command-line programs honour these variables.

Assigning to someone in the hope the problem gets attention: http://lists.kde.org/?l=kde-devel&m=119028937827086&w=2 The above seems to have a patch for this problem, reproducible on at least one machine. Commit it?

That's just for the UI... The difficult part is the HTTPS proxying/SSL tunnelling... I've bugged the SSL guy about it; let's hope he gets to it soon.

*** Bug 159184 has been marked as a duplicate of this bug. ***

Same problem on my Gentoo Linux box with KDE 4.02, and alas, without proxy support there is no way to get to the internet inside my company with Konqueror (or any other program using kio stuff); Firefox, on the other hand, works with the proxy settings.

*** Bug 159509 has been marked as a duplicate of this bug. ***
*** Bug 159569 has been marked as a duplicate of this bug. ***
*** Bug 159741 has been marked as a duplicate of this bug. ***

PCLINUXOS: worked in KDE 3.5.9, then changed and same problem.

Yes, same problem, using the latest PCLINUXOS production version. Worked the first time I used the proxy, but when I changed the proxy settings it failed with the same message. Attempted to clear the proxy setting to blank, but it always defaults to the last proxy settings.

*** Bug 160923 has been marked as a duplicate of this bug. ***

#12 in most hated bugs... let's rank up!

#12 is high enough; we know that this bug / missing feature is important. Instead you could start by testing the patch sent to kde-core-devel today.

For all those interested in testing the patches: http://www.nabble.com/Proxy-support-in-KDE4-td16740075.html

I'll give it a shot on either Arch Linux or Kubuntu.

Patches work for me: Kubuntu RC, kdelibs 4.0.3 from source, Konqueror using automatic proxy. The two patches overwrite each other in kioslave/http/http.cpp, HTTPProtocol::httpOpenConnection(), so I ended up with a jumble. Are you supposed to apply both patches?
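The http_proxy/ftp_proxy environment-variable convention suggested above is the same one that Python's standard library follows; a minimal sketch of how the variables are interpreted (Python used purely as an illustration of the convention — this is not KDE code):

```python
import os
import urllib.request

# The de-facto convention: <scheme>_proxy holds a proxy URL, while
# no_proxy lists hosts/domains that should bypass the proxy.
os.environ["http_proxy"] = "http://proxy.example.com:3128"
os.environ["no_proxy"] = "localhost,.example.com"

# urllib collects the variables into a {scheme: proxy_url} map.
proxies = urllib.request.getproxies()
print(proxies.get("http"))  # http://proxy.example.com:3128

# proxy_bypass() honours no_proxy for matching hosts (truthy here).
print(bool(urllib.request.proxy_bypass("localhost")))
```

Command-line tools such as wget and curl read the same variables, which is why honouring them by default would make KDE consistent with the rest of the system.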
*** Bug 161187 has been marked as a duplicate of this bug. ***

At least it won't work without the kio_http change of the first patch. I've tried just the 2nd patch now, which stops konqueror from returning an error - though the pages are still white. I'm looking through how to join those two patches. It would be nice if some KDE dev could supply us with a properly joined patchfile (or two patchfiles which won't conflict).

Created attachment 24487 [details]
Merger of the previous patches
This is my merger; it works for me (tm)
I did a similar merge, but setting the proxy did not make konqueror functional in any case, so I cannot confirm the functionality of the patch (though the patch applies). I've tried specifying the proxy myself, as well as using an automatic proxy setting (PAC file). Neither works, and konqueror is left just not showing any pages (it stays white, but seems to have finished loading). To make sure, I've also tried it with your patchfile from the attachment, with the same result as with my own patch-merge.

Only one of the kio_http changes at a time makes sense, I think.

http works. https is still broken; Konqueror's main window displays: could not connect to host <proxy's hostname>: unknown error. ftp is still broken; a popup dialog displays: URL cannot be listed.

Reviewing the patches, it seems as if they just chose different ways to check whether a proxy shall be used, and the second way seems a bit "cleaner" than the first one posted. Though I haven't had any success with this yet; maybe there is a kio_http patch around which I missed as well, or my revision has changed too much (which could be - I'm currently using 4.0.69 for testing purposes, as it feels much more stable than 4.0.3).

Created attachment 24640 [details]
proxy support patch
Try this one (it's a modification of my previous patch). It supports SOCKS (v5) proxies if the proxy URL is set like "socks://proxy". This patch is to kio_http only; the other modifications were already submitted to the source tree.
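The "socks://proxy" URL convention described in this patch could be recognised with ordinary URL parsing. A hypothetical sketch (Python for brevity — the real kio_http code is C++, and the helper name here is invented for illustration):

```python
from urllib.parse import urlparse

def proxy_type(proxy_url: str) -> str:
    """Classify a proxy URL by its scheme, defaulting to HTTP.

    Hypothetical helper mirroring the idea in the patch: a socks://
    scheme selects a SOCKS5 proxy; anything else is treated as a
    plain HTTP proxy.
    """
    scheme = urlparse(proxy_url).scheme.lower()
    return "socks5" if scheme == "socks" else "http"

print(proxy_type("socks://proxy:1080"))  # socks5
print(proxy_type("http://proxy:3128"))   # http
```

Keying the proxy type off the URL scheme is what later makes per-protocol proxy choices possible (e.g. an HTTP proxy for http:// requests and a SOCKS proxy for everything else).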
Created attachment 24641 [details]
proxy support patch
Fixed a too-early return from httpOpenConnection.
Tested the patch above; it seems to work properly. Version 4.00.73 (KDE 4.0.73 >= 20080507).

Either there is a konqueror bug or still a problem with this patch: on pages redirecting through SSL, the connection seems to "drop", and the returned page is "unknown host". For example: gmail.

Created attachment 24758 [details]
proxy support patch
FIX:
don't drop connection on post requests;
close connection if request's proxy url differs from state's (previous
request).
Comment on attachment 24487 [details]
Merger of the previous patches
I'm marking the "Merger of the previous patches" patch as obsolete: for the
kio_http part, the newer patches by Sergey Saukh look more complete to me, and
the changes to the other files have already been applied in 4.0.4.
Norman R. Weathers reports on https://bugzilla.redhat.com/show_bug.cgi?id=443931#c12 :

> I have responded to
> http://admin.fedoraproject.org/updates/F9/FEDORA-2008-3664,
> but I thought I would put it here as well. The fixes do allow my konqueror
> to now be able to use the web proxy, but it spins up the kded4 process to near
> 100% during the request, and it can be painfully slow compared to firefox.
> It actually causes several things within the KDE interface to become
> unresponsive while it is processing the proxy request and subsequent download
> of the requested data.

(We're using the latest patch (from comment #46) in the referenced update.)

Such strange behaviour of kded4 can be caused by two modules - favicons and proxyscout. Does he use proxy autoconfiguration? If he does, does manual setting of the proxy help?

Problem still in Version 4.00.80 (KDE 4.0.80 >= KDE 4.1 Beta1), "release 4.1", using the same settings as in konqueror 3.5.9. I changed the severity to critical because it makes konqueror unusable in many business environments and is a show stopper. Error screen:

The requested operation could not be completed
Connection to Server Refused
Details of the Request:
URL: http://www.kde.org
Protocol: http
Date and Time: Thursday 29 May 2008 16:13
Additional Information: www.kde.org: Unknown error
Description: The server www.kde.org refused to allow this computer to make a connection.
Possible Causes:
The server, while currently connected to the Internet, may not be configured to allow requests.
The server, while currently connected to the Internet, may not be running the requested service (http).
A network firewall (a device which restricts Internet requests), either protecting your network or the network of the server, may have intervened, preventing this request.
Possible Solutions:
Try again, either now or at a later time.
Contact the administrator of the server for further assistance.
Contact your appropriate computer support system, whether the system administrator or technical support group, for further assistance.

*** Bug 163064 has been marked as a duplicate of this bug. ***

@Ferdinand Gassauer: because the proposed patch still hasn't been committed...

The proposed patch is broken, sorry to say. If the proxy type is HTTP, kio_http should set no proxy in Qt, since it handles proxying on its own. On the other hand, it is a good start.

Does anyone know when it will be fixed? I'm currently using KDE 4.0.80 and it still doesn't work. I hope the issue will be resolved _before_ KDE 4.1 Final is released, because it makes KDE4 totally unusable for business environments, but also for most students, as the majority of universities require a proxy.

There's no telling "when". It depends on when I have free time and the will to work on this task. Or someone can post another patch and I'll happily analyse it.

@Thiago: if the proxy type is HTTP, kio_http sets QNetworkProxy::NoProxy in Qt and handles proxying on its own. It sets a proxy in Qt only for HTTPS and SOCKS requests. It does what you say; I don't see why it is broken.

kio_http should be able to handle HTTP proxies for HTTPS requests too. Well, it used to. But, come to think of it, why not let Qt handle that one? I'll have a second look at the latest patch in that light.

I don't think it is a good idea to let Qt handle HTTP proxies, because Qt uses tunnelling (i.e. CONNECT) in QNetworkProxy::HttpProxy mode. So it is suitable only for HTTPS, not HTTP (if the proxy allows CONNECT to port 443 only, for example).

Sorry for not being precise. I meant the tunnelling (transparent, not caching) proxy mode could be handed over to Qt. We only need to implement the proxying for caching HTTP proxies.

> kio_http should be able to handle HTTP proxies for HTTPS requests too. Well,
> it used to.
But Sergey's patch is a definite improvement over the status quo which is no proxy support at all. We're shipping it in Fedora 9 and it appears to be working just fine.
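To illustrate the distinction drawn above (a protocol-level sketch, not KDE or Qt code): a caching HTTP proxy receives an ordinary request carrying the absolute URI and fetches the resource itself, while a tunnelling proxy is opened with a CONNECT request and then carries raw bytes. Sending the host name, rather than a pre-resolved IP, lets the proxy do the DNS lookup:

```python
def caching_proxy_request(host: str, path: str) -> bytes:
    # Caching proxy: the request line carries the absolute URI and the
    # proxy fetches (and may cache) the resource on the client's behalf.
    return (f"GET http://{host}{path} HTTP/1.1\r\n"
            f"Host: {host}\r\n\r\n").encode("ascii")

def tunnel_request(host: str, port: int = 443) -> bytes:
    # Tunnelling (CONNECT): the proxy opens a raw TCP connection and the
    # client then speaks TLS through it. Using the host *name* here, not
    # a pre-resolved IP, matters on networks where client machines
    # cannot resolve external names themselves.
    return (f"CONNECT {host}:{port} HTTP/1.1\r\n"
            f"Host: {host}:{port}\r\n\r\n").encode("ascii")

print(caching_proxy_request("www.kde.org", "/").decode().splitlines()[0])
# GET http://www.kde.org/ HTTP/1.1
print(tunnel_request("gmail.com").decode().splitlines()[0])
# CONNECT gmail.com:443 HTTP/1.1
```

This is exactly the failure mode reported later in the thread: a Squid log showing "CONNECT 209.85.171.83:443" instead of "CONNECT gmail.com:443" means the client resolved the name itself, which breaks behind a proxy with split DNS.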
As far as I can see, caching HTTP proxies are working just fine right now. There should be some cleanup done in the code, maybe some improvements too, but everything else is working ok. BTW, if the proxy configuration dialog could be changed to allow the proxy URL to be set in the form 'socks://proxy_address', then kio_http would handle SOCKS5 proxies as well. Thus we could have different types of proxies per protocol (caching for HTTP, tunnelling for HTTPS, and SOCKS for FTP, for example)...

The first connection seems to work, but all subsequent connections fail proxy authentication.

For some reason the manual proxy configuration disappeared from the konqueror preferences in the latest svn versions. Removing the feature is not a solution - at least not a good one ;-). Anyway, I can confirm what bds wrote - there are still unsolved problems with the patch.

Correction: subsequent connections seem to ignore proxy settings. Closing and opening konqueror doesn't help. The initial connection works fine - including downloading embedded images, etc. I assume it's a persistent connection. Related: the proxy authentication dialog does not pop to the front like a password dialog when making a request, so unless you watch the panel, konqueror just sits there staring at you and you think it's broken. (Patch applied to KDE 4.1 beta 1 -- not SVN -- I don't have the bandwidth.) I've tried automatic proxy and the http_proxy environment variables. Both exhibit "work once, not twice".

I won't have time to work on this for the next three weeks. That means not ready for KDE 4.1 unless someone takes over.

And what about the past 6 months? :-( Number three of "The most hated bugs". But never mind; until that is fixed it's just an unusable memory-eater and as such not installed ;)

I still think it's a bad joke... no proxy support in kde4... but kde4 has been a mess since it got released... waiting for 4.2 *lol*

Just use a patched KDE 4 (such as the one in Fedora).
(Sorry Thiago, I know the patch is far from perfect, but it's definitely better than no proxy support at all.)

In the past 6 months, I was busy trying to get Qt 4.4.0 released, attending conferences and participating in the integration of Trolltech and Nokia. So, no, proxy support was not high up in my list of To-Do tasks.

Hi Thiago! I don't think anybody -- well, anybody with half a clue anyway -- should be expecting you to fix the damn proxy issue all by yourself in addition to all the excellent (if obviously time-consuming) work you've done in Qt lately. Do you think, maybe, there could be a post anywhere that the Planet would pick up, asking for additional workforce on this? Other parts of the project have met reasonable amounts of success with this approach, I think. Thanks again for your patience, and all your good work in Qt.

A 4.1.1 bugfix release will also be around (for sure; I never saw a .0 without a .1 following), and bugfixes should be able to get in there. If Thiago does not get around to getting this fixed, it will probably get some attention in a bug squad.

The bug has existed for half a year - and as it seems, you could survive half a year without proxy support too.. I can, and I'm in an "http proxy only" environment... which means Firefox, which is a pretty good workaround. Additionally: I've not had too much luck with the Fedora patches either. It "works", but not always, and not stably. I'd rather wait for a real fix than use something like that.

Andreas Hartmetz has committed this just before 4.1 branched: http://websvn.kde.org/?view=rev&revision=829797

As a work-around, try something along the lines of this: http://www.faqs.org/docs/Linux-mini/TransparentProxy.html#s6 My friend says he does something like this with tinyproxy. I don't know how well it works, but I'll test it when I get back to university in October. If anyone else gets it working in the meantime, post here.
HTTP proxy support works fine for me now in KDE 4.1 RC1, in Konqueror as well as in Akregator, for instance. I would like to mark this as WORKSFORME or FIXED, but this is not "my" bug :-)

I definitely can't confirm this. This was a MINIMAL fix applied, and with 4.1 rc1 running I see the same issues (work once, not twice) etc. So I'd rather leave it open until the proxy support is really fixed.

(Using KDE 4.1.00.) HTTP works fine when using Konqueror or Akregator, but I encounter an error as soon as I try to browse a website using HTTPS. Proxy support for Kopete and Amarok2 doesn't work at all.

Well, I don't encounter any problems with HTTPS in general. The problem is still "nested" connections. I expect this is also the problem with gmail.

I always get the "Unknown Host" error when I try to connect to gmail (I don't know why though).

I can't reproduce any errors using a proxy in Konqueror (unauthenticated Squid proxy). I'll keep on browsing the web using the proxy to see if I can reproduce any errors.

Using KDE trunk (r845780) I can use konqueror and akregator, but kget is not working behind proxies. The requests from kget do not reach the proxy server. I'm using KDE 4.1 packages from OpenSuse 11.0. Forgot to say that I'm behind an authenticated proxy (squid). Should I open a new bug for kget, or is this problem related to this bug? Did somebody else reproduce the kget problem?

I'm behind an unauthenticated squid proxy. I still have my struggles with gmail, also in the revision given above (r845780, where the tester claims it works properly). I will have to investigate further; as soon as my notebook is up and running again I'll compile the latest SVN version.

https stops working when I enable a proxy for http in Konqueror 4.1.64. MSN stops working in Kopete when I enable http proxy support in Konqueror. (The problems outlined in comment #77 are still there.)

*** Bug 170670 has been marked as a duplicate of this bug. ***
*** Bug 170722 has been marked as a duplicate of this bug.
***

I don't know if this is related (I reported bug #170722, which was marked a duplicate of this), but what I experience is that when I set SOCKS support in the proxy settings, then every time I enter the settings I get a popup message saying I need to restart (although I just did...). Also the SOCKS configuration dialog is not able to load the proxy library (I click on "Test" and it errors), and obviously SOCKS doesn't work in any KDE application (in KDE 3.5.x it worked in any KDE application other than konqueror, but that was still fine). I am using KDE 4.1.1 (it did the same in 4.1.0).

*** Bug 170486 has been marked as a duplicate of this bug. ***

Hello all. In the release notes of Qt 4.4.2, I found this line: "[217091] Fixed a bug that made the HTTP backend issue CONNECT commands for HTTP (not HTTPS) requests to proxy servers". Does this mean we can now delegate all the (non-caching) proxying to Qt, and then declare mission accomplished, go home and have a beer? (Not that Thiago doesn't deserve a beer regardless, mind you!)

No, that's a completely separate subject. It's an API we don't use in kio.

Hi, I have had a similar discussion on the kde-devel lists: http://lists.kde.org/?l=kde-devel&m=122151654802840&w=2 I thought the problem was resolved in Fedora 4.1.1, but it's not. It's working when I use an authenticated squid proxy with a wired connection, but not when I use a wireless connection to a no-auth proxy which itself forwards to another no-auth proxy. It's a rather annoying issue, since I have to use many workarounds while waiting for this to be fixed:
* use a mail tunnel to get mail in KMail (then checking mail on localhost is no problem, since it does not need any proxy)
* use proxychains to inject proxy support into Kopete
* use Firefox for https websites, or add the IP in /etc/hosts, after which Konqueror is able to reach it - except for example GMail, which never resolves to the same IP.
* akregator cannot check feeds from an https website, unless I add the IP in /etc/hosts :(

Well, just use a no-auth squid proxy and try to surf (using konqueror) to gmail. That should be enough to get an error showing it isn't working with nested connections either. As it seems, this bug won't be fixed in 4.1.2 either. I'm hoping it'll be fixed in 4.2 at least.

*** Bug 172813 has been marked as a duplicate of this bug. ***

I do not suffer from this bug anymore. Using Mandriva 2009.0, all proxy connections seem to be fine for me; authentication is asked for when needed.

Vincent, do you know if that is a Mandriva-specific fix?

While someone pokes at this issue, it would be great if you could consider http://bugs.kde.org/show_bug.cgi?id=155383 and http://bugs.kde.org/show_bug.cgi?id=155385 as well. It would tie in nicely & be a great feature for all users :)

When I posted comment #1, HTTP proxying with an authenticating proxy didn't work, and now it does. But after reading some of the comments above, it seems this bug is more about HTTPS through a proxy, and I haven't tested that yet. I'm going to test HTTPS and FTP and will report the results here. Mandriva 2009.0 is shipped with KDE 4.1.2, but if I look at the changelog, there doesn't seem to be any specific fix / workaround for this bug.

KGet still does not work with an auth proxy. Does somebody have the same problem?

I'm just testing with r881602. While https did work with a proxy in KDE 4.1.2, it doesn't work now; http works though. So could this be the same/another regression?

Both HTTP (like Squid) and SOCKS proxies should generally work in trunk or, later, 4.2. We're working on the remaining problems :)

KDE 4.2 Beta 1 gives, through squid, manual and automatic proxy: http works.
https fails (it does ask for username and password):

konqueror(12909) FixHostUriFilter::filterUri: FixHostUriFilter::filterUri: KUrl("https://gmail.com/")
UNEXPECTED RESPONSE: [HTTP/1.0 404 Not Found Server: squid/2.5.STABLE5 Mime-Version: 1.0 Date: Thu, 04 Dec 2008 13:06:05 GMT Content-Type: text/html Content-Length: 1182 Expires: Thu, 04 Dec 2008 13:06:05 GMT X-Squid-Error: ERR_DNS_FAIL 0

The Squid proxy log says it's connecting to the IP directly, instead of the host name:
CONNECT 209.85.171.83:443
CONNECT 64.233.161.83:443
instead of
CONNECT gmail.com

Please upgrade to Qt 4.5 to get connections to hostnames.

I use KDE 4.1.3 in OpenSuse 11.1 RC1. http and https work well through a proxy with auth, but ftp does not work at all! I get the error message "URL cannot be listed". Have we got some solution for this problem?

Nested connections are still not fixed. I don't know why this was considered "fixed" when it seems as if it wasn't even fully tested. I'm not using any socks proxy, but a normal squid http proxy without authentication.
=== Error case, gmail === konqueror(6290) FixHostUriFilter::filterUri: FixHostUriFilter::filterUri: KUrl("http://www.gmail.com") konqueror(6290) KonqMainWindow::openFilteredUrl: url "www.gmail.com" filtered into KUrl("http://www.gmail.com") konqueror(6290) KonqMainWindow::openUrl: url= KUrl("http://www.gmail.com") mimeType= "" _req= "[typedUrl=www.gmail.com newTabInFront]" view= QObject(0x0) konqueror(6290) KonqMainWindow::openUrl: Creating new konqrun for KUrl("http://www.gmail.com") req.typedUrl= "www.gmail.com" konqueror(6290)/kparts KParts::BrowserRun::scanFile: KUrl("http://www.gmail.com") konqueror(6290)/kio (Slave) KIO::Slave::createSlave: createSlave "http" for KUrl("http://www.gmail.com") konqueror(6290)/kio (KIOConnection) KIO::ConnectionServer::listenForRemote: Listening on "local:/tmp/ksocket-n501931/konquerorAD6290.slave-socket" konqueror(6290)/kio (KIOJob) KIO::TransferJob::slotRedirection: KUrl("http://www.gmail.com/") konqueror(6290) KonqRun::slotRedirection: KUrl("http://www.gmail.com") -> KUrl("http://www.gmail.com/") konqueror(6290)/kio (KIOJob) KIO::TransferJob::slotRedirection: KUrl("http://mail.google.com/mail/") konqueror(6290) KonqRun::slotRedirection: KUrl("http://www.gmail.com/") -> KUrl("http://mail.google.com/mail/") konqueror(6290)/kio (KIOJob) KIO::TransferJob::slotRedirection: KUrl("https://www.google.com/accounts/ServiceLogin?service=mail&passive=true&rm=false&continue=http://mail.google.com/mail/?ui=html&zy=l&bsv=1k96igf4806cy<mpl=default<mplcache=2") konqueror(6290) KonqRun::slotRedirection: KUrl("http://mail.google.com/mail/") -> KUrl("https://www.google.com/accounts/ServiceLogin?service=mail&passive=true&rm=false&continue=http://mail.google.com/mail/?ui=html&zy=l&bsv=1k96igf4806cy<mpl=default<mplcache=2") konqueror(6290)/kio (Slave) KIO::Slave::createSlave: createSlave "https" for 
KUrl("https://www.google.com/accounts/ServiceLogin?service=mail&passive=true&rm=false&continue=http://mail.google.com/mail/?ui=html&zy=l&bsv=1k96igf4806cy<mpl=default<mplcache=2") konqueror(6290)/kio (KIOConnection) KIO::ConnectionServer::listenForRemote: Listening on "local:/tmp/ksocket-n501931/konquerorbc6290.slave-socket" konqueror(6290)/kio (KIOJob) KIO::SlaveInterface::dispatch: error 114 "Host not found" konqueror(6290)/kparts KParts::BrowserRun::slotBrowserScanFinished: 114 konqueror(6290) KonqRun::handleError: KonqRun::handleError error: "Unknown host Host not found" konqueror(6290) KonqMainWindow::openView: "text/html" KUrl("error:/?error=114&errText=Host not found#http://www.gmail.com") childView= KonqView(0x8b17d50) req: "[typedUrl=www.gmail.com newTabInFront forceAutoEmbed]" konqueror(6290) KonqFactory::createView: Trying to create view for "text/html" "" konqueror(6290)/kdecore (trader) KMimeTypeTrader::query: query for mimeType "text/html" , "Application" : returning 4 offers konqueror(6290)/kdecore (trader) KMimeTypeTrader::query: query for mimeType "text/html" , "KParts/ReadOnlyPart" : returning 3 offers konqueror(6290) KonqFactory::createView: "khtml" : X-KDE-BrowserView-AllowAsDefault is valid : false konqueror(6290)/kdecore (KLibLoader) findLibraryInternal: plugins should not have a 'lib' prefix: "libkhtmlpart.so" konqueror(6290)/kdecore (KLibLoader) findLibraryInternal: plugins should not have a 'lib' prefix: "libkhtmlpart.so" konqueror(6290)/kdecore (KLibLoader) kde4Factory: The library "/usr/lib/kde4/libkhtmlpart.so" does not offer a qt_plugin_instance function. 
konqueror(6290)/khtml KHTMLFactory::KHTMLFactory: KHTMLFactory(0x90eb868) konqueror(6290)/kparts KParts::Plugin::pluginInfos: found KParts Plugin : "/usr/share/apps/khtml/kpartplugins/kget_plug_in.rc" konqueror(6290)/kparts KParts::Plugin::pluginInfos: found KParts Plugin : "/usr/share/apps/khtml/kpartplugins/khtmlkttsd.rc" konqueror(6290)/kparts KParts::Plugin::loadPlugins: load plugin "khtml_kget" konqueror(6290)/kdecore (KLibLoader) kde4Factory: The library "/usr/lib/kde4/khtml_kget.so" does not offer a qt_plugin_instance function. konqueror(6290)/kparts KParts::Plugin::loadPlugins: load plugin "khtmlkttsdplugin" konqueror(6290)/khtml (part) KHTMLPart::~KHTMLPart: KHTMLPart(0x8b17e08) konqueror(6290)/kparts KParts::Part::~Part: deleting widget QWidget(0x8b3d3f0) "" konqueror(6290) KonqView::openUrl: url= KUrl("error:/?error=114&errText=Host not found#http://www.gmail.com") locationBarURL= "error:/?error=114&errText=Host not found#http://www.gmail.com" konqueror(6290)/khtml (part) KHTMLPart::openUrl: KHTMLPart(0x8ac5900) opening KUrl("error:/?error=114&errText=Host not found#http://www.gmail.com") konqueror(6290)/khtml (part) KHTMLPart::htmlError: errorCode 114 text "Host not found" === OK Case, google === konqueror(6290) FixHostUriFilter::filterUri: FixHostUriFilter::filterUri: KUrl("http://www.google.at/") konqueror(6290) KonqMainWindow::openFilteredUrl: url "http://www.google.at/" filtered into KUrl("http://www.google.at/") konqueror(6290) KonqMainWindow::openUrl: url= KUrl("http://www.google.at/") mimeType= "" _req= "[typedUrl=http://www.google.at/ newTabInFront]" view= QObject(0x0) konqueror(6290) KonqMainWindow::openUrl: Creating new konqrun for KUrl("http://www.google.at/") req.typedUrl= "http://www.google.at/" konqueror(6290)/kparts KParts::BrowserRun::scanFile: KUrl("http://www.google.at/") konqueror(6290)/kparts KParts::BrowserRun::slotBrowserMimetype: found "text/html" for KUrl("http://www.google.at/") konqueror(6290) KonqMainWindow::openView: 
"text/html" KUrl("http://www.google.at/") childView= KonqView(0x8b17d50) req: "[typedUrl=http://www.google.at/ newTabInFront forceAutoEmbed]" konqueror(6290) KonqFactory::createView: Trying to create view for "text/html" "" konqueror(6290)/kdecore (trader) KMimeTypeTrader::query: query for mimeType "text/html" , "Application" : returning 4 offers konqueror(6290)/kdecore (trader) KMimeTypeTrader::query: query for mimeType "text/html" , "KParts/ReadOnlyPart" : returning 3 offers konqueror(6290) KonqFactory::createView: "khtml" : X-KDE-BrowserView-AllowAsDefault is valid : false konqueror(6290)/kdecore (KLibLoader) findLibraryInternal: plugins should not have a 'lib' prefix: "libkhtmlpart.so" konqueror(6290)/kdecore (KLibLoader) findLibraryInternal: plugins should not have a 'lib' prefix: "libkhtmlpart.so" konqueror(6290)/kdecore (KLibLoader) kde4Factory: The library "/usr/lib/kde4/libkhtmlpart.so" does not offer a qt_plugin_instance function. konqueror(6290) KonqView::changePart: Reusing service. Service type set to "text/html" konqueror(6290) KonqView::openUrl: url= KUrl("http://www.google.at/") locationBarURL= "http://www.google.at/" konqueror(6290)/khtml (part) KHTMLPart::openUrl: KHTMLPart(0x8ac5900) opening KUrl("http://www.google.at/") konqueror(6290)/kio (Scheduler) KIO::SchedulerPrivate::findIdleSlave: Resume metadata is "" konqueror(6290)/kio (Scheduler) KIO::SchedulerPrivate::findIdleSlave: HOLD: Reusing held slave for KUrl("http://www.google.at/") konqueror(6290)/khtml (html) DOM::HTMLDocumentImpl::changeModes: using compatibility parseMode between, this was tested in a quite current trunk snapshot (December, 21st). Version 4.1.86 (KDE 4.1.86 (KDE 4.2 >= 20081221)) between or btw? I will take your word for it and re-open the bug so it gets more exposure once again. I won't. Sorry. 
Your trace indicates that the request to http://www.gmail.com was properly sent and then redirected to https://www.google.com/accounts/ServiceLogin?[more stuff], which then failed. That means http worked and https didn't. Please upgrade to Qt 4.5 (currently in beta), where I fixed this bug about 4 months ago.

Interesting. I'll give that a shot and rebuild Qt this evening, so I can test this tomorrow at work. It's certainly tested against Qt 4.4.3. Hopefully Qt 4.5 will be released sooner or later, when KDE 4.2 is out :-).

Thiago: Argh, sorry! What about ftp? It doesn't work today!

Well, it still doesn't work using Qt 4.5.0-beta1. Is it supposed to work with the beta1?

Yes, it is. Can you confirm that the kio_http processes that you launched are running with the Qt 4.5.0 beta1 that you compiled? Please killall kio_http, browse to a website requiring https and then run "pmap `pidof kio_http` | grep QtNetwork". The listing should point to the Qt 4.5.0 beta1 files you've just built. Also note that I won't have time to try this out myself for the next week, until I'm back in the office.

Hmh, I think I can confirm it is running with QtNetwork version 4.5:
b7646000 836K r-x-- /usr/lib/libQtNetwork.so.4.5.0
b7717000 12K rwx-- /usr/lib/libQtNetwork.so.4.5.0
(the same two mappings repeated for each kio_http process)

Before I forget to point out what's going wrong: I can browse, for example, on https://online.bankaustria.at (hopefully that one really makes use of https, since I can even log in :D).
I can't browse http://www.gmail.com or http://www.google.com/mail, which work perfectly using Firefox and the same proxy. In konqueror I still get Unknown Host in this case. Also, I set up a small svn repo on my server (no official certificate, but for testing) using webdav, which can be viewed using http and https:
http://svn.stiat.net - works
https://svn.stiat.net - does not work, with the same "Unknown Host" error.
This svn server of mine uses a rewrite rule, and it's a virtual host on my machine. Maybe it's an error somewhere deeper, since I trust my bank that online banking actually makes use of https. Maybe it just didn't show up in the first tests.

Please try this:
1) make sure you've built kdelibs with debugging information turned on
2) make sure you've started that KDE
3) make sure the kio_http debug area is enabled (use kdebugdialog to verify and change)
4) browse to an HTTPS website that works
5) killall kio_http
6) browse to an HTTPS website that doesn't work
7) open ~/.xsession-errors and find the debug output matching items 4 to 6. Attach everything here.
Also, use Qt 4.5; I'm not going to fix Qt 4.4 problems anymore.
[debug] .xsession_errors https nok
The xsession debug output with kdelibs-debug and kio_http_debug enabled.
Created attachment 29780 [details]
[debug] .xsession_errors https ok
.xsession_errors with kdelibs-debug and kio_http_debug enabled, browsing a working https page.
Hi Georg, I see it didn't work, but I don't see the reason why in your logs. I'll need you to go one step further, by capturing your traffic:

tcpdump -pnXs 1500 -w /tmp/trace port 8080

Do that for both websites (the one that worked and the one that didn't). Don't worry about leaking sensitive content, since the connections are encrypted. However, this may leak the proxy password, if you're using one. If you are using a password, send the dumps directly to me, please. If your proxy doesn't require a password (simple IP range checking), please attach them to the bug report.

Created attachment 29782 [details]
TCP trace of the https site i can load
TCP Dump of a https site I can load.
Created attachment 29783 [details]
TCP Trace of the https site i can't load
The TCP trace of the nok page is quite short in this case. Hopefully it helps a little (I can't even read what it's doing). The proxy does not need any kind of authentication, so I posted the traces here. Kind regards, Georg

Yeah, it's very short: it's empty :-) But that's noteworthy, especially considering the one that works is connecting by IP, not by hostname like I expected it to. I guess the reason it is broken again is bug 162600 (the patch that was applied to "work around" the issue makes the application do hostname resolution and cache the results, instead of letting the ioslaves do the work). So my guess is that your DNS servers do not allow resolving all hostnames, but only a select few. The websites that work are those that it resolves for you. And that's the reason I had to do a major incursion into the QtNetwork code 4 months ago. Anyway, I'll try to mess with the code when I'm back in the office next week.

It's quite possible that it connects by IP and not by DNS, since I'm working in the IT subcompany of two banks, which means our DNS knows those hosts and does not need to look them up further. I just checked that again: all HTTPS sites which are in our network can be loaded, everything on the web can't (gmail, paypal, my https). I won't be in my office next week (till the 7th of January), but if you need any further information or patches tested, the office is "just around the corner", meaning not more than 30 minutes away. Kind regards, Georg

Just curious about the following: does Konqueror always try to resolve the domain before it uses the proxy to connect? In our network, we have the following setup as far as I know: 1.) Direct connections to the LAN. That means our DNS on the workstations can only resolve addresses which are on our LAN - the DNS does not know any other domains than those internal ones. 2.) External connections use the proxy. The proxy tries to resolve the domain, and if it can't, it returns an error page to the client computer.
So, if I'm not mistaken (I'm not the expert in this kind of thing), a kio slave using a proxy should never try to resolve a host when a proxy server is set. Instead, the client should just send the request to the proxy, which does the work of resolving, receiving and forwarding to the client. Is there any case where the domain needs to be resolved if a proxy is used? Basically, if I set a proxy, the proxy should always be used - except for those hosts where I defined exceptions (the list of exceptions where no proxy should be used). I'll walk through the KDE tutorials on building KDE / Qt in the latest trunk versions during my vacation. Till now I was lucky enough that kdemod provides svn packages and -debug versions of those.

Right now, what is happening is that there is a DNS resolution made before the connection to the proxy is attempted. Then we give the proxy the destination's IP address. What I'm going to do is fix that to send the hostname to the proxy server.

That would be "better" anyway, always passing the host / domain (except for the defined exceptions) to the proxy. Considering virtual hosts by name (apache), connecting by IP address would give a wrong result.

The only place I can think of where it's OK for a proxy client to resolve DNS before using the proxy is where an automatic proxy (WPAD) is in use. The automatic proxy might list "no proxy for 192.168.*.*", or some such rule. However, that's for the TCP connect(), not the HTTP GET. The virtual host argument means that you should always use a hostname in "GET http://host/page HTTP/1.1". You should never use an IP in the GET request unless the user typed an IP in the address bar. It doesn't matter if it's a direct or proxy connection: always use the hostname if you've got one. For the automatic proxy, it's debatable whether this is an error in the WPAD file anyway. In this case, the WPAD should list "no proxy for localdomain.com", or the user should be using an IP in the URL.
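The behaviour discussed above can be sketched roughly as follows (a minimal illustration in Python with a hypothetical helper name, not KDE code): with a proxy configured, the client puts the absolute URL in the request line and leaves name resolution to the proxy, so the target hostname is never resolved locally.

```python
# Illustrative sketch only (hypothetical helper, not kio_http code):
# how the request line differs with and without an HTTP proxy.
def request_lines(host, path, via_proxy):
    if via_proxy:
        # Absolute-form request line: the proxy resolves `host`;
        # the client only ever resolves the proxy's own address.
        request = "GET http://%s%s HTTP/1.1" % (host, path)
    else:
        # Origin-form request line for a direct connection.
        request = "GET %s HTTP/1.1" % path
    # The Host header carries the name either way, which is what
    # name-based virtual hosting relies on.
    return [request, "Host: %s" % host, ""]
```

Either way the hostname, not an IP address, travels in the request, which is the point made about virtual hosts above.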
But if you can, for sure do DNS before TCP connect(). (On our network, DNS requests work, because ping and some other connections are NAT-ed, to enable users to do some support and network diagnostics.)

I agree completely with you guys. But this is not HTTP GET: this is HTTP CONNECT.

That doesn't make a difference. HTTP CONNECT is listed as an HTTP method in RFC 2616, section 9 (although it's reserved for tunneling, but not clearly specified). RFC 2616 section 9 also states that "The Host request-header field (section 14.23) MUST accompany all HTTP/1.1 requests." Section 14.23 states that "The Host field value MUST represent the naming authority of the origin server or gateway given by the original URL." That's a requirement for virtual hosting. RFC-correct behaviour is to use the hostname (as given in the URL), not the IP. Even for CONNECT. This raises an interesting point on the difference between a Host: request header and specifying the host as part of the URI, because the header is usually omitted in CONNECT, but I still read it as correct to specify the name, and not the IP.

Hi! Does anybody use 4.2 RC behind an http proxy with authorization? I have a weird problem: kio_http always asks for the proxy username and password, but they are already stored in KWallet, see my bug report here: https://bugs.kde.org/show_bug.cgi?id=179947 Can this be related to the fixes for this bug that went into 4.2? In 4.1 kio_http used the proxy username/password from KWallet.

Sorry, I don't have an auth proxy to test your behaviour, so I will not be able to provide any debug information on this.

@Thiago: Anything new on this topic? Any way to provide you help? Kind regards, Georg

Sorry, still too busy with Qt 4.5 stuff. Besides, when that is out, we'll require an update in the proxy code anyway.

Still there. Ubuntu/Jaunty RC, Qt 4.5.0, KDE 4.2.2

Same here. Fedora 10, Qt 4.5.0, KDE 4.2.2. Can't get to any https website with Konqueror :( I hope that someday Konqueror can be as good as it was in 3.5.9.
That's probably because this isn't solved. I think Thiago (or whoever will fix this) will flag this one as fixed as soon as the problem is fixed. Comments that it's not working... well, they know it's not working ;-)... Although it's annoying, some day somebody will take care of this. The only thing you can do meanwhile is use another browser.

If you can read this, Konqueror is working with an HTTP proxy requiring (Basic) authentication. My NTLM server is not working and NTLMv2 is not supported. There are no plans to add NTLMv2 support, unless someone contributes a patch (to QAuthenticator). I tested Konqueror from trunk (kdelibs 948588, kdebase 948639), using Qt 4.5.1. Network traffic indicates that the proper GET commands are sent for non-HTTPS webpages. I will continue browsing the net for a few days with Basic Auth and, if nothing appears, I will re-close this bug.

I'd like to test this too... if anonsvn works some day again :D. Is Qt 4.5.1 required?

No, there are no changes in Qt since 4.5.0 that would make it work. So I have to believe 4.5.0 works too. As for anonsvn... that's a bug in the Subversion software. Our sysadmins are working towards finding a solution (see Adriaan's blog).

(In reply to comment #136)
> My NTLM server is not working
So how can you test?
> Network traffic indicates that the proper GET commands are sent for non-HTTPS webpages.
Could you be a bit more specific: HTTP works but HTTPS does not? Have you tried FTP as well?

>> My NTLM server is not working
> So how can you test ?
I didn't. I have not tested NTLM. I have no idea if it's working. But there's no reason to believe it's not: the code was working before our server broke and hasn't changed. And since it's exactly the same interface as the Basic authentication (which is working), it should work too.

> Could you be a bit more specific
Yes: HTTP and HTTPS work. HTTP webpages send GET commands to the proxy server; HTTPS webpages send CONNECT commands to the proxy server.
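The GET-versus-CONNECT distinction described above can be sketched like this (illustrative Python with a hypothetical helper name; the real work happens inside Qt/kio_http): for HTTPS the client asks the proxy for a raw tunnel, and per the RFC discussion earlier the authority should be the hostname from the URL, not a locally resolved IP.

```python
def connect_request(host, port=443):
    # Hypothetical sketch of the tunnel request sent to an HTTP proxy
    # for an HTTPS URL.  Using the hostname (authority-form) rather
    # than an IP keeps name-resolving and name-filtering proxies
    # working; sending a resolved IP here is what broke some of the
    # setups described in this report.
    return ("CONNECT %s:%d HTTP/1.1\r\n"
            "Host: %s:%d\r\n"
            "\r\n") % (host, port, host, port)
```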
FTP-over-HTTP is not working: konqueror(16869)/kio (Slave) KIO::Slave::createSlave: createSlave "http" for KUrl("ftp://ftp.trolltech.com") konqueror(16869)/kio (KIOConnection) KIO::ConnectionServer::listenForRemote: Listening on "local:/tmp/ksocket-tmacieir/konquerorG16869.slave-socket" klauncher(4760)/kio (KLauncher) KLauncher::requestSlave: KLauncher: launching new slave "kio_http" with protocol= "http" args= ("http", "local:/tmp/ksocket-tmacieir/klauncherMT4760.slave-socket", "local:/tmp/ksocket-tmacieir/konquerorG16869.slave-socket") kdeinit4: Got EXEC_NEW 'kio_http' from launcher. kdeinit4: preparing to launch klauncher(4760)/kio (KLauncher) KLauncher::processRequestReturn: "kio_http" (pid 17867) up and running. konqueror(16869)/kio (KIOJob) KIO::StatJobPrivate::slotRedirection: KUrl("ftp://ftp.trolltech.com/") konqueror(16869) KProtocolManager::slaveProtocol: slaveProtocol: "http" ASSERT: "m_protocol == u.protocol().toLatin1()" in file /home/tmacieir/src/kde4/KDE/kdelibs/kioslave/http/http.cpp, line 520 kioslave: ####### CRASH ###### protocol = http pid = 17867 signal = 6 Thanks, it confirms what I see here (with an NTLM proxy). In this case, would you still close this bug report even if "FTP over HTTP" doesn't work ? I hope not ! Qt 4.5.2, KDE 4.3rc1 Proxy basically works, https proxy works. FTP over HTTP does not. Nested connections still do not work (see gmail). Calling it still results in "unknown host". Allthough, that could be a kio or konqueror bug, since proxy with arora just works fine (and i think they're using qt as well for the proxy stuff). So I guess that one is not completely fixed, but a part is fixed, so we can at least do a bit websurfing in konqueror. Ops, my mistake. HTTPS does not work at all. Will try rc2 today (when compiling is done). HTTPS websites over HTTP proxies work for me. They have for months. What do you mean by nested connections? Georg can you test SOCKS proxy support? 
(the easiest way to have a SOCKS server without installing almost anything is to just have the openssh client, log in somewhere remotely with the "-D <localport>" ssh parameter, and then configure your SOCKS client to use that port on the local machine. The SOCKS connections should then exit from the remote machine)

I can test SOCKS5, since we have a SOCKS5 proxy at work as well. I'll do so on Monday.

Thiago: Still, I seem to suffer from the resolving problem discussed above with you (comment #123). I can only resolve some things locally; the proxy server actually resolves the addresses for me. I think I get the "unknown host" since my system indeed does not know the host. It's often expected that the name resolution is done by the proxy server. Therefore, it doesn't work for me, and I think this is a bug which should be fixed before closing this one. Of course it's a different setup than you use, but it's a possible setup.

The problem you're seeing is the result of a separate bugfix, a workaround added because of broken routers. The applications cache the hostname resolution. I'd like to revert the workaround, but I don't have the support for that. I'm told that we should instead simply not use the workaround when there's a proxy server involved. I'd rather revert the workaround for broken stuff that has since been corrected.

Thiago, I'm sorry, but I don't seem to understand what you're writing (maybe I'm just not smart enough, sorry to bother you :/). I can't work out what can be done about this, or what the broken-router fix has to do with the problem I'm experiencing. I guess whyever the fix for broken routers was implemented, it's still reasonable. People don't do stuff for no reason. Can't there be a way to address both? Where is the problem I experience located? Is it Qt internally, or an implementation detail in kio_http?
I'd really love to help with whatever I can, but as you may have noticed I'm not really an expert in things like this (and probably should leave this to guys who really do know better than me).

I was talking about bug 162600. Brief explanation: some German company sold the most popular routers in Germany with a broken firmware. It cannot handle multiple DNS requests to the same hostname, or multiple DNS requests at all; I don't remember why. Fact was: Konqueror failed to connect. We told them; they wouldn't fix it. Our workaround was to make the applications (not the ioslaves) resolve the hostname and therefore cache it. That's why name resolution via the proxy server isn't working: the application is doing a DNS resolution, which can only work if your DNS servers work. However, since we added this workaround, Safari started having problems too, hitting the same bug. Then the company got off their rears and made a fix. Hence we no longer need the workaround.

As for the benefit of a local DNS cache, I question it. The DNS cache should be global, for all applications. That's what a caching nameserver or nscd is meant for. The problem is, some (most?) Linux distributions recommend against nscd but no one fixes it. And, of course, the next solution for this (ConnMan) is facing an uphill battle against NetworkManager.

Regarding proxies to https sites: I have not been able to use Konqueror to connect to any https site since 4.0; a proxy is required where I work. I'm currently using the KDE 4.3-rc2 packages. I have the 'use same proxy for all protocols' option enabled. I have tried to connect to the internet in two different ways. 1) tell Konqueror to connect to a locally running squid proxy server, which in turn changes the user-agent to Firefox (since work also filters based on user-agent), then connects to work's proxy.
(Perhaps this scenario is what a previous poster meant by 'nested connections') 2) tell Konqueror to connect directly to work's proxy server and send a user-agent of Firefox. Both scenarios have 'use same proxy for all protocols' enabled and both result in 'SSL negotiation failed'. Yet both work for accessing HTTP sites. I'm pretty sure both worked with Konqueror version 3.5. I wish I could provide more information about the proxy server that is in use, but I have no control over it and I'm not sure how I'd find out information on the proxy. Either way, HTTPS still does not work with a proxy in KDE and it seems that other people are having this same problem.

Please try this:
1) killall konqueror kio_http
2) sudo tcpdump -pni eth0 -s 1500 -w /tmp/packet.trace port $PORT_OF_HTTP_PROXY or port 53
3) open konqueror and enter: https://bugs.kde.org
4) wait for something to show up in konqueror (page or error message)
5) Ctrl+C the tcpdump process
6) attach the trace file here

Also please include:
- does your /etc/resolv.conf contain a valid nameserver?
- run: host bugs.kde.org; do you see an error?

KDE 4.2 (downloading 4.3.0 rc 2 atm). Result of 'host bugs.kde.org':

$ host bugs.kde.org
bugs.kde.org has address 138.246.255.179
bugs.kde.org mail is handled by 1 bugs.kde.org.

/etc/resolv.conf points to the corporate DNS server. It cannot point to an Internet DNS server. Konqueror does indeed ask for the correct proxy username and password, and seems to log in to the proxy OK. The proxy requires CONNECT by name, not IP. The error message:

The requested operation could not be completed
Connection to Server Refused
Details of the Request:
URL: https://bugs.kde.org/
Protocol: https
Date and Time: Tuesday 14 July 2009 13:14
Additional Information: bugs.kde.org: Unknown error
Description: The server bugs.kde.org refused to allow this computer to make a connection.
Possible Causes:
The server, while currently connected to the Internet, may not be configured to allow requests.
The server, while currently connected to the Internet, may not be running the requested service (https). A network firewall (a device which restricts Internet requests), either protecting your network or the network of the server, may have intervened, preventing this request. Possible Solutions: Try again, either now or at a later time. Contact the administrator of the server for further assistance. Contact your appropriate computer support system, whether the system administrator, or technical support group for further assistance. Sorry, I didn't attach the packet dump, because the browser or the server lost my login cookie when I attached it. Doh! Will do again later today. Created attachment 35317 [details]
packet traces from behind a strict proxy
packet traces from behind a strict proxy resulting in konqueror, and anything using kio_http(s) (4.3rc2), failing for all https sites. The error returned in konqueror is 'SSL negotiation failed'.
Initially it appears that my problem with konqueror lies in the User-Agent string, or lack thereof. The proxy i'm stuck behind filters on User-Agent strings (as explained in my comment above) and it seems that konqueror does not send the User-Agent string for https connections. Also my machine appears to be resolving the hostnames correctly. $> host bugs.kde.org bugs.kde.org has address 138.246.255.179 bugs.kde.org mail is handled by 1 bugs.kde.org. There's no User-Agent for proxy connections. Konqueror doesn't send it nor needs to. (This is all handled by Qt anyway) In any case, the proxy connection was established according to your trace. The server replied with a 200 code, indicating success. Wireshark also reports that the SSL session was initiated. However, the bkn.packet.trace.konq.https file is corrupt after the first SSL packet, so I can't tell why it didn't succeed. Can you try again? Please wait one minute after the error page appears. Also, please run a version of KDE that has debugging messages enabled. Using kdebugdialog, ascertain that areas 7107, 7027 are enabled. Then, after you get the errors, check if ~/.xsession-errors contains relevant information, like SSL negotiation debugging. Created attachment 35320 [details]
packet traces from behind a strict proxy (take 2)
I hope this trace is more useful.
Sorry, brad, your proxy server is just plainly broken. You were right, it's filtering on User-Agent. That's stupid. But if your system administrators want you to use a specific web client, I suggest you follow their recommendation. Or get them to change it. Second, and this is the brokenness, the proxy server replied with a 200 Connection Established (see packet 14 in your trace), but then instead of doing what it said it had done (establish the connection), the proxy server sent an HTTP/1.1 403 Forbidden response (packet 16). If you look in the body of the response (packet 18) you see that it's because of the User-Agent. Qt, obviously, cannot handle this brokenness. The server said it had connected. So Qt expects an SSL transmission to start after that, which it obviously didn't. And you never get to see the error message because it was placed at the wrong stage. Sorry, no fix for this. At least, not in KDE's side. The change needs to happen in Qt, if at all. And Qt will never send the Firefox user-agent. It may send just a dummy one, at most. Ok, with 4.3 RC Konqueror gets a 404 because it uses IP for CONNECT instead of name. Created attachment 35358 [details]
Packet trace of the 404.
Packet trace of the 404.
This is somewhat what I expected; that is, that the proxy server is broken to some degree. I'm glad that I understand the problem (thank you, Thiago, for the tcpdump command line). Sadly I'm not in a position to make a proper fix for the proxy server or its policies, however absurd they may be. It is notable that this problem will surface for anyone else using KDE behind a user-agent-filtering proxy. I've been using Firefox, which sends the user-agent string over https for some reason, perhaps to combat this specific problem; and I guess I'll continue to use Firefox. Unfortunately this proxy problem not only affects the web browser, but anything that uses KDE and kio_http. Again, I'm glad I understand the problem better, but I'm sad that the fix is to not use KDE. Now that I understand things a little better, I'll try to work around it locally, somehow. Perhaps I'll try to alter the source code so I can use KDE; of course svn is blocked too :). (The internet and my workplace do not get along, and for no good reason too... bogus.)

I also had to stop using KDE solely due to this bug. I was unable to use KDE behind the Hewlett-Packard internal web proxies. Which is sad. I had been a strong KDE advocate since 2.0.

@brad: your problem is not about web browsers. Anything in your system that tries to use a proxy will be affected, including command-line tools. It can be considered a bug in Qt, so I will fix it there, by setting a User-Agent string. It will *not* be Firefox's. Since it's a Qt bug, please don't raise this again on bugs.kde.org. (I'll let you know when I commit a fix.)

@Berend: you're using a Squid server and the dump that you pasted is very weird. I have Squid 3.0 here to test (yours is 2.5) and it works with the traffic you pasted. I can't see any option in my config to enable the behaviour you got. In any case, your problem is still open and will be fixed. By the way, you may want to change your proxy password.
@Eric: you didn't supply a packet dump, so I can't know what your problem is. I'll assume it's the same as Berend's.

@brad: your problem has been fixed now in Qt. See http://qt.gitorious.org/qt/qt/commit/81049e49971a9f285f36204d6013c3172866718c. This fix will be available in Qt 4.5.3. I have set the User-Agent to "Mozilla/5.0". I hope that's enough. If it isn't, please open a bug report against Qt via http://www.qtsoftware.com/bugreport-form.

@Thiago: Thank you very very much. From your previous response, I thought I might have upset you. I should have been clearer before: I was not advocating that Konqueror identify itself as Firefox with the user-agent header; no, I know that would be wrong. Rather, I think that my problem lies in the fact that kio_http (or Qt) wasn't sending any user-agent at all to the proxy server when using https, and was therefore being blocked and returning an error. If my proxy server sees no user-agent header, the traffic is immediately rejected. So I think my problem is simply Konqueror's (or Qt's) lack of a user-agent header in the https stream. Again, thank you for your work on KDE/Qt.

The block comes from squidGuard, not squid, I think. Access by IP is blocked and redirected to an error page. I guess the redirect points to http, not https, and gets broken into a 404 (it's a guess). I'm reading some squidGuard man pages. The idea is to prevent P2P stuff - a lot of which uses HTTPS/CONNECT because that then basically discards HTTP. Bandwidth is really expensive here. I'll change the password.

squidGuard turns "CONNECT 138.246.255.179:443" into "404 DIRECT/" (which is a 404 on http:///). squidGuard turns "CONNECT bugs.kde.org:443" (Firefox sends the hostname) into "200 DIRECT/138.246.255.179", which works. http to 1.1.2.3 (IP blocked) gives "302 NONE/" (redirect to error page). Dunno if it's squid or squidGuard that's really the culprit here. It looks like it's allowed, but redirected incorrectly?

I think your proxy server is also misconfigured.
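The shape of the fixed request can be sketched as follows (a guess at the wire format, not the actual Qt code; only the "Mozilla/5.0" value comes from the commit mentioned above, and the helper name is hypothetical):

```python
def connect_request_with_ua(host, port=443, user_agent="Mozilla/5.0"):
    # Sketch: a CONNECT request carrying a minimal User-Agent header,
    # so user-agent-filtering proxies like the one described above do
    # not reject the tunnel request outright.
    return ("CONNECT %s:%d HTTP/1.1\r\n"
            "Host: %s:%d\r\n"
            "User-Agent: %s\r\n"
            "\r\n") % (host, port, host, port, user_agent)
```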
Try opening https://138.246.255.179/ in Firefox and see what happens. You should get an SSL certificate warning. Firefox doesn't moan about the certificate. But it doesn't connect (host unknown.) I agree that the error message for CONNECT 1.2.3.4 is broken, and needs to be fixed in the proxy. However, in firefox https://bugs.kde.org becomes CONNECT bugs.kde.org, and not CONNECT 138.246.255.179. I'll try and fix squidguard/squid. (In reply to comment #140) > FTP-over-HTTP is not working: > > konqueror(16869)/kio (Slave) KIO::Slave::createSlave: createSlave "http" for > KUrl("ftp://ftp.trolltech.com") > konqueror(16869)/kio (KIOConnection) KIO::ConnectionServer::listenForRemote: > Listening on "local:/tmp/ksocket-tmacieir/konquerorG16869.slave-socket" > klauncher(4760)/kio (KLauncher) KLauncher::requestSlave: KLauncher: launching > new slave "kio_http" with protocol= "http" args= ("http", > "local:/tmp/ksocket-tmacieir/klauncherMT4760.slave-socket", > "local:/tmp/ksocket-tmacieir/konquerorG16869.slave-socket") > kdeinit4: Got EXEC_NEW 'kio_http' from launcher. > kdeinit4: preparing to launch > klauncher(4760)/kio (KLauncher) KLauncher::processRequestReturn: "kio_http" > (pid 17867) up and running. > konqueror(16869)/kio (KIOJob) KIO::StatJobPrivate::slotRedirection: > KUrl("ftp://ftp.trolltech.com/") > konqueror(16869) KProtocolManager::slaveProtocol: slaveProtocol: "http" > ASSERT: "m_protocol == u.protocol().toLatin1()" in file > /home/tmacieir/src/kde4/KDE/kdelibs/kioslave/http/http.cpp, line 520 > kioslave: ####### CRASH ###### protocol = http pid = 17867 signal = 6 Thiago, Now talking about FTP-over-HTTP : do you intend to fix it ? Should I open a separate bug ? Thanks for your work Intend to, but I have no plans of doing it in the near future. (I'm busy and this bug doesn't affect me) This bug report has gone long enough. I think it's time to split it up into the remaining issues so that someone else can take care of them. 
See bug #201327 for the FTP-over-HTTP problem. I can only confirm it doesn't work with an NTLM authenticating proxy, but could people test with other types of proxies (and give their results on bug #201327), so that we can narrow down the problem?

Hello world, I've submitted a quite similar bug report here, about proxy authentication: https://bugs.kde.org/show_bug.cgi?id=199308 Can you please take a look at this? Given that some of you seem to understand networking better than I do...

I've fixed our SquidGuard to work around the wrong-error-code-for-https problem. I can now get to https://bugs.kde.org/

Automatic proxy is broken again in 4.3.2-0ubuntu7. I'm trying to find out if it's an Ubuntu thing. "Unable to find useful automatic proxy script" Manual proxy seems to work OK.

It seems that the Software Updates - System Settings application does not honor ${HOME}/.kde/share/config/kioslaverc:ProxyType=0, since it keeps using the proxy that I manually set but disabled by selecting "Connect directly to the internet". Konqueror works fine though. I'm using KDE 4.3.3 (Kubuntu 9.10) installed from deb http://ppa.launchpad.net/kubuntu-ppa/backports/ubuntu karmic main

Please file a separate bug against "kpackagekit" for that issue; that's where that System Settings module comes from.

My report in https://bugs.kde.org/show_bug.cgi?id=155707#c176 seems to be a duplicate of https://bugs.kde.org/show_bug.cgi?id=213141

KMess follows KDE's proxy settings. I set up an account behind a proxy and it connects fine without me needing to enter any proxy info, so I suppose that it takes proxy settings from KDE. Kopete, on the other hand, does not work with the proxy set up in KDE. If KMess works, wouldn't it be possible to learn from its implementation and make proxy settings work under KDE? Kubuntu 9.10 amd64 here. Cheers, --to

My last try at the office was with KDE 4.4 RC2. HTTPS sites are still broken with an NTLM proxy. The only solution for me is to use Firefox, which works well.
I've added some packet captures and other info to: https://bugs.kde.org/show_bug.cgi?id=199308 Maybe 199308 should be marked as a duplicate of this one.

I too am using NTLM and can get HTTPS to work but not HTTP.
# rpm -q kdebase qt
kdebase-4.3.4-1.fc12.x86_64
qt-4.5.3-9.fc12.x86_64

Don't know if it is the same problem, but I cannot connect to WML over the HTTPS port, as I have the standard WML port blocked. Kmess is still trying to connect, and this message appears on the console: kmess(22099) MsnNotificationConnection::slotError: MSN Notification Connection error type 0 : "Proxy server not found" To me it looks like kmess is looking for a proxy, but I don't have any proxy; just the standard WML port is blocked on the router. Using libmsn 4.0 2009-11-10 [r111] and 2.0.2 4.3.90 (KDE 4.3.90 (KDE 4.4 RC1)) "release 212", KDE:KDE4:Factory:Desktop / openSUSE_11.2 on Linux (x86_64) release 2.6.32-41

Would this have any relation to bug https://bugs.kde.org/show_bug.cgi?Id=186872, that Kopete cannot use the proxy settings either? I have troubles with Kopete + proxy. Is it difficult to test KDE apps with a proxy?

This is probably related to http://bugs.kde.org/show_bug.cgi?id=188073 also. Basically, https through any kind of proxy, authenticated or otherwise, is very unreliable.

*** Bug 216218 has been marked as a duplicate of this bug. ***

I have a question here concerning proxying:
- when you presently set a proxy on the server of a LAN, if for some reason the server is not available, none of the clients can establish an internet connection with Konqueror; why not switch automatically to a direct internet connection on the client, after sending a warning message to confirm such a choice?
- I think that is at least how Konqueror worked before. This could be a global KDE setting option in the proxy settings; otherwise, can this be set in a proxy script? Thanks.

This is working for me now for http/https connections. KDE 4.5.1

*** Bug 199308 has been marked as a duplicate of this bug.
***

Okay, let's assume that this is fixed then. Reopen if not fixed in KDE >= 4.5.

(In reply to comment #190)
> Okay, let's assume that this is fixed then. Reopen if not fixed in KDE >= 4.5.
It does not work for me using KDE 4.5.4 for HTTPS.

Guys, this is not resolved in 4.5.5 either (HTTPS only). I don't know how to reopen this... Confirmed.

Please try again with 4.6 -- if it still happens there, just note it here and I will reopen the bug.

Still doesn't work... KDE 4.6.0, Arch Linux i686. Reopening.

*** Bug 241751 has been marked as a duplicate of this bug. ***
*** Bug 201985 has been marked as a duplicate of this bug. ***

FYI, the issue that breaks HTTPS connections over a proxy has been identified and a fix will be forthcoming soon. Thanks for reporting.

That is very good news. I'd appreciate it if someone could tell us which commit actually fixes the problem when it's available, of course. Given the "long" life of this bug, a brief description of what was wrong would also satisfy my curiosity :) Regards,

*** Bug 186470 has been marked as a duplicate of this bug. ***

Git commit caa5ead18c00c8924f5b03d6495148e51e8f3cdf by Dawit Alemayehu. Committed on 05/03/2011 at 00:58. Pushed by adawit into branch 'KDE/4.6'. Fix the issue of HTTPS over proxy not working properly when connecting to two or more secure sites. BUG:155707 FIXED-IN:4.6.2 M +1 -1 kioslave/http/http.cpp http://commits.kde.org/kdelibs/caa5ead18c00c8924f5b03d6495148e51e8f3cdf

Git commit 8a2929b2dd8b7fe166053aed51a5def3c6565844 by Dawit Alemayehu. Committed on 05/03/2011 at 01:00. Pushed by adawit into branch 'master'. Fix the issue of HTTPS over proxy not working properly when connecting to two or more secure sites. BUG:155707 FIXED-IN:4.6.2 M +1 -1 kioslave/http/http.cpp http://commits.kde.org/kdelibs/8a2929b2dd8b7fe166053aed51a5def3c6565844

@Raul: Here is a brief explanation to satisfy your curiosity.
:) Because KDE's io scheduler attempts to efficiently reuse existing idle ioslaves, the ioslave implementations have to ensure that the address of the request they are attempting to fulfill matches the address of the server they might already be connected to. This is not an issue for most of the ioslaves in KDE, because they close connections as soon as they finish processing a request. That is not the case for ioslaves like kio_http, because they support persistent connections to a server to reduce connection setup and teardown overheads. So what happened before this one-liner fix in kio_http was that the function that checks whether or not the current connection can be reused to fulfill the new request was making an incorrect comparison for the HTTPS-over-proxy case.

*** Bug 177432 has been marked as a duplicate of this bug. ***

Thank you very much for your work and explanation, Dawit. I'm eager to test it :)
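The explanation above can be illustrated with a small model (Python for brevity; all names here are hypothetical, and the actual fix is a one-line comparison change in kioslave/http/http.cpp): the connection-reuse check must compare against the origin server for a CONNECT tunnel, even though the socket peer is the proxy.

```python
from collections import namedtuple

# Hypothetical model of an idle persistent connection and a new
# request handed over by the scheduler; this only mirrors the
# decision kio_http has to make, it is not the real code.
Endpoint = namedtuple("Endpoint", "scheme host port proxy")

def can_reuse(conn, request):
    if conn.scheme != request.scheme:
        return False
    if request.scheme == "https" and request.proxy:
        # A CONNECT tunnel through the proxy is bound to a single
        # origin server; reusing it for a different HTTPS host is
        # exactly the bug this report tracked.
        return (conn.host, conn.port) == (request.host, request.port)
    if request.proxy:
        # Plain HTTP via a proxy: every request goes to the proxy
        # anyway, so matching the proxy address suffices.
        return conn.proxy == request.proxy
    # Direct connection: must be the same origin server.
    return (conn.host, conn.port) == (request.host, request.port)
```

With this rule, two HTTPS requests to different hosts through the same proxy each get their own tunnel, while plain HTTP requests may share one proxy connection.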