Bug 113358 - Dead feeds are repeatedly re-fetched
Summary: Dead feeds are repeatedly re-fetched
Status: RESOLVED FIXED
Alias: None
Product: akregator
Classification: Applications
Component: general
Version: 1.1.2
Platform: unspecified Linux
Importance: NOR normal
Target Milestone: ---
Assignee: kdepim bugs
URL:
Keywords:
Depends on:
Blocks:
 
Reported: 2005-09-26 14:54 UTC by Łukasz Derkacz
Modified: 2005-12-05 16:43 UTC
CC List: 0 users

See Also:
Latest Commit:
Version Fixed In:
Sentry Crash Report:


Attachments

Description Łukasz Derkacz 2005-09-26 14:54:56 UTC
Version:           1.1.2 (using KDE 3.4.2, compiled sources)
Compiler:          gcc version 3.3.6
OS:                Linux (i686) release 2.4.31

If a feed site is blocking my IP, the feed appears to be dead (X mark). But aKregator keeps trying to download it repeatedly, which causes negative side effects like unnecessary disk operations. (I have set the fetch interval to 30 minutes for all feeds.)
I tried setting the refresh time for those feeds to 1 day, but it didn't help.
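
For reference, a minimal sketch of the scheduling check described above, using hypothetical names rather than Akregator's actual classes: a feed whose fetches keep failing never advances its last successful fetch time, so it passes the interval test on every pass of the fetch timer.

#include <QDateTime>

// Hypothetical per-feed state, for illustration only.
struct FeedState
{
    bool fetchError;    // set when the last fetch failed (the "X mark")
    uint lastFetch;     // time of the last *successful* fetch
    int fetchInterval;  // user-configured interval in seconds, e.g. 30*60
};

// The reported behaviour: the error flag is never consulted, and since
// lastFetch only advances on success, a permanently failing feed satisfies
// the interval test on every timer pass and is queued for download again.
bool shouldQueueForFetch(const FeedState& feed)
{
    uint now = QDateTime::currentDateTime().toTime_t();
    return feed.fetchInterval > 0 && now - feed.lastFetch >= (uint)feed.fetchInterval;
}
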
Comment 1 Frank Osterfeld 2005-09-30 08:15:53 UTC
Right, Akregator tries to refetch broken feeds every minute. Admittedly, that doesn't make much sense.
Comment 2 Frank Osterfeld 2005-12-05 16:43:25 UTC
SVN commit 485739 by osterfeld:

If a fetch error occurs (host down, parsing error), wait 30 minutes before trying again. Previously, Akregator
retried fetching the feed every minute, which is particularly painful when Akregator fails to parse an actually
valid feed.
BUG: 113358


 M  +16 -3     feed.cpp  


--- branches/KDE/3.5/kdepim/akregator/src/feed.cpp #485738:485739
@@ -62,9 +62,14 @@
         bool markImmediatelyAsRead;
         bool useNotification;
         bool loadLinkedWebsite;
-        int lastFetched;
 
         bool fetchError;
+        
+        int lastErrorFetch; // saves the time of the last fetch that went wrong.
+                            // != lastFetch property from the archive
+                            // (that saves the last _successful fetch!)
+                            // workaround for 3.5.x
+
         int fetchTries;
         bool followDiscovery;
         RSS::Loader* loader;
@@ -267,6 +272,7 @@
     d->markImmediatelyAsRead = false;
     d->useNotification = false;
     d->fetchError = false;
+    d->lastErrorFetch = 0;
     d->fetchTries = 0;
     d->loader = 0;
     d->articlesLoaded = false;
@@ -395,6 +401,14 @@
         queue->addFeed(this);
     else
     {
+        uint now = QDateTime::currentDateTime().toTime_t();
+
+        // workaround for 3.5.x: if the last fetch went wrong, try again after 30 minutes
+        // this fixes annoying behaviour of akregator, especially when the host is reachable
+        // but Akregator can't parse the feed (the host is hammered every minute then)
+        if ( fetchErrorOccurred() && now - d->lastErrorFetch <= 30*60 )
+             return;
+
         int interval = -1;
 
         if (useCustomFetchInterval() )
@@ -405,8 +419,6 @@
 
         uint lastFetch = d->archive->lastFetch();
 
-        uint now = QDateTime::currentDateTime().toTime_t();
-
         if ( interval > 0 && now - lastFetch >= (uint)interval )
             queue->addFeed(this);
     }
@@ -587,6 +599,7 @@
         else
         {
             d->fetchError = true;
+            d->lastErrorFetch = QDateTime::currentDateTime().toTime_t();
             emit fetchError(this);
         }
         return;
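
In isolation, the guard added above amounts to a 30-minute cooldown after a failed fetch. A minimal standalone sketch of that check, with simplified names rather than the full Feed class:

#include <QDateTime>

// Returns true when the feed may be re-queued: either the last fetch
// succeeded, or more than 30 minutes have passed since the last failed one.
bool retryAllowed(bool fetchErrorOccurred, uint lastErrorFetch)
{
    uint now = QDateTime::currentDateTime().toTime_t();
    return !fetchErrorOccurred || now - lastErrorFetch > 30 * 60;
}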