When fetching a feed multiple times, Akregator duplicates existing items whenever the content of a fetched item differs from the content of the same item already stored locally. I have been suffering from this bug for more than 10 years and would like to see it finally fixed.
Here is my theory why it happens:
Instead of comparing two items for equality by their guid alone, Akregator builds a hash over title, description, content, link and author (https://github.com/KDE/akregator/blob/0d588dcbfb9cc93dec5b6bcbf3b01336ca1d09ce/src/feed/feed.cpp#L581-L585 and https://github.com/KDE/akregator/blob/0d588dcbfb9cc93dec5b6bcbf3b01336ca1d09ce/src/article.cpp#L189) and compares that as well, unless the guid starts with "hash:". I believe this does not conform to the RSS 2.0 specification, which states:
> guid stands for globally unique identifier. It's a string that uniquely identifies the item.
> When present, an aggregator may choose to use this string to determine if an item is new.
> There are no rules for the syntax of a guid. Aggregators must view them as a string. It's up to
> the source of the feed to establish the uniqueness of the string.
I'd be happy to provide additional information if necessary.