I use Thunderbird for RSS, too. Once got bitten by some browser extension that stopped working all of a sudden and didn't let me export (except by copy-pasting each feed URL on its own, with clicks in between each time).
My most important/only rule for feeds is: Add nothing that updates more than once a day. Such websites are better off as bookmarks in the browser.
And from my ~200 feeds only a fraction actually hit that limit. So I have a nice manageable chunk of unreads every day!
Hey mate, I've solved this problem (frequency of updates) at Lenns.io. It's an opinionated RSS reader/tracker with a few unique features (at least they are unique when combined). You can tag each Feed Source with multiple categories. Then you set priority weights (manually) on both Sources and Categories. That way, the sources or topics you are most interested in always pop up at the top of your feed. And to avoid the issue of one source overtaking your feed, you can set a limit on how many posts you want displayed per feed.
I hope this makes sense. I'd appreciate it if you gave it a go and shared your feedback as an experienced RSS user. Cheers!
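For what it's worth, that weighting scheme can be sketched in a few lines. Everything below (the names, the weights, the additive scoring formula, the per-source cap) is my guess at one possible implementation, not how Lenns.io actually does it:

```python
# Hypothetical sketch of priority-weighted feed ranking: an item's score
# combines a manual weight for its source with the weights of the
# categories that source is tagged with, and a per-source cap keeps one
# feed from taking over the view. All names and numbers are made up.

SOURCE_WEIGHT = {"hn-blog": 3.0, "big-news-site": 1.0}       # manual source weights
CATEGORY_WEIGHT = {"programming": 2.0, "politics": 0.5}      # manual category weights
SOURCE_CATEGORIES = {                                        # source -> its category tags
    "hn-blog": ["programming"],
    "big-news-site": ["politics"],
}
PER_SOURCE_CAP = 2  # show at most this many posts per feed


def score(item):
    """Source weight plus the weights of all categories its source carries."""
    src = item["source"]
    cats = SOURCE_CATEGORIES.get(src, [])
    return SOURCE_WEIGHT.get(src, 1.0) + sum(CATEGORY_WEIGHT.get(c, 1.0) for c in cats)


def build_view(items):
    """Sort items by score, dropping items past the per-source cap."""
    shown, per_source = [], {}
    for item in sorted(items, key=score, reverse=True):
        n = per_source.get(item["source"], 0)
        if n < PER_SOURCE_CAP:
            shown.append(item)
            per_source[item["source"]] = n + 1
    return shown
```

With these weights, a post from `hn-blog` (score 3.0 + 2.0) outranks any number of `big-news-site` items (score 1.0 + 0.5), and the cap trims the latter to two entries.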
I'd love a RSS reader that groups items on the same topic.
For example, let's say I have feeds from 25 news sources and the President gives a speech on Ukraine. I end up with 25 articles on it, spread across my item list; I need to process it 25 times. IME, most items are duplicated at least once - half my items are not needed.
If the feed reader grouped all the articles about that speech, I could process the story once - pick the item I want to read, delete the rest. That's also more efficient because it's easier to compare the items.
Online news aggregators like Google News already do that, so the technology exists, but I haven't seen it in a feed reader. I also wonder if LLMs could group them more accurately.
Currently, it detects and groups articles from a few hundred popular global news sites, and gradually growing to cover more sources.
We also have a feed builder/generator that turns unstructured public webpages into structured RSS feeds, to make it a flexible and unified feeds platform:
I've been using Liferea [1] for years, and while it doesn't do automated content analysis, it does allow you to group feeds into folders, then get an aggregated view of the folder as a whole. It also allows "search folders", which are saved cross-feed searches that are exposed through the same folder interface.
I use Liferea as my desktop client, but with TT-RSS running on my VPS as the underlying data source.
I used to use an RSS reader that had "smart folders" (filters) so I could group items across feeds and across servers, but the author seems to be too busy to keep it going.
I bet language models could definitely help here, yeah. Perhaps something like (1) get the content of feed items as they come in, (2) embed the content, (3) use those embeddings to group items. Probably not that difficult, to be honest.
Yep. That would be a classic sort of k-means problem. Just throw them all into a standard embedding, like the OpenAI API embeddings, run k-means from scikit-learn, then convert them into a list of lists: one RSS item (containing a list of title-URLs) per cluster.
The problem with this approach is determining what k to use for the k-means. But again, we could use the “elbow” technique to find the optimal k and then start grouping. I wonder if there are any clustering algorithms that choose k automatically?
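A minimal end-to-end sketch of steps (1)–(3), with two stand-ins so it runs self-contained: a hashed bag-of-words vector plays the role of a real embedding model (so it only groups items that share words, unlike an actual embedding API), and a tiny k-means with deterministic farthest-point initialization plays the role of scikit-learn's. k is still passed in by hand; the elbow method, silhouette scores, or a density-based algorithm like DBSCAN (which needs no k at all) are the usual answers to picking it automatically.

```python
import math
import re
import zlib

DIM = 64  # hash-bucket dimensionality for the toy embedding


def embed(text):
    """Toy stand-in for a real embedding model: a normalized hashed
    bag-of-words vector. Real embeddings capture meaning, not just
    shared words, so they would group paraphrases this misses."""
    vec = [0.0] * DIM
    for word in re.findall(r"[a-z0-9]+", text.lower()):
        vec[zlib.crc32(word.encode()) % DIM] += 1.0
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]


def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))


def init_centroids(points, k):
    """Deterministic farthest-point initialization (no RNG needed)."""
    centroids = [list(points[0])]
    while len(centroids) < k:
        far = max(points, key=lambda p: min(dist2(p, c) for c in centroids))
        centroids.append(list(far))
    return centroids


def kmeans(points, k, iters=20):
    """Plain Lloyd's algorithm; returns a cluster label per point."""
    centroids = init_centroids(points, k)
    labels = []
    for _ in range(iters):
        new_labels = [min(range(k), key=lambda c: dist2(p, centroids[c]))
                      for p in points]
        if new_labels == labels:  # converged
            break
        labels = new_labels
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:  # recompute centroid as the member mean
                centroids[c] = [sum(col) / len(members)
                                for col in zip(*members)]
    return labels


def group_items(titles, k):
    """Embed feed-item titles, cluster them, return one group per cluster."""
    labels = kmeans([embed(t) for t in titles], k)
    groups = {}
    for title, label in zip(titles, labels):
        groups.setdefault(label, []).append(title)
    return list(groups.values())


headlines = [
    "President gives major speech on Ukraine aid",
    "Speech on Ukraine: president urges more aid",
    "New Python release improves error messages",
    "Python 3.12 ships with better error messages",
]
groups = group_items(headlines, k=2)
```

Swapping `embed` for real API embeddings (and `kmeans` for `sklearn.cluster.KMeans`) keeps the same shape: one list of title groups, ready to render as one collapsed RSS item per cluster.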