Public RSS Feed Instance Setup
I primarily rely on RSS feeds to consume content online. I have subscribed to personal blogs, tech news, privacy & security updates, a few YouTube channels, & a handful of subreddits. I use NetNewsWire on my iPad as my RSS reader. I have it installed on my Mac as well, but I prefer reading on the iPad because the touch screen makes it easy to flip through new posts with a swipe of a finger.
Another reason I don’t consume RSS on the Mac is that I still haven’t figured out how to sync the feeds I have subscribed to on my iPad with the NetNewsWire Mac client without involving any third party. Let me know if you know of a solution for that.
What went wrong
Recently, around WWDC 2023, when Apple released the iPadOS 17 developer beta for free, I gave it a crack & upgraded my iPad from iPadOS 16.5 to 17. After trying it for a few hours I ran into some critical bugs in the beta & wanted to downgrade, simply because I couldn’t afford bugs in the middle of my semester exams.
In a hurry, I exported the RSS feeds as an .opml file, saved it into iCloud, & confirmed overwriting the existing file. When I was done downgrading & imported the .opml back into the reader, there was nothing: the .opml was blank. NetNewsWire had failed to export the feeds for some reason, or to put it simply, “I lost all the feeds”.
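In hindsight, a quick sanity check on the exported file would have caught the blank export before I let it overwrite the old backup. A minimal sketch of such a check (the file name & the checks here are just an example, not part of my actual workflow at the time):

#!/bin/bash
# sanity-check an exported OPML before trusting it as a backup
opml="${1:-Subscriptions.opml}"   # example file name, adjust to your export

# fail if the file is missing or zero bytes
if [ ! -s "$opml" ]; then
    echo "Error: $opml is missing or empty" >&2
    exit 1
fi

# fail if it contains no feed entries at all
if ! grep -q 'xmlUrl=' "$opml"; then
    echo "Error: $opml has no feed entries" >&2
    exit 1
fi

# grep -c counts matching lines, which is one per feed in a typical OPML export
echo "$opml looks fine: $(grep -c 'xmlUrl=' "$opml") feed entries found"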
Starting Fresh
Okay, that definitely was not the end of the world. I started fresh & began collecting the feeds back, & by now I have accumulated 70 or so. Previously I had around 150, I believe; yep, I tend to balance around Dunbar’s Number to avoid overwhelming myself with an infinite swipe loop.
So today, I exported the feeds from NetNewsWire as an .opml, switched to the Mac, & confirmed with bat that the export had been successful. Next, I pushed the .opml into my dotfiles repo as a backup. I also wrote a bash script to extract the feed URLs from the .opml & save them into a .txt file.
#!/bin/bash
# check that an input file was given, exists, and is readable
if [ ! -r "$1" ]; then
    echo "Usage: $0 <filename.opml>"
    exit 1
fi

# use grep to match xmlUrl attributes, strip the attribute syntax with sed,
# and save the bare URLs to feed_urls.txt
grep -oE 'xmlUrl="[^"]+"' "$1" | sed -e 's/xmlUrl="//' -e 's/"$//' > feed_urls.txt
echo "Feed URLs extracted and saved to feed_urls.txt"
Public Instance
I needed the extracted feed URLs for vore.website, a free, minimal, web-based RSS/Atom reader released just a few days back. It requires a username & password to sign up, provides a text field to add feed links, & the subscribed feeds are accessible at a public URL, vore.website/username.
The reader doesn’t support JSON Feed currently, but that is not a deal breaker for me. You can check out my public RSS feed at vore.website/cosmicqbit. For now, it serves as a backup for my RSS feeds & provides a convenient & minimal interface to access them from anywhere, on any device, online.
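Since the feed list lives at a plain public URL, a one-liner is enough to confirm from any machine that the instance is reachable (the https scheme here is my assumption):

# print just the HTTP status code for the public page
curl -s -o /dev/null -w '%{http_code}\n' https://vore.website/cosmicqbit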
What are your thoughts on my overall setup? And if you know of a solution for my syncing issue, please let me know.
You can reply via mail.