One of the reasons I keep a blog is to be able to “join the dots”. I was doing reasonably okay tracking over 100 sources in Inoreader as an RSS reader.
Here’s a quick idea of what Inoreader is. It is a tool that lets you “aggregate” the RSS feeds from any website you wish to track. RSS is an open, federated protocol that has withstood the test of time, although the evil corporations have either relegated it or broken its implementation to preserve a “walled garden” that serves you advertisements. As a privacy enthusiast, I am also against pervasive behavioural tracking on the web. RSS therefore lets content come to me rather than me chasing it. A better way to put it: all the accumulated “pearls of wisdom” just fall into your lap!
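To make the idea concrete, here is a minimal sketch of what a reader like Inoreader does for every feed you subscribe to: fetch an RSS document and pull out each item’s title and link. The sample feed and function name are mine, purely for illustration; this is a toy, not how any real reader is implemented.

```python
import xml.etree.ElementTree as ET

# A tiny hand-written RSS 2.0 document standing in for a real feed;
# an aggregator fetches and parses hundreds of these on your behalf.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item>
      <title>Pearls of wisdom</title>
      <link>https://example.com/pearls</link>
    </item>
    <item>
      <title>Another post</title>
      <link>https://example.com/another</link>
    </item>
  </channel>
</rss>"""

def parse_items(feed_xml: str) -> list[dict]:
    """Return each item's title and link from an RSS 2.0 string."""
    root = ET.fromstring(feed_xml)
    return [
        {"title": item.findtext("title"), "link": item.findtext("link")}
        for item in root.iter("item")
    ]

for entry in parse_items(SAMPLE_FEED):
    print(entry["title"], "->", entry["link"])
```

Because the format is this simple and open, any client can consume any site’s feed — which is exactly why it resists the walled-garden approach.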
When I first signed up for Inoreader, I made a conscious shift away from NewsBlur, which did a relatively good job of managing feeds and sources. I moved because I believed I needed something “more”. Initially, I was quite overwhelmed by its range of features (tagging, rules, highlights, stars, etc.), and it took me some time to understand its philosophy of workflow and organisation.
- Organise feeds in folders. I took this idea from a long-forgotten blog post. If you have hundreds of sources, they will remain “unread”. Therefore, I created folders such as “Absolute Reads”, “News”, “Linux”, etc. These folders organise feeds by how frequently they update and how much importance I accord to them.
- The News folder defaults to a “list view”. I track hundreds of news sites that are updated in real time. A list view lets me skim through the headlines. As your reading improves, skimming headlines becomes second nature.
- Depending on the importance I accord to other sources, I can then apply a “second order” of reading: one that requires me to slow down for opinion pieces or editorials. Those are important for understanding the pulse of issues and often form a part of my blog links and my opinion/annotations around them.
- Last but not least are the “absolute reads”- feeds that update only occasionally, have the fewest unread articles, and require maximal attention.
This is only the first order of refinement.
The second trick is to understand how to filter the feed. Suppose you wish to track an issue in real time: Google News integration is offered in the paid tier (edit: I am told it is available in all). However, you’d see that a lot of incoming updates are duplicate URLs or content. You can then apply feed filters. These support AND/OR/NOT boolean operations- you specify keywords that keep or remove entries, and the rules run in real time. It takes a little time to understand, but it is an incredibly powerful tool to have.
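The filtering described above can be sketched in a few lines. To be clear, this is my own illustration of the idea- the function names, rule format, and entry fields are invented, not Inoreader’s actual API: keep an entry if it matches any “keep” keyword, drop it if it matches any “drop” keyword, and discard duplicate URLs along the way.

```python
# Illustrative sketch of a keyword feed filter with URL de-duplication.
# The rule format and field names here are my own, not Inoreader's.

def matches(title: str, keep_any=(), drop_any=()) -> bool:
    """A simple OR/NOT rule: keep if any 'keep' keyword appears
    (when given) and no 'drop' keyword appears."""
    lowered = title.lower()
    if keep_any and not any(k.lower() in lowered for k in keep_any):
        return False
    return not any(k.lower() in lowered for k in drop_any)

def filter_feed(entries, keep_any=(), drop_any=()):
    """Apply the keyword rule and drop duplicate URLs (first one wins)."""
    seen_urls = set()
    kept = []
    for entry in entries:
        if entry["url"] in seen_urls:
            continue
        seen_urls.add(entry["url"])
        if matches(entry["title"], keep_any, drop_any):
            kept.append(entry)
    return kept

entries = [
    {"title": "ThinkPad X1 price drop", "url": "https://news.example/a"},
    {"title": "ThinkPad X1 price drop", "url": "https://news.example/a"},  # duplicate
    {"title": "Celebrity gossip roundup", "url": "https://news.example/b"},
    {"title": "ThinkPad X1 back in stock", "url": "https://news.example/c"},
]
print(filter_feed(entries, keep_any=["thinkpad"], drop_any=["gossip"]))
```

Run against the sample entries, only the two distinct ThinkPad items survive- the duplicate and the off-topic entry are filtered out, which is exactly the effect you want on a high-volume news feed.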
Highlights and Tags are another exciting way to scan the feeds. If you just want a cursory glance to track a specific keyword in the river of news, any occurrence of it in a title gets highlighted. For example, I am monitoring a specific laptop I wish to upgrade to in real time- I can see the price and availability as it happens (with perhaps a delay of a few seconds). Likewise, for every specific folder, you can assign tags. Here’s an interesting idea:
Tags help you group and save critical articles so that you can clear your main feeds and folders to avoid information overload, while not losing interesting articles to the past. Each Inoreader tag also includes its own email address, so you can use them as an inbox for newsletters!
I’d recommend the following workflow:
- Get good sources. You’d realise that there are tons of sources which publish excellent write-ups.
- Learn to read selectively (as I have outlined above).
- Use the power of organisation in the service- rules, filters, tagging and highlights.
- Last but not least- integrations. I have automated a lot of processes using IFTTT, share articles on Twitter on a rolling basis, and have set up a feedback loop to share the best resources within 24 hours. It doesn’t get better!
Here comes the kicker. Star any article you wish to read later or that requires more complex processing from your end. This way, I can deal with the influx of articles elsewhere and read through the most important ones.
There’s something more interesting. Inoreader also lets you go through Twitter effectively. I use a Twitter search for a specific username, filter out replies to the username and “retweets”, and fetch only those tweets that contain a link. Usually, the author lists out why they are linking to the write-up. Since the tweets are treated as “RSS links” in Inoreader, the same rules as above apply to them as well: I can block specific keywords or remove the feeds altogether. A lot of Twitter traffic is from bots and automated accounts, and it can become challenging to distinguish real value, so this extracts a strong signal from the noise- minus the algorithmic mess. You can also subscribe to specific Twitter lists, which adds further refinement to how you consume content.
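The three filtering steps above- drop replies, drop retweets, keep only link-bearing tweets- can be sketched as a small predicate. The tweet strings and conventions here (replies starting with “@”, retweets with “RT @”) are my own simplified assumptions for illustration, not any real Twitter or Inoreader schema.

```python
# Rough sketch of the Twitter filtering described above: drop replies
# and retweets, keep only original tweets that carry a link.
import re

LINK_RE = re.compile(r"https?://\S+")

def is_signal(tweet_text: str) -> bool:
    """True for an original, link-bearing tweet; False otherwise."""
    if tweet_text.startswith("@"):      # assumed reply convention
        return False
    if tweet_text.startswith("RT @"):   # assumed retweet convention
        return False
    return bool(LINK_RE.search(tweet_text))

tweets = [
    "RT @someone: great thread",
    "@friend thanks!",
    "Why RSS still matters: https://example.com/rss-essay",
    "just thinking out loud",
]
print([t for t in tweets if is_signal(t)])
```

Of the four sample tweets, only the link-bearing original survives- the same winnowing that makes a username feed readable inside an RSS reader.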
It doesn’t require technical geekiness to understand the process- all of the features are available in plain sight on the service! What’s essential is how you use them effectively to drive meaning and value.
I’d wrap up with what the author says about finding value in reading “too much” (emphasis mine):
Reading too much is about keeping up to date with useful information from a variety of disciplines. The value of this is that it informs better decision making at work and in your personal life. The content itself might also provide intellectual value – learning about interesting ideas, events and people throughout the world. It takes a while for the value of a good reading habit to kick in. But slowly it does, as the universe of new facts you accumulate begins to show up in your daily life.
You can choose to read for general knowledge, or to become a specialist on some topic, or to read for the sake of reading. But whatever you choose, if you really want to have a tried and tested method for extracting signal from noise, you can’t go wrong with the system I described above.