Bin Laden, Wikipedia Bias and Filter Bubbles


Here are three articles that caught my attention last week: first, a piece on how to evaluate the reliability of a Wikipedia entry; then, a reflection on what defines citizen journalism in light of “the man who tweeted Bin Laden”; and finally, a TED video on why web giants Google and Facebook need to beware of filter bubbles. Enjoy!

What the Wiki?! (thesiswhisperer.wordpress.com)
The Thesis Whisperer is a blog about doing a thesis and, in general, a great resource for people doing their PhDs. The post “What the Wiki?!” gives you practical advice on how to evaluate the reliability of a Wikipedia entry:

Can you rely upon the information you find in a resource that has evolved by communal effort, and is not peer-reviewed in the conventional sense, with entries edited by nameless individuals of unknown reputation, and citations drawn from all manner of sources, both old and new?

Read the full article here.

Why the man who tweeted Osama bin Laden raid is a citizen journalist (poynter.org)
Steve Myers discusses the underpinnings of citizen journalism and picks up the bloggers vs. journalists debate once again:

I’m starting to think that professional journalists get more caught up with the “citizen journalist” label than the citizen journalists themselves. Perhaps “citizen journalists vs. journalists” is the new “bloggers vs. journalists” debate.

Read the full article here.

Eli Pariser: Beware online “filter bubbles” (ted.com)
Content-curating services and “smart” news feeds (hello, Facebook) permeate more and more of your daily life. What you read/watch/listen to finds YOU as much as you find IT. Eli Pariser gives a 10-minute introduction to his concept of “filter bubbles”: a world where “a squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa”, a remark attributed to Mark Zuckerberg, CEO of Facebook. Eli Pariser says:

And the thing is that the algorithms don’t yet have the kind of embedded ethics that the editors [the human gatekeepers] did. So if algorithms are going to curate the world for us, if they’re going to decide what we get to see and what we don’t get to see, then we need to make sure that they’re not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important […] other points of view.

We recently posted an article on the same topic, The New Media Selection Logic And Transparency.

See Eli Pariser’s talk here.

Image credit: swanksalot CC:BY-SA