On alternative impact factors and "filtering after the fact"


The practice of web publishing (in the form of blogs, microblogs, etc.) is characterized by a “publish, then filter” model: anyone can publish anything without asking a publisher for permission or having to go through a selection process.

In a recent article, Times Higher Education reports on a debate at the British Library where academics and students agreed that “researchers had not embraced new technology to share their data and findings”. David Gauntlett, professor of media and communications at the University of Westminster, talked about the “publish, then filter” model in an academic context, and in the comment section of the article he clarifies his stance:

Just to flesh out, very briefly, the ‘publish, then filter’ model for research which I was talking about (as reported here):

The idea is that first of all, academics would (still) produce work to a standard that they think is publishable – they should be satisfied with it as a representation of their ideas and research.

Then they would post it online. The quality filtering then takes place post-publication; good and useful research will first of all be shared online more, by people who find it interesting and fruitful; and at a later stage we will also see that it has been cited more (a traditional measure to keep the traditionalists happy).

This is surely a better way to identify quality research than by the accepted method of asking two anonymous academics, who are bound to have a preference for themselves and their kind of work, rather than other people and other kinds of work, if it should be published or not. My proposal also avoids strangling unusual or non-traditional work before it has even seen the light of day.

A discussion between Gauntlett and a commenter called “Bee” continues:

[Bee:] David: The reality is, most publications would never be read. That what would be read would be the stuff by people who already have some reputation. It is just not true that “good and useful research would be shared more” what is true is that popular research would be shared more, and it might be popular for many other reasons than being good and useful. Indeed, it might be popular because it’s complete nonsense. More seriously, it might be popular because the person writing it is popular.

[David Gauntlett:] Yes, I agree, you’d need some mechanisms to help highlight good work by the currently-unknowns. But that’s the kind of thing that the online world is relatively good at, especially compared with the world of academic journals.

You’re quite right that we’d need to focus on that — indeed, it’s one of the ways that the established journals and portals could continue to justify their existence, by trying to pick out good work by people you might otherwise not have heard of.

But it’s not an insurmountable problem.

Read the full article: Pressure to publish papers blamed for reluctance to share digital data.
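
To make the “publish, then filter” idea a bit more concrete, here is a minimal sketch in Python of how a post-publication filter might rank openly posted work. The signal names (shares, citations) and the author-reputation discount, which tries to address Bee’s objection that popular authors would dominate, are my own assumptions for illustration, not anything proposed in the article.

```python
from dataclasses import dataclass

@dataclass
class Publication:
    title: str
    author_followers: int   # proxy for how well-known the author already is
    shares: int             # early signal: how often the work is shared online
    citations: int          # later signal: traditional citations

def filter_score(pub: Publication) -> float:
    """Hypothetical post-publication quality score.

    Combines early sharing with later citations, and dampens the
    advantage of already-famous authors so that work by
    "currently-unknowns" is not drowned out.
    """
    raw_signal = pub.shares + 3 * pub.citations              # weight citations higher
    reputation_penalty = 1 + pub.author_followers / 10_000   # soften the popularity bias
    return raw_signal / reputation_penalty

# Everything gets published; the filtering happens afterwards, highest score first.
publications = [
    Publication("Well-known author, so-so paper", author_followers=50_000, shares=400, citations=2),
    Publication("Unknown author, strong paper", author_followers=200, shares=120, citations=9),
]
for pub in sorted(publications, key=filter_score, reverse=True):
    print(f"{filter_score(pub):7.2f}  {pub.title}")
```

In this toy example the strong paper by the unknown author outranks the heavily shared paper by the famous author, which is exactly the kind of “mechanism to help highlight good work by the currently-unknowns” the discussion calls for.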

Alt-metrics: a manifesto (altmetrics.org)
Gauntlett talks about “mechanisms to help highlight good work by the currently-unknowns”, which brings me to the work carried out by (among others) Jason Priem, a PhD student in Information and Library Science, on the Alt-metrics project, which is about tracking scholarly impact on the social web. Priem and his colleagues want to track impact outside the academy, impact of influential but uncited work, and impact from sources that aren’t peer-reviewed. Doing so can also result in improved filtering systems:

The speed of alt-metrics presents the opportunity to create real-time recommendation and collaborative filtering systems: instead of subscribing to dozens of tables-of-contents, a researcher could get a feed of this week’s most significant work in her field. This becomes especially powerful when combined with quick “alt-publications” like blogs or preprint servers, shrinking the communication cycle from years to weeks or days. Faster, broader impact metrics could also play a role in funding and promotion decisions.

Read the Alt-metrics manifesto here.
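
The manifesto’s idea of a real-time “feed of this week’s most significant work” can be sketched as a simple aggregation over alt-metric events. The sketch below is purely illustrative: the event types, weights and article identifiers are assumptions, and a real system would pull such events from the APIs of the underlying services rather than a hard-coded list.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical alt-metric events for articles in one field; in practice these
# would come from the social web (microblog mentions, bookmarks, blog posts).
events = [
    {"article": "arXiv:1101.0001", "type": "tweet",     "when": datetime.now(timezone.utc) - timedelta(days=1)},
    {"article": "arXiv:1101.0001", "type": "bookmark",  "when": datetime.now(timezone.utc) - timedelta(days=2)},
    {"article": "arXiv:1101.0002", "type": "blog_post", "when": datetime.now(timezone.utc) - timedelta(days=3)},
]

# Assumed weights: a blog post counts more than a bookmark, which counts more than a tweet.
WEIGHTS = {"tweet": 1.0, "bookmark": 2.0, "blog_post": 5.0}

def weekly_feed(events, top_n=10):
    """Rank articles by weighted alt-metric activity over the last seven days."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=7)
    scores = {}
    for e in events:
        if e["when"] >= cutoff:
            scores[e["article"]] = scores.get(e["article"], 0.0) + WEIGHTS[e["type"]]
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)[:top_n]

for article, score in weekly_feed(events):
    print(f"{score:5.1f}  {article}")
```

Instead of subscribing to dozens of tables of contents, a researcher would subscribe to the output of something like this feed, recomputed continuously as new events arrive.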

Related reading:
Journal of X and Z, a MEDEA project dealing with the future of scholarly communication and academic publishing.

This post has been cross-posted to the project website at wpmu.mah.se/medea-journal/.

Image credit: –Tico– CC:BY-NC-ND