I reminisced about the wonderfully naive but exciting Web period of 1993–1994. That period introduced us to server-log analysis and to the notion of hits on a web page. One of our first attempts at crowd-sourcing and analysis was to run an electronic conference in heterocyclic chemistry and to look at how the attendees visited the individual posters and presentations by analysing the server logs.
You can read all about that analysis here. One interesting graphic, shown below, gives the 24-hour distribution of accesses. Remember, this was before Google and its analytics even existed (and yes, we were also doing Google-like searches before they did).
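That kind of hourly analysis is easy to reproduce. Here is a minimal sketch, assuming the server wrote its logs in something like the Common Log Format (with the timestamp in square brackets); the sample lines are invented for illustration.

```python
from collections import Counter
import re

# Hypothetical sample lines in Common Log Format; real mid-1990s
# server logs carried the timestamp in square brackets like this.
LOG_LINES = [
    'host1 - - [10/Jul/1994:09:15:31 +0000] "GET /poster1.html HTTP/1.0" 200 512',
    'host2 - - [10/Jul/1994:09:47:02 +0000] "GET /poster2.html HTTP/1.0" 200 734',
    'host3 - - [10/Jul/1994:21:03:45 +0000] "GET /poster1.html HTTP/1.0" 200 512',
]

# Capture the two-digit hour that follows the date in the timestamp.
TIMESTAMP = re.compile(r"\[\d{2}/\w{3}/\d{4}:(\d{2}):")

def hourly_distribution(lines):
    """Count accesses per hour of day (0-23) from log lines."""
    hours = Counter()
    for line in lines:
        match = TIMESTAMP.search(line)
        if match:
            hours[int(match.group(1))] += 1
    return hours

print(hourly_distribution(LOG_LINES))
```

The resulting hour-to-count mapping is exactly what a 24-hour access plot is drawn from.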
But let me get to the actual point of this post. A decade or so ago, all universities in the UK were asked to undertake a quality review exercise of their research outputs. One of the metrics of such outputs is the scientific publication, and each research group leader had to collect their four most important articles published in the previous few years and submit them (on paper) to a review panel. This poor panel was faced with a mountain of paperwork (literally!) when they arrived to do their job. It was soon decided that a better (electronic) system had to be devised. So now we have a product called Symplectic (which, as it happens, originated in the physics department here at Imperial College), which tirelessly gathers such outputs. More accurately, it gathers the meta-data for research publications, since most publishers do not allow the actual reprints to be harvested in this way! And when it finds a new article, it informs its author and asks them to check that the meta-data is accurate.
So it was a few days ago that I received such an alert. I checked the meta-data (in fact adding some that associates the scientific work with a particular resource, our High-Performance Computing unit, and also the NMR systems here), but then the following thumbnail‡ caught my eye. The wonderful Symplectic system had computed this for me.
This I had to see. Expanded, it appears as follows. An altmetric measures attention. And attention (however transient) is apparently itself measured by tweets, Facebook, news outlets, science blogs, Mendeley and CiteULike.
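In spirit, such a score is an aggregation of mention counts across those sources. The actual Altmetric weighting is proprietary, so the sketch below uses entirely invented weights purely to illustrate the idea of a weighted attention sum.

```python
# Hypothetical illustration only: the real Altmetric score uses a
# proprietary weighting; every weight below is invented.
SOURCE_WEIGHTS = {
    "news": 8.0,      # a news story counts for a lot of "attention"
    "blog": 5.0,      # a science-blog post somewhat less
    "tweet": 1.0,     # a tweet less still
    "facebook": 0.25, # a Facebook mention least of all
}

def attention_score(mentions):
    """Weighted sum of mention counts, keyed by source type."""
    return sum(SOURCE_WEIGHTS.get(source, 0.0) * count
               for source, count in mentions.items())

print(attention_score({"tweet": 20, "blog": 1}))  # 25.0
```

The point the sketch makes is the one in the text: the number says how much noise was made, not whether the science deserved it.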
Well, things have certainly moved on from the days of analysing server logs! Now, would an aspiring tenure-track young scientist, presenting an altmetric score of 28 to their head of department, expect to get their tenure on this basis? Of course, we are back to the hoary old chestnut: is attention necessarily good? You cannot tell from the above whether we have indeed produced worthy science, or science to be scorned.
Well, the above represents a 20-year period in the evolution of science and of how it is communicated. Whether this represents positive progress I leave you to decide. And if one of your altmetric scores is > 28, you have done better than us!
‡Does the icon look familiar? See here.