About this series: Through August and September we are publishing a series of posts authored by some of the team at Altmetric, a data science company that provides attention data to authors, publishers, institutions and funders. The posts discuss, amongst other topics, using altmetrics within your CVs and grant applications, and how journal editors can make use of the tools. Learn more about the series by starting with the first post here. This particular post is the final one in our Altmetric series and is authored by guest blogger Catherine Williams.
Challenges in expanding coverage
Taking a look at the Altmetric 2014 Top 100 (our annual compilation of the scholarly outputs that got the most attention online), we can see that over 70% of the list is made up of articles from medicine or science.
There are other influences on this outcome too. As discussed in our blog post last week, the final outputs of many humanities and social science projects are often not journal articles, but books or other forms of content. Similarly, the way that people refer to those outputs online is very different from the way they discuss a journal article, making it more difficult to collate and correctly identify which research is being discussed (a book is more likely to cover several years' worth of work, so people will often refer to just the author and their research as a whole, rather than to a specific output).
In a talk given in London in September 2015, historian Melodee Beals discussed how altmetrics might apply to humanities scholars. Melodee spoke of her experience of integrating altmetrics into her workflows and of trying to encourage her fellow researchers to do the same. Metrics surrounding humanities content have always been harder to gather than those for scientific outputs, with many outputs often being under-represented. Melodee spoke of the need for good impact to be 'purposeful' – meaning that academics should know what impact they want their work to have, and why, and should embark on the most effective activities to achieve that impact.
She also championed consistent identifiers and a solid infrastructure – highlighting the use of an ORCID iD to help researchers be easily identified and rightly credited for their work.
Looking to the future
Altmetrics can provide valuable insight to scholars on how their work is being received and reused, no matter what their discipline of study. Providing a better indication of the amount and type of attention non-scientific research content is receiving online is a key priority for Altmetric – particularly as scholars in these disciplines increasingly look for better ways to get credit for, and evidence, the influence and dissemination of their work. We're already working to introduce better altmetrics for books (collated based on their ISBNs) in the near future, and alongside that we are consulting both our advisory board and the wider community to make sure that the attention data we provide for such content is relevant and timely.
As new forms of online content develop, and as humanities and social science scholars in particular find increasingly diverse forms for their research outputs to take, it'll be crucial for those involved in scholarly communication and evaluation to ensure their methods and sources provide a fair representation, no matter what the subject.
That’s all from Team Altmetric on The Source for now – we hope you’ve enjoyed our guest posts. You can also find us over on our own blog, http://www.altmetric.com/blog/, where we and a variety of guest contributors regularly post on all things altmetrics!