
Messy metrics and beyond: strategy & practice in measuring impact of research in Learning Technology

The titles of the posts in this series are getting longer and longer, so it will soon be time to wrap it up. Here is where I am up to with this project:

I started by reviewing recent developments since the major report, the Metric Tide, came out in 2015. Then I reviewed progress on the various alternatives or complementary services to traditional journal impact factors based on citations alone, and this week at the ILTA Conference in Ireland I was able to compare notes with colleagues from a number of professional bodies that also publish their journals independently, and to get very useful feedback from authors and readers, too. I am really grateful to everyone who has been sending me links to further reading, including this recent post on how to use altmetrics for reward and pay negotiations.

Reflecting on this work over the past two months, I’ve come up with some practical ways forward:

  1. As an independent Open Access publisher, we’ll continue to enhance our publishing process through working with organisations like COPE to ensure that we have robust and up-to-date policies and practices for authors, reviewers and readers;
  2. We actually have a lot of data about the impact of going fully Open Access since 2012, but we could do a lot more to make this accessible and useful to our audience and to demonstrate the impact of the journal;
  3. A number of technical developments, like getting altmetrics back up and running after our change of platforms last year, are a priority in order to build up a more consistent picture of impact (see the sketch after this list for one way such data can be collected). One advantage of taking ownership of the journal ourselves is that we will no longer be dependent on changing publishers to provide or keep this data. GDPR is a factor here, too;
  4. One question to get some more advice on is how we can leverage our Open Access repository to best advantage for the journal and, similarly, how we can optimise the other online platforms we have;
  5. Last, I have come across many reasons why, despite its perceived dominance, the traditional impact factor is not the best fit for research in Learning Technology. Implemented effectively and consistently, I think alternatives will be more influential for most professionals in our messy, changing landscape of a discipline.

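To illustrate point 3, here is a minimal sketch of what pulling article-level attention data for a single DOI from Altmetric’s free public API could look like. The endpoint is Altmetric’s public one, but the DOI, function name and field handling are placeholders for illustration, not a description of our actual tooling.

```python
import json
import urllib.error
import urllib.request

# Altmetric's free public API endpoint for DOI lookups (rate-limited).
ALTMETRIC_DOI_ENDPOINT = "https://api.altmetric.com/v1/doi/"


def fetch_altmetrics(doi):
    """Return the Altmetric record for a DOI, or None if no data is tracked."""
    try:
        with urllib.request.urlopen(ALTMETRIC_DOI_ENDPOINT + doi) as response:
            return json.load(response)
    except urllib.error.HTTPError as error:
        if error.code == 404:  # Altmetric has no attention data for this DOI
            return None
        raise


if __name__ == "__main__":
    record = fetch_altmetrics("10.1000/example-doi")  # placeholder DOI
    if record is None:
        print("No altmetric data found for this DOI.")
    else:
        # 'score' is Altmetric's weighted attention score; other fields
        # (counts per source, publication details) vary by record.
        print("Altmetric attention score:", record.get("score"))
```

Running something like this across the journal’s back catalogue of DOIs is one way to rebuild a consistent picture of attention data independently of any particular publishing platform.
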
In 2012, when our journal was one of the first of its kind to adopt Gold Open Access in the UK, few peers really thought that was a good idea. Many watched and learnt from the reports and guides we produced, but few felt able or willing to make the jump to adopt a different (business) model. This is a similar moment, but this time there are many pioneers in other disciplines we can learn from, and we can also draw on our own expertise in Open Education and online content provision. I can see strong strategic reasons for making the best use of technology, but also many practical ones. And alongside the developing plans for the journal, there are many ways in which research can be recognised beyond citations, beyond metrics. At the ILTA Conference I mentioned one of the initiatives I am involved in for ALT to do just that: the inaugural Learning Technology Research Project of the Year Awards (which, incidentally, are now open for entries until 18 June 2018).

Featured image: EdTechIE18 image by M Hawksey for ALT