Messy metrics and beyond: strategy & practice in measuring impact of research in Learning Technology


The titles of the posts in this series are getting longer and longer, so it will soon be time to wrap it up. Here is where I am up to with this project:

I started by reviewing recent developments since the major report, The Metric Tide, came out in 2015. Then I reviewed progress on the various alternative or complementary services to traditional journal impact factors based on citations alone. This week, at the ILTA Conference in Ireland, I was able to compare notes with colleagues from a number of professional bodies which also publish their journals independently, and to get very useful feedback from authors and readers, too. I am really grateful to everyone who has been sending me links to further reading, including this recent post on how to use altmetrics in reward and pay negotiations.

Reflecting on this work over the past 2 months, I’ve come up with some practical ways forward:

  1. As an independent Open Access publisher, we’ll continue to enhance our publishing process by working with organisations like COPE to ensure that we have robust and up-to-date policies and practices for authors, reviewers and readers;
  2. We actually have a lot of data about the impact of going fully Open Access since 2012, but we could do a lot more to make this data accessible and useful to our audience and to demonstrate the journal’s impact;
  3. A number of technical developments, like getting altmetrics back up and running after our change of platforms last year, are a priority in order to build up a more consistent picture of impact; one advantage of taking ownership of the journal ourselves is that we will no longer be dependent on changing publishers to provide or keep this data (a minimal sketch of this kind of altmetrics lookup follows this list). GDPR is a factor here, too;
  4. One question to get some more advice on is how we can leverage our Open Access repository to best advantage for the journal and, similarly, how we can optimise the other online platforms we have;
  5. Last, I have come across many reasons why, despite its perceived dominance, the traditional impact factor is not the best fit for research in Learning Technology. Implemented effectively and consistently, alternatives are, I think, going to be more influential for most professionals in our messy, changing landscape of a discipline.
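On the third point above, and purely as an illustration: the sort of lookup we would need to automate can be done against the free Altmetric details API. The snippet below is a minimal sketch under that assumption; the DOI is a made-up placeholder rather than a real Research in Learning Technology article, and the response fields should be checked against the current Altmetric documentation before anything is built on them.

```python
# Minimal, illustrative sketch: query the public Altmetric details API for one DOI.
# The DOI below is a hypothetical placeholder, not a real journal article.
import requests

def fetch_altmetrics(doi):
    """Return Altmetric's summary data for a DOI, or None if it has no attention data."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:   # 404 simply means no attention data has been recorded
        return None
    resp.raise_for_status()
    return resp.json()

data = fetch_altmetrics("10.1234/example-doi")  # hypothetical DOI, for illustration only
if data:
    # Field names are taken from the public API docs and should be treated as assumptions.
    print(data.get("score"), data.get("cited_by_posts_count"))
```

Running something like this on a schedule for each published article would let us keep our own copy of the attention data, which is exactly the independence from any one publisher’s platform that point 3 is about.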

In 2012, when our journal was one of the first of its kind in the UK to adopt Gold Open Access, there were few peers who really thought that was a good idea. Many watched and learnt from the reports and guides we produced, but few felt able or willing to make the jump and adopt a different (business) model. This is a similar moment, but this time there are many pioneers in other disciplines we can learn from, as well as our own expertise in Open Education and online content provision to draw on. I can see strong strategic reasons for making the best use of technology, but also many practical ones. And alongside the developing plans for the journal, there are many ways in which research can be recognised beyond citations, beyond metrics. At the ILTA conference I mentioned one of the initiatives that I am involved in for ALT to do just that: the inaugural Learning Technology Research Project of the Year Awards (which, incidentally, are now open for entries until 18 June 2018).

Featured image: EdTechIE18 image by M Hawksey for ALT

The quality of metrics matters: looking ahead to #EdTechIE18

EdTechIE18 presentation slide

I am really looking forward to taking part in ILTA’s upcoming conference, TEL Quality Matters – People, Policies and Practices, 31 May – 1 June at IT Carlow. You can see the full programme at http://programme.exordo.com/edtech2018/.

As part of ALT’s new strategic working group for the development of the Open Access journal Research in Learning Technology, I have been trying to understand more about alternatives to the established Impact Factor, both for independent Open Access journals generally and for researchers in Learning Technology more specifically (revisit my first and second posts on this subject). Next week’s conference is a valuable opportunity for me to meet some of the colleagues who are part of the group and find out more about their experiences.

Meanwhile I have come across two interesting new articles (thanks to Neil Morris for making me aware of the blog post).

The first is another interesting post from the LSE Impact Blog, “The academic papers researchers regard as significant are not those that are highly cited”. The authors describe the current perspective as follows:

Citations, JIF, and h-index have served as the triumvirate of impact evaluation for many years, particularly in STEM fields, where journal articles are frequently published. Many studies have pointed out various flaws with reliance on these metrics, and over time, a plethora of complementary citation-based metrics have been created to try and address various deficiencies. At the same time, we see altmetrics emerging as a potential alternative or complement to citations, where we can collect different data about the ways in which research is viewed, saved, and shared online.
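For readers less familiar with the metrics named in that passage, a small worked example may help: the h-index is simply the largest number h such that an author has h papers with at least h citations each. The sketch below, with made-up citation counts, is illustrative only.

```python
# Illustrative only: compute an h-index from a list of citation counts.
def h_index(citation_counts):
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:   # at least `rank` papers have `rank` or more citations
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have at least 4 citations each
```

The Journal Impact Factor, by contrast, works at the level of the whole journal: citations received in a given year to items published in the previous two years, divided by the number of citable items published in those two years.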

The authors share useful insights from their work surveying chemistry researchers to “gauge their perceptions of significance, importance, and highly cited materials. The results, while not truly startling, were nevertheless a stark illustration of how different these concepts are.”

The post ends by reflecting on how meaningful assessment of research can be developed within individual disciplines, and I think this is a useful approach for Learning Technology in particular, given the diversity of research and research-active professionals, as well as the broad range of media and technologies we utilise.

Another interesting post I have come across is one from the Scholarly Kitchen which examines the role of preprint repositories and their impact on journal citation rates: https://scholarlykitchen.sspnet.org/2018/05/21/journals-lose-citations-preprint-servers-repositories/. The article is interesting in itself, but I found it particularly helpful for looking at how some of the practices in Open Access publishing have developed. What I am especially interested in is the messiness that results when different versions of the same research are linked to and cited on different platforms, and how this affects citation rates and access to research over time.

One of the questions in the post is why people keep linking to and citing preprints even when the published version of an article is available. There is no tidy way of making everything link up in one place, and so whichever platform is used most becomes the most useful, even though the life cycle of publication is designed with a different aim in mind. The post uses the example of bioRxiv to examine how this plays out, but the same can easily be applied to other, similar platforms, formal and informal.

The post concludes with this:

A citation is much more than a directional link to the source of a document. It is the basis for a system of rewarding those who make significant contributions to public science. Redirecting citations to preprint servers not only harms journals, which lose public recognition for publishing important work, but to the authors themselves, who may find it difficult to aggregate public acknowledgements to their work.

I am looking forward to exploring these and related questions next week. You’ll be able to access all the details of my presentation via the conference website: http://programme.exordo.com/edtech2018/delegates/presentation/1/. I’d also like to add a note of thanks to the wonderful Bryan Mathers for making his Visual Thinkery images available under Creative Commons licences and thus enabling me to use (and credit) them in my presentation. Thank you.