Messy metrics and beyond: strategy & practice in measuring impact of research in Learning Technology


The titles of the posts in this series are getting longer and longer, so it will soon be time to wrap it up. Here is where I am up to with this project:

I started by reviewing recent developments since the major report, The Metric Tide, came out in 2015. Then I reviewed progress on the various alternative or complementary services to traditional journal impact factors based on citations alone, and this week at the ILTA Conference in Ireland I was able to compare notes with colleagues from a number of professional bodies who also independently publish their journals, and to get very useful feedback from authors and readers, too. I am really grateful to everyone who has been sending me links to further reading, including this recent post on how to use altmetrics for reward and pay negotiations.

Reflecting on this work over the past 2 months, I’ve come up with some practical ways forward:

  1. As an independent Open Access publisher, we’ll continue to enhance our publishing process through working with organisations like COPE to ensure that we have robust and up to date policies and practices for authors, reviewers and readers;
  2. We actually have a lot of data about the impact of going fully Open Access since 2012, but we could do a lot more to make this accessible and useful to our audience and demonstrate the impact of the journal;
  3. There are a number of technical developments, like getting altmetrics back up and running after our change of platforms last year, that are a priority in order to build up a more consistent picture of impact. One advantage of taking ownership of the journal ourselves is that we will no longer be dependent on changing publishers to provide or keep this data. GDPR is a factor here, too;
  4. One question to get some more advice on is how we can leverage our Open Access repository to best advantage for the journal, and similarly how we can optimise the other online platforms we have;
  5. Last, I have come across many reasons why, despite its perceived dominance, the traditional impact factor is not the best fit for research in Learning Technology. Implemented effectively and consistently, I think alternatives are going to be more influential in our messy, changing landscape of a discipline for most professionals.

In 2012, when our journal was one of the first of its kind to adopt Gold Open Access in the UK, there were few peers who really thought that was a good idea. Many watched and learnt from the reports and guides we produced, but few felt able or willing to make the jump and adopt a different (business) model. This is a similar moment, but this time there are many pioneers we can learn from in other disciplines, as well as our own expertise in Open Education and online content provision. I can see strong strategic reasons for making the best use of technology, but also many practical ones. And alongside the developing plans for the journal, there are many ways in which research can be recognised beyond citations, beyond metrics. At the ILTA conference I mentioned one of the initiatives that I am involved in for ALT to do just that: the inaugural Learning Technology Research Project of the Year Awards (which, incidentally, are now open for entries until 18 June 2018).

Featured image: EdTechIE18 image by M Hawksey for ALT

The quality of metrics matters: looking ahead to #EdTechIE18

EdTechIE18 presentation slide

I am really looking forward to taking part in ILTA’s upcoming conference, TEL Quality Matters – People, Policies and Practices, 31 May – 1 June at IT Carlow. You can see the full programme here: http://programme.exordo.com/edtech2018/.

As part of ALT’s new strategic working group for the development of the Open Access journal Research in Learning Technology, I have been working to understand more about alternatives to the established Impact Factor, both for independent Open Access journals generally and for researchers in Learning Technology specifically (revisit my first and second posts on this subject). Next week’s conference is a valuable opportunity for me to meet some of the colleagues who are part of the group and find out more about their experiences.

Meanwhile I have come across two new interesting articles (thanks to Neil Morris for making me aware of the blog post).

The first is another interesting post from the LSE Impact Blog, ‘The academic papers researchers regard as significant are not those that are highly cited’. The authors describe the current perspective as follows:

Citations, JIF, and h-index have served as the triumvirate of impact evaluation for many years, particularly in STEM fields, where journal articles are frequently published. Many studies have pointed out various flaws with reliance on these metrics, and over time, a plethora of complementary citation-based metrics have been created to try and address various deficiencies. At the same time, we see altmetrics emerging as a potential alternative or complement to citations, where we can collect different data about the ways in which research is viewed, saved, and shared online.

The authors share useful insights from their work surveying chemistry researchers to “gauge their perceptions of significance, importance, and highly cited materials. The results, while not truly startling, were nevertheless a stark illustration of how different these concepts are.”

The post ends by reflecting on how meaningful assessment of research can be developed within individual disciplines, and I think this is a useful approach for Learning Technology in particular, given the diversity of research and research-active professionals as well as the broad range of media and technologies we utilise.

Another interesting post I have come across is this one from the Scholarly Kitchen, which examines the role of preprint repositories and their impact on journal citation rates: https://scholarlykitchen.sspnet.org/2018/05/21/journals-lose-citations-preprint-servers-repositories/. The article in itself is interesting, but in particular I found it helpful to look at how some of the practices in Open Access publishing have developed. What I am particularly interested in is the messiness that results in different versions of the same research being linked to and cited on different platforms, and how this impacts on citation rates and access to research over time.

One of the questions in the post is why people keep linking to and citing pre-prints even when the published version of an article is available. There is no tidy way of making everything link up in one place, and so whichever platform is used most becomes the most useful, even though the life cycle of publication is designed with a different aim in mind. The post uses the example of bioRxiv to examine this, but the analysis can easily be applied to other, similar platforms, formal and informal.
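
One reason this messiness persists is that the link between a preprint and its published version only exists where someone has deposited it. As a rough sketch (assuming Crossref’s public REST API and the ‘is-preprint-of’ relation metadata that servers like bioRxiv can record; field names may differ in practice), this is how one might check whether a preprint DOI points onward to a journal version:

    import json
    import urllib.request

    def published_version_of(preprint_doi):
        """Ask Crossref whether a preprint DOI carries a recorded link
        to a published journal version; return that DOI or None."""
        url = f"https://api.crossref.org/works/{preprint_doi}"
        with urllib.request.urlopen(url) as response:
            message = json.load(response)["message"]
        # Version links live under 'relation'; 'is-preprint-of' points
        # from a preprint to the article it later became.
        links = message.get("relation", {}).get("is-preprint-of", [])
        return links[0].get("id") if links else None

Where a lookup like this returns nothing, there is no tidy way for a reader or a platform to redirect attention to the version of record.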

The post concludes with this:

A citation is much more than a directional link to the source of a document. It is the basis for a system of rewarding those who make significant contributions to public science. Redirecting citations to preprint servers not only harms journals, which lose public recognition for publishing important work, but also the authors themselves, who may find it difficult to aggregate public acknowledgements to their work.

I am looking forward to exploring these and related questions next week. You’ll be able to access all details of my presentation via the conference website: http://programme.exordo.com/edtech2018/delegates/presentation/1/. I’d also like to add a note of thanks to the wonderful Bryan Mathers for making his Visual Thinkery available under Creative Commons licences and thus enabling me to use (and credit) them in my presentation. Thank you.

Which direction to take… researching alternative ways of measuring impact in Learning Technology

This is the second post about my current work on researching alternative ways of measuring impact in Learning Technology. Go back to the first post, in which I set out the context of my work and what I am particularly focused on.

Alongside the practical work with the ALT Journal Strategic Working Group, I am pleased that my proposal of a short session, ‘The quality of metrics matters: how we measure the impact of research in Learning Technology’, has been accepted for ILTA’s Annual Conference in Carlow, Ireland later this month.

In the meantime, I have been doing more reading and research into innovative ways of measuring impact, and this time my work has come up against some very practical questions, not least because as a UK-based publisher we are in the process of ensuring that the journal’s operations comply with the incoming GDPR legislation. Open Source journal systems are not at the forefront of GDPR readiness, and like other independent publishers we are working as part of the community to move towards compliance.

At first glance factors like GDPR may not seem to be closely related to how impact is measured, but my thinking links them closely as a lot of the opportunities around developing the journal are dependent on technical solutions that have data processing implications:

A convincing alternative
Discussing how important having an impact factor is quickly runs into the question of what the alternative looks like. As well as the technical challenges in implementing innovative tools or mechanisms for measuring impact (to which the new GDPR legislation adds another level of complexity), the sustainability and longevity of both tool and data storage need to be examined. For example, introducing a tool like Altmetric requires us to educate all stakeholders and ensure that the level of digital literacy required is not a barrier to making the tool useful. The user interface and experience need to be robust and practical, building confidence in alternative or innovative ways of measuring impact. With new tools and platforms being created all the time there is a certain amount of churn, and in order to really build a convincing alternative there needs to be a certain level of consistency.

Scrutiny of new vs. established ways of measuring impact
The kind of scrutiny with which we are examining alternative ways of measuring impact isn’t easily applied to the established method. There is a critical discourse, for example this recent blog post on the LSE impact blog, which argues:

Many research evaluation systems continue to take a narrow view of excellence, judging the value of work based on the journal in which it is published. Recent research by Diego Chavarro, Ismael Ràfols and colleagues shows how such systems underestimate and prove detrimental to the production of research relevant to important social, economic, and environmental issues. These systems also reflect the biases of journal citation databases which focus heavily on English-language research from the USA and north and western Europe. Moreover, topics covered by these databases often relate to the interests of industrial stakeholders rather than those of local communities. More inclusive research assessments are needed to overcome the ongoing marginalisation of some peoples, languages, and disciplines and promote engagement rather than elitism.

It’s really helpful to read this kind of perspective, but in my experience there is a strong sense that institutions and senior management place much importance on the established value of the impact factor. We have decided to carry out consultation with stakeholders, but in the absence of a convincing alternative (which in our case we simply haven’t had time to implement as yet) I am not sure what we would be asking our stakeholders to compare or comment on. There is such a range of options being implemented by Open Access publishers that we can learn a lot from their example and work towards putting in place improvements that will help establish what might be an alternative or a complementary perspective to the traditional impact factor.

Measuring beyond impact: peer review
Through our Editorial Board, the working group has now also begun to look at platforms like Publons, which promises to ‘integrate into the reviewer workflow so academics can track and verify every review and editorial contribution on the fly and in complete compliance with journal review policies’ (read more). It’s clearly a widely-used platform and some colleagues seem to be enthusiastic users, so it’s made me consider what this kind of platform could add to the user experience alongside innovative tools to measure impact. As a journal that does not charge any APCs, we offer authors a clear value proposition, but resources to improve the experience of reviewers are limited. More work is needed in this area to examine whether our efforts to improve the way impact is measured could be complemented by enhancing the experience of peer review.


Read more (with thanks to everyone who’s sent me comments or links):

Information for publishers from DOAJ: 
DOAJ does not believe in the value of impact factors, does not condone their use on journal web sites, does not recognise partial impact factors, and advocates any official, alternative measure of use, such as article level metrics.

There is only one official, universally recognised impact factor that is generated by Thomson Reuters; it is a proprietary measure run by a profit-making organisation. This runs against the ethics and principles of open access and DOAJ is impact-factor agnostic. DOAJ does not collect metadata on impact factors. Displaying impact factors on a home page is strongly discouraged and DOAJ perceives this as an attempt to lure authors in a dishonest way.

Full information here.

Researching alternative ways of measuring impact in Learning Technology

New altmetric donut from https://www.altmetric.com/blog/the-next-generation-of-altmetric-donut-is-here/

Last year I worked on finding a sustainable new home for the Open Access journal Research in Learning Technology. As part of my work for ALT, this was the third transition I have worked on since 2008 and during this period I have contributed to the thinking around Open Access publishing in Learning Technology, often through ALT’s contribution to initiatives such as the 2012/3 ‘Gold Open Access Project‘. This year I will be working with a new group set up by ALT to steer the future development of the journal:

A new Strategic Journal Working Group to help steer the development of the journal now being published by ALT in partnership with Open Academia has been established, and we are grateful that representatives from other scholarly bodies who are publishing in a similar model have agreed to join the group to share best practice and support each other. The group is chaired by Prof Neil Morris, who also chairs the Editorial Board, and we are delighted to welcome colleagues from ascilite, ILTA and the OLC alongside our Editors.

As well as learning from each other, the group is going to be examining alternative ways of measuring impact (alternative to the established impact factor, which the journal has not been awarded to date). This is an area I am particularly interested in for three reasons:

Knowledge exchange happens elsewhere
Firstly, much of the most cutting-edge research and practice in Learning Technology is not published in formal journals. Even the most responsive Open Access peer-review system can’t necessarily keep pace with the quickly changing technology landscape we work in, and so less formal ways of knowledge exchange on blogs, on social media or in project reports are often more important and useful.

Different media
Secondly, a lot of the most interesting ideas may be shared as videos, drawings, data visualisations and so on; in short, they may not easily fit into the traditional formats that measures like an impact factor were designed for. What we cite and where we link to can be harder to track. As we use new technologies to communicate and share information, the way in which we cite and link to sources needs to adapt.

Crossing boundaries
Another aspect of what makes measuring impact interesting in Learning Technology is the way we cross boundaries of disciplines in research, policy and practice. Even coming from a discipline like Anthropology, which has a hugely broad frame of reference depending on what you specialise in, I am still struck by the extent to which the work of Learning Technologists crosses boundaries.

So, keeping all this in mind, here is where I am in my work to research alternative ways of measuring impact…

I started with a blog post, ‘DOAJ LAUNCHES THE DOAJ BEST PRACTICE GUIDE’, which I came across as Research in Learning Technology was recently awarded the DOAJ best practice seal. It’s a useful new guide that provides a lot of helpful information to publishers, authors and policy makers interested in Open Access publishing. One of the resources it referred me to was a tool for authors called ThinkCheckSubmit. Whilst not specifically talking about how the impact of a journal is measured, it does ask authors to check the publisher’s information, for example how the journal is indexed or whether the publisher is a member of OASPA or COPE.

Also in a blog post, this time on the Altmetric website, I discovered that “the next generation of Altmetric donut is here!“. If you are new to altmetrics, here is how they explain what it’s all about:

Altmetrics are metrics and qualitative data that are complementary to traditional, citation-based metrics. They can include (but are not limited to) peer reviews on Faculty of 1000, citations on Wikipedia and in public policy documents, discussions on research blogs, mainstream media coverage, bookmarks on reference managers like Mendeley, and mentions on social networks such as Twitter.

Sourced from the Web, altmetrics can tell you a lot about how often journal articles and other scholarly outputs like datasets are discussed and used around the world. For that reason, altmetrics have been incorporated into researchers’ websites, institutional repositories, journal websites, and more.
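
To make that concrete, here is a minimal sketch of what retrieving the attention data for a single article might look like. It assumes Altmetric’s free v1 API endpoint and a couple of its JSON field names, and uses a placeholder DOI; a production integration would also need an API key and rate-limit handling:

    import json
    import urllib.error
    import urllib.request

    def fetch_altmetrics(doi):
        """Fetch the attention record Altmetric holds for a DOI;
        return None if the DOI is not tracked (HTTP 404)."""
        url = f"https://api.altmetric.com/v1/doi/{doi}"
        try:
            with urllib.request.urlopen(url) as response:
                return json.load(response)
        except urllib.error.HTTPError as err:
            if err.code == 404:
                return None
            raise

    record = fetch_altmetrics("10.1234/placeholder-doi")  # hypothetical DOI
    if record:
        print(record.get("title"), record.get("score"))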

Whilst I have been familiar with altmetrics for some time, I hadn’t actually come across the history in much detail, and I found it really helpful to visit http://altmetrics.org/manifesto/ and read up on some of the older posts. It gave me a better insight into the thinking that informed the development of the tools and policies involved. It also reminded me of the 2015 HEFCE publication called “The Metric Tide”, which includes an executive summary, literature review and correlation analysis. Introducing its recommendations, the report states:

These recommendations are underpinned by the notion of ‘responsible metrics’ as a way of framing appropriate uses of quantitative indicators in the governance, management and assessment of research. Responsible metrics can be understood in terms of the following dimensions:

  • Robustness: basing metrics on the best possible data in terms of accuracy and scope
  • Humility: recognising that quantitative evaluation should support – but not supplant – qualitative, expert assessment
  • Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results
  • Diversity: accounting for variation by field, and using a range of indicators to reflect and support a plurality of research and researcher career paths across the system
  • Reflexivity: recognising and anticipating the systemic and potential effects of indicators, and updating them in response.

The recommendations outlined in the report apply mostly to HEIs, funders and government bodies. There are some, however, that are directly aimed at publishers. These are:

Publishers should reduce emphasis on journal impact factors as a promotional tool, and only use them in the context of a variety of journal-based metrics that provide a richer view of performance. As suggested by DORA, this broader indicator set could include 5-year impact factor, EigenFactor, SCImago, editorial and publication times. Publishers, with the aid of Committee on Publication Ethics (COPE), should encourage responsible authorship practices and the provision of more detailed information about the specific contributions of each author. Publishers should also make available a range of article-level metrics to encourage a shift toward assessment based on the academic quality of an article rather than JIFs. (Publishers)

Publishers should mandate ORCID iDs and ISNIs and funder grant references for article submission, and retain this metadata throughout the publication lifecycle. This will facilitate exchange of information on research activity, and help deliver data and metrics at minimal burden to researchers and administrators. (Publishers and data providers)
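
The second of these recommendations is also one of the more tractable ones technically. As a small illustration (a sketch only, not any particular submission system), this is the check-digit validation that ORCID documents for its iDs, based on the ISO 7064 MOD 11-2 algorithm, which a submission form could run before accepting an iD:

    def orcid_checksum_ok(orcid_id):
        """Validate an ORCID iD's final check character using the
        ISO 7064 MOD 11-2 scheme from ORCID's documentation."""
        digits = orcid_id.replace("-", "")
        if len(digits) != 16:
            return False
        total = 0
        for char in digits[:-1]:
            if not char.isdigit():
                return False
            total = (total + int(char)) * 2
        result = (12 - total % 11) % 11
        expected = "X" if result == 10 else str(result)
        return digits[-1].upper() == expected

    # The sample iD from ORCID's own documentation passes the check:
    print(orcid_checksum_ok("0000-0002-1825-0097"))  # True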

Interestingly, there are a number of recommendations for HEFCE and future REF exercises that, as far as I can tell, do not seem to have been picked up, given the recent closure of HEFCE. Still, it is useful to revisit this report and its recommendations within the wider context of thinking about alternative ways of measuring impact.

I also came across an ebook that is new to me, published by Altmetric and Scholastica, entitled “The Evolution of Impact Indicators”. The whole publication looks extremely useful and has a lot of references that are relevant to my work, but the chapter that I am particularly interested in is called “Beyond the Impact Factor”. It discusses a number of alternatives to the impact factor, including the EigenFactor and the h-index. The h-index is probably the one I am most familiar with, but it’s also useful to remind myself of how it is tracked:

Google Scholar: Google Scholar provides the h index for authors who have created a profile.

Publish or Perish: Publish or Perish is a software program that retrieves and analyzes academic citations from Google Scholar and provides the h index among other metrics. Publish or Perish is handy for obtaining the h index for authors who do not have a Google Scholar profile.

Scopus: Scopus provides a Citation Tracker feature that allows for generation of a Citation Overview chart to generate a h index for publications and citations from 1970 to current. The feature also allows for removal of self-citations from the overall citation counts.

Web of Science: Web of Science allows for generation of the h index for publications and citations from 1970 to current using the “Create Citation Report” feature.
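
Since these sources describe where to look the number up rather than how it is calculated, it is worth restating the definition: an author’s h-index is the largest h such that h of their papers have each been cited at least h times. A minimal sketch of the calculation from a list of citation counts:

    def h_index(citations):
        """Largest h such that at least h papers have >= h citations each."""
        h = 0
        for rank, count in enumerate(sorted(citations, reverse=True), start=1):
            if count >= rank:
                h = rank  # this paper still supports an h of this size
            else:
                break
        return h

    # Five papers cited 10, 8, 5, 4 and 3 times give an h-index of 4.
    print(h_index([10, 8, 5, 4, 3]))  # 4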

Now that I have started to refresh my memory of some recent developments, my next step will be to take this back to my desk, do some work on the journal itself and compare notes with my colleagues from the other publishers.


Cemeteries of the web: parallels between Victorian burial culture and digital infrastructure

Image of cemetery

For over ten years I’ve been working in Learning Technology, but before then I spent five years doing research as an Anthropologist. I wrote a thesis about cemeteries, and more specifically about the contested nature of cemeteries as cultural and material spaces. I often get asked what the links are between my work in Anthropology and Learning Technology, and for me there are many. One of the strongest is that in both cases what I am most interested in is how we deal with change – and what’s left behind.

I’ve also been catching up on a year’s worth of The Contrafabulists podcasts and episode 18, recorded 14 August 2016, deals with questions around permanency online, ownership of domains and digital infrastructure – our control or lack thereof over these issues and so forth (it’s a great podcast series by Audrey Watters and Kin Lane so if you haven’t listened to it, I think you should).

Whilst listening it struck me that there are interesting parallels between what I studied and what this episode of the podcast was about, between Victorian burial culture and digital infrastructure. Here are some examples:

The illusion of permanency: one commonality, for example, is that a lot of digital infrastructure gives a promise of permanency in order to secure our engagement and content, and Victorian entrepreneurs created urban cemeteries with the same promise. In the digital realm your posts, pictures or updates remain in place while they are valuable to the platform, but can disappear or become inaccessible with little or no notice. The newly created burial space in Victorian cities would similarly be described as a place for eternity, not just safeguarding bodily remains, but securing status and remembrance for future generations. And like their digital counterparts, cemeteries, too, could disappear for building projects or urban development, with gravestones stacked unceremoniously against a wall or used as paving material.

Ineffective legal/governance frameworks: another commonality, and a key issue for both, is the lack of an effective legal or governance framework – for example, platform user agreements that are too complex to understand or too difficult to enforce, and in particular frameworks that do not take into account what happens when things change, beyond the current profit predictions. Like the commercial cemetery companies in Victorian London, tech companies often operate and grow on the basis of quickly realised profits. Not many plan for the long term.

Perpetuating inequality: similar to the way in which digital infrastructure helps shape and control the actions of, and narrative around, our lives, Victorian cemeteries were designed to do the same. Through their architecture, which echoed classical eras, through their layout, which privileged the wealthy and powerful, to the burial culture, which assigned places to men, women and children according to their status and station as well as religion, and even extending to the landscape and natural elements like plants and views, every element of the space was designed to construct a narrative of power. The history that Victorian burial culture records is the history of the ruling class.

There are many other examples, but what is most pertinent for me is that at the height of their popularity, Victorian cemeteries and the burial culture they embodied seemed unassailable, completely dominant. They had a deep impact on contemporary culture and development. What they celebrated and assigned value to was shaped by, but also influenced, Victorian society and culture in turn, spreading far beyond London and even England’s physical borders, across the world throughout the British Empire. There was no notion then that, less than a century later, very little of this culture would endure. Today, many of the most political, most powerful spaces of Victorian burial culture have become nature reserves, tourist attractions or slowly decaying urban wastelands.

Similarly, parts of our digital infrastructure can seem so dominant, so ubiquitous, that it is hard to imagine what’s beyond them. In both cases we have limited control over what matters to us, and enforcing it comes with compromises. In general, digital platforms operate on the premise that we either ignore or accept an uncertain future, or otherwise make our own provision to whatever extent we can – by securing our own domains and data.

Back from #altc2013

https://twitter.com/A_L_T/status/378305640440938496

Back from the ALT annual conference, this year celebrating 20 years of ALT, and catching up with all the things I missed over the past three days. In addition to all the blog posts and tweets, one news item that caught my eye this morning is the Technology in FE and Skills supplement published today by FE Week. There is a short interview with me in it and a lot of interesting features with participants from across the conference, including the Learning Technologist of the Year Award #ltaward. I will also have a look at the open online platform to watch some of the already-aired YouTube interviews from the live broadcast. Looking forward to next year in Warwick!

The cemeteryscapes archives

For the past two years I have edited and compiled the cemeteryscapes blog together with many contributors who kindly sent us their pictures, links and articles. The blog started as a community project during the last year of my PhD and then gained a modest but loyal community of readers across the world. We featured cemeteryscapes from many countries and material culture from Africa, the United States, Scandinavia, Southern Europe and the UK – everything from coffin exhibitions to boneyards, from conservation to natural burial practices. Now the blog has been turned into an archive which will continue to be available online and as a PDF, and I hope that it will continue to be interesting and relevant to the active community of cemetery researchers which I have had the pleasure to be a part of.