Which direction to take… researching alternative ways of measuring impact in Learning Technology

This is the second post about my current work on researching alternative ways of measuring impact in Learning Technology. Go back to the first post, in which I set out the context of my work and what I am particularly focusing on.

Alongside the practical work with the ALT Journal Strategic Working Group, I am pleased that my proposal for a short session, ‘The quality of metrics matters: how we measure the impact of research in Learning Technology’, has been accepted for ILTA’s Annual Conference in Carlow, Ireland later this month.

In the meantime, I have been doing more reading and research into innovative ways of measuring impact, and this time my work has come up against some very practical questions, not least because, as a UK-based publisher, we are in the process of ensuring that the journal’s operations comply with the incoming GDPR legislation. Open Source journal systems are not at the forefront of compliance, and like other independent publishers we are working as part of the community to move towards it.

At first glance, factors like GDPR may not seem closely related to how impact is measured, but my thinking links them closely, as many of the opportunities around developing the journal depend on technical solutions that have data processing implications:

A convincing alternative
Discussing how important having an impact factor is quickly runs into the question of what the alternative looks like. As well as the technical challenges in implementing innovative tools or mechanisms for measuring impact (to which the new GDPR legislation adds another level of complexity), the sustainability and longevity of both the tool and the data storage need to be examined. For example, introducing a tool like Altmetric requires us to educate all stakeholders and ensure that the level of digital literacy required is not a barrier to making the tool useful. The user interface and experience need to be robust and practical, building confidence in alternative or innovative ways of measuring impact. With new tools and platforms being created all the time there is a certain amount of churn, and in order to really build a convincing alternative there needs to be a certain level of consistency.

Scrutiny of new vs. established ways of measuring impact
The kind of scrutiny with which we are examining alternative ways of measuring impact isn’t easily applied to the established method. There is a critical discourse, for example this recent blog post on the LSE impact blog, which argues:

Many research evaluation systems continue to take a narrow view of excellence, judging the value of work based on the journal in which it is published. Recent research by Diego Chavarro, Ismael Ràfols and colleagues shows how such systems underestimate and prove detrimental to the production of research relevant to important social, economic, and environmental issues. These systems also reflect the biases of journal citation databases which focus heavily on English-language research from the USA and north and western Europe. Moreover, topics covered by these databases often relate to the interests of industrial stakeholders rather than those of local communities. More inclusive research assessments are needed to overcome the ongoing marginalisation of some peoples, languages, and disciplines and promote engagement rather than elitism.

It’s really helpful to read this kind of perspective, but in my experience there is a strong sense that institutions and senior management place much importance on the established value of the impact factor. We have decided to carry out a consultation with stakeholders, but in the absence of a convincing alternative (which in our case we simply haven’t had time to implement as yet) I am not sure what we would be asking our stakeholders to compare or comment on. There is such a range of options being implemented by Open Access publishers that we can learn a lot from their example and work towards putting in place improvements that will help establish an alternative or complementary perspective to the traditional impact factor.

Measuring beyond impact: peer review
Through our Editorial Board, the working group has now also begun to look at platforms like Publons, which promises to ‘integrate into the reviewer workflow so academics can track and verify every review and editorial contribution on the fly and in complete compliance with journal review policies’ (read more). It’s clearly a widely-used platform and some colleagues seem to be enthusiastic users, so it’s made me consider what this kind of platform could add to the user experience alongside innovative tools to measure impact. As a journal that does not charge any APCs, we offer authors a clear value proposition, but our resources to improve the experience of reviewers are limited. More work is needed in this area to examine whether our efforts to improve the ways in which impact is measured could be complemented by enhancing the experience of peer review.

 

Read more (with thanks to everyone who’s sent me comments or links):

Information for publishers from DOAJ: 
DOAJ does not believe in the value of impact factors, does not condone their use on journal web sites, does not recognise partial impact factors, and advocates any official, alternative measure of use, such as article level metrics.

There is only one official, universally recognised impact factor that is generated by Thomson Reuters; it is a proprietary measure run by a profit-making organisation. This runs against the ethics and principles of open access and DOAJ is impact-factor agnostic. DOAJ does not collect metadata on impact factors. Displaying impact factors on a home page is strongly discouraged and DOAJ perceives this as an attempt to lure authors in a dishonest way.

Full information here.

Re-post #altc: My Chief Executive Officer’s Report, May 2018

This is my report to Members of ALT for May 2018, originally published on the #altc blog here.

Dear Members

As I am writing this we are just beginning a particularly busy period for the Association, so my report to you this time will be a whistle-stop tour of what’s happening across our community. I am pleased in particular to welcome new member organisations who have recently joined ALT: Staffordshire University, Ajenta, MyKnowledgeMap, TES, Northern Regional College, University of Dundee and Wrexham Glyndwr University.

A global perspective on professionalisation in Learning Technology

On 3 May 2018 I was honoured to join the BOLT project organisers and partners at the Hong Kong Polytechnic University at their blended learning Symposium @PolyU, a celebration and culmination of the BOLT project. As this 4-year University Grants Committee-funded project draws to a close, the Symposium celebrated its impact so far – evidenced by its shortlisting for the Reimagine Education 2017 awards – and also looked to the future and its sustainable legacy. A particular highlight for me was being invited to present two members of staff, Seth Neeley and Arinna Nga Ying Lee, with their CMALT Certificates.

Congratulations to Seth, Arinna and the 20 other individuals who have achieved CMALT accreditation so far this year.  

Launching a new Award for the Learning Technology Research Project of the Year

One of our strategic priorities for this year is to enhance recognition for research in Learning Technology, and the launch of this year’s Award helps us achieve this aim. The ALT Learning Technologist of the Year Awards celebrate and reward excellent research and practice and outstanding achievement in Learning Technology. Established in 2007, the Awards have set a benchmark for outstanding achievement in Learning Technology on a national scale and attract competitive entries from the UK and internationally. All entries are reviewed by an independent judging panel chaired by the President of ALT. We gratefully acknowledge the support of our sponsors Catalyst, open source technologists, for the Awards this year. The Awards are now open for entries.

We seek Member input for UNESCO Recommendation on Open Educational Resources (OER)

ALT is collating a response to the following UNESCO consultation. Please use this shared doc to provide input: https://go.alt.ac.uk/2FsAbBA. If you would like to provide input for the response, please use the heading structure provided. Alternatively, you can email your contribution to maren.deepwell@alt.ac.uk. The deadline for responses is 1 June 2018.

General Data Protection Regulation (GDPR)

As I’m sure many of you are aware, new data protection regulations come into effect at the end of this month. We welcome these regulations as they will hopefully bring greater transparency to the use of personal data, including in learning and teaching. In preparation for the new regulation we have updated ALT’s own Privacy Policy and carried out a number of actions in line with guidance from the ICO.

We have also been supporting Members with a series of webinars to raise awareness around GDPR in learning and teaching. As part of this we were delighted to host Martin Dougiamas, Moodle Founder and CEO, along with Gavin Henrick, Moodle Business Development Manager, to highlight actions the Moodle community has taken, Stephan Geering, Blackboard Global Privacy Officer and Associate General Counsel, and Mark Glynn, Head of the Teaching Enhancement Unit at DCU. If you missed any of these sessions, recordings and resources have been added to the event pages accessible from the past events section of our website.

ALT Annual Survey data & report

I will conclude my report by reflecting briefly on the findings from this year’s ALT Annual Survey, the report of which was published in March by my colleague Martin Hawksey. As with previous years, the Annual Survey is designed to:

  • understand current and future practice;
  • show how Learning Technology is used across sectors; and
  • help map the ALT strategy to professional practice to better meet the needs of and represent our Members.

With the survey in its fourth year we are able to record and report a number of changes. This year some of the biggest changes are in the enablers and drivers for the use of Learning Technology. The insights gained go beyond the trends in technology and organisational change; they help us understand the needs of staff enabling students and building a more empowered relationship with Learning Technology. These are themes that we can look forward to exploring further at ALT’s Annual Conference this September.

Maren Deepwell

 Maren Deepwell, Chief Executive of the Association for Learning Technology (ALT), @marendeepwell

If you enjoyed reading this article we invite you to join the Association for Learning Technology (ALT) as an individual member, and to encourage your own organisation to join ALT as an organisational or sponsoring member.

 

Researching alternative ways of measuring impact in Learning Technology

New altmetric donut from https://www.altmetric.com/blog/the-next-generation-of-altmetric-donut-is-here/

Last year I worked on finding a sustainable new home for the Open Access journal Research in Learning Technology. This was the third transition I have worked on as part of my work for ALT since 2008, and during this period I have contributed to the thinking around Open Access publishing in Learning Technology, often through ALT’s contribution to initiatives such as the 2012/3 ‘Gold Open Access Project‘. This year I will be working with a new group set up by ALT to steer the future development of the journal:

A new Strategic Journal Working Group has been established to help steer the development of the journal, which is now being published by ALT in partnership with Open Academia, and we are grateful that representatives from other scholarly bodies who publish in a similar model have agreed to join the group to share best practice and support each other. The group is chaired by Prof Neil Morris, who also chairs the Editorial Board, and we are delighted to welcome colleagues from ascilite, ILTA and the OLC alongside our Editors.

As well as learning from each other, the group is going to be examining alternative ways of measuring impact (alternative to the established impact factor, which the journal has not been awarded to date). This is an area I am particularly interested in for three reasons:

Knowledge exchange happens elsewhere
Firstly, much of the most cutting-edge research and practice in Learning Technology is not published in formal journals. Even the most responsive Open Access peer-review system can’t necessarily keep pace with the quickly changing technology landscape we work in, and so less formal ways of knowledge exchange, on blogs, on social media or in project reports, are often more important and useful.

Different media
Secondly, a lot of the most interesting ideas may be shared as videos, drawings, data visualisations and so on; in short, they may not easily fit into the traditional formats that measures like the impact factor were designed for. What we cite and where we link to can be harder to track. As we use new technologies to communicate and share information, the way in which we cite and link to sources needs to adapt.

Crossing boundaries
Another aspect of what makes measuring impact interesting in Learning Technology is the way we cross boundaries of disciplines in research, policy and practice. Coming from a discipline like Anthropology, which has a hugely broad frame of reference depending on what you specialise in, I am still struck by the extent to which the work of Learning Technologists crosses boundaries.

So, keeping all this in mind, here is where I am in my work to research alternative ways of measuring impact…

I started with a blog post, DOAJ LAUNCHES THE DOAJ BEST PRACTICE GUIDE, which I came across as Research in Learning Technology was recently awarded the DOAJ best practice seal. It’s a useful new guide that provides a lot of helpful information to publishers, authors and policy makers interested in Open Access publishing. One of the resources it referred me to was a tool for authors called ThinkCheckSubmit. Whilst not specifically talking about how the impact of the journal is measured, it does ask authors to check the publisher’s information, for example how the journal is indexed or whether the publisher is a member of OASPA or COPE.

Also in a blog post, this time on the Altmetric website, I discovered that “the next-generation of Altmetric donut is here!“. If you are new to altmetrics, here is how they explain what it’s all about:

Altmetrics are metrics and qualitative data that are complementary to traditional, citation-based metrics. They can include (but are not limited to) peer reviews on Faculty of 1000, citations on Wikipedia and in public policy documents, discussions on research blogs, mainstream media coverage, bookmarks on reference managers like Mendeley, and mentions on social networks such as Twitter.

Sourced from the Web, altmetrics can tell you a lot about how often journal articles and other scholarly outputs like datasets are discussed and used around the world. For that reason, altmetrics have been incorporated into researchers’ websites, institutional repositories, journal websites, and more.
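
To make this a little more concrete, the sketch below shows how a publisher or author might look up this kind of attention data for a single article. It is a minimal illustration based on my understanding of Altmetric’s public Details Page API; the endpoint, the example DOI and the response field names are assumptions here and should be checked against the current Altmetric documentation (and its terms of use) before relying on them.

```python
# Minimal sketch (assumptions noted in the text above): fetch the altmetrics
# record for one article by DOI from Altmetric's public Details Page API.
import json
import urllib.error
import urllib.request


def fetch_altmetrics(doi: str) -> dict:
    """Return the Altmetric record for a DOI, or an empty dict if none is found."""
    url = f"https://api.altmetric.com/v1/doi/{doi}"
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            return json.load(response)
    except urllib.error.HTTPError:
        # A 404 usually just means no online attention has been tracked yet.
        return {}


if __name__ == "__main__":
    record = fetch_altmetrics("10.25304/rlt.v26.0000")  # hypothetical example DOI
    if record:
        # Field names assumed from typical Altmetric responses.
        print("Altmetric score:", record.get("score"))
        print("Tweets:", record.get("cited_by_tweeters_count", 0))
        print("Blog mentions:", record.get("cited_by_feeds_count", 0))
    else:
        print("No altmetric data recorded for this DOI yet.")
```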

Whilst I have been familiar with altmetrics for some time, I hadn’t actually come across their history in much detail, and I found it really helpful to visit http://altmetrics.org/manifesto/ and read up on some of the older posts. It gave me a better insight into the thinking that informed the development of the tools and policies involved. It also reminded me of the 2014/5 HEFCE publication called “The Metric Tide”, which includes an executive summary, literature review and correlation analysis. Among the recommendations featured in the report, it states:

These recommendations are underpinned by the notion of ‘responsible metrics’ as a way of framing appropriate uses of quantitative indicators in the governance, management and assessment of research. Responsible metrics can be understood in terms of the following dimensions:

  • Robustness: basing metrics on the best possible data in terms of accuracy and scope
  • Humility: recognising that quantitative evaluation should support – but not supplant – qualitative, expert assessment
  • Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results
  • Diversity: accounting for variation by field, and using a range of indicators to reflect and support a plurality of research and researcher career paths across the system
  • Reflexivity: recognising and anticipating the systemic and potential effects of indicators, and updating them in response.

The recommendations outlined in the report apply mostly to HEIs, funders and government bodies. There are some however that are directly aimed at publishers. These are:

Publishers should reduce emphasis on journal impact factors as a promotional tool, and only use them in the context of a variety of journal-based metrics that provide a richer view of performance. As suggested by DORA, this broader indicator set could include 5-year impact factor, EigenFactor, SCImago, editorial and publication times. Publishers, with the aid of Committee on Publication Ethics (COPE), should encourage responsible authorship practices and the provision of more detailed information about the specific contributions of each author. Publishers should also make available a range of article-level metrics to encourage a shift toward assessment based on the academic quality of an article rather than JIFs. (Publishers)

Publishers should mandate ORCID iDs and ISNIs and funder grant references for article submission, and retain this metadata throughout the publication lifecycle. This will facilitate exchange of information on research activity, and help deliver data and metrics at minimal burden to researchers and administrators. (Publishers and data providers)
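
As a small, practical aside on the ORCID recommendation: one step a publisher can take at submission time is to check that an ORCID iD is at least well formed before storing it as metadata. The sketch below is my own illustration, not part of the report and not an official ORCID library; it only verifies the ISO 7064 MOD 11-2 check digit that ORCID iDs use and does not confirm that the iD is actually registered.

```python
# Minimal sketch: validate the structure and check digit of an ORCID iD.
# ORCID iDs carry an ISO 7064 MOD 11-2 checksum over the first 15 digits;
# the 16th character is the check digit (0-9, or X standing for 10).
import re


def orcid_checksum_ok(orcid: str) -> bool:
    """Return True if the ORCID iD has a valid structure and check digit."""
    digits = orcid.replace("-", "").upper()
    if not re.fullmatch(r"\d{15}[\dX]", digits):
        return False
    total = 0
    for ch in digits[:-1]:
        total = (total + int(ch)) * 2
    remainder = total % 11
    check = (12 - remainder) % 11
    expected = "X" if check == 10 else str(check)
    return digits[-1] == expected


if __name__ == "__main__":
    # 0000-0002-1825-0097 is the example iD used in ORCID's own documentation.
    print(orcid_checksum_ok("0000-0002-1825-0097"))  # True
    print(orcid_checksum_ok("0000-0002-1825-0098"))  # False (wrong check digit)
```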

Interestingly, there are a number of recommendations for HEFCE and future REF exercises that, as far as I can tell, do not seem to have been picked up, given the recent closure of HEFCE. Still, it is useful to revisit this report and its recommendations within the wider context of thinking about alternative ways of measuring impact.

I also came across an ebook that is new to me, published by Altmetric and Scholastica, entitled “The Evolution of Impact Indicators”. The whole publication looks extremely useful and has a lot of references that are relevant to my work, but the chapter that I am particularly interested in is called “Beyond the Impact Factor”. It discusses a number of alternatives to the impact factor, including the EigenFactor and the H-Index. The H-index is probably the one I am most familiar with, but it’s also useful to remind myself of how it is tracked (a small worked example follows the list below):

Google Scholar: Google Scholar provides the h index for authors who have created a profile.

Publish or Perish: Publish or Perish is a software program that retrieves and analyzes academic citations from Google Scholar and provides the h index among other metrics. Publish or Perish is handy for obtaining the h index for authors who do not have a Google Scholar profile.

Scopus: Scopus provides a Citation Tracker feature that allows for generation of a Citation Overview chart to generate a h index for publications and citations from 1970 to current. The feature also allows for removal of self-citations from the overall citation counts.

Web of Science: Web of Science allows for generation of the h index for publications and citations from 1970 to current using the “Create Citation Report” feature.
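
To make the definition behind all of these tools concrete, here is a minimal sketch of my own (not taken from the ebook) of how an h-index is computed from a list of per-paper citation counts: the h-index is the largest number h such that at least h papers each have at least h citations.

```python
# Minimal sketch: compute an h-index from a list of citation counts,
# one count per paper. The h-index is the largest h such that at least
# h papers each have at least h citations.
def h_index(citations: list[int]) -> int:
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h


if __name__ == "__main__":
    # Example: five papers cited 10, 8, 5, 4 and 3 times give an h-index of 4,
    # because four papers have at least 4 citations but not five with at least 5.
    print(h_index([10, 8, 5, 4, 3]))  # 4
```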

Now that I have started to refresh my memory of some recent developments, my next step will be to take this back to my desk, do some work on the journal itself and compare notes with my colleagues from the other publishers.

Read more: