Which direction to take… researching alternative ways of measuring impact in Learning Technology

This is the second post about my current work on researching alternative ways of measuring impact in Learning Technology. See the first post, in which I set out the context of my work and what I am particularly focused on.

Alongside the practical work with the ALT Journal Strategic Working Group, I am pleased that my proposal of a short session, ‘The quality of metrics matters: how we measure the impact of research in Learning Technology’, has been accepted for ILTA’s Annual Conference in Carlow, Ireland later this month.

In the meantime, I have been doing more reading and research into innovative ways of measuring impact, and this time my work has come up against some very practical questions, not least because as a UK-based publisher we are in the process of ensuring that the journal’s operations comply with the incoming GDPR legislation. Open Source journal systems are not at the forefront of compliance, and like other independent publishers we are working as part of the community to move towards it.

At first glance factors like GDPR may not seem to be closely related to how impact is measured, but my thinking links them closely as a lot of the opportunities around developing the journal are dependent on technical solutions that have data processing implications:

A convincing alternative
Discussing how important having an impact factor is quickly runs into the question of what the alternative looks like. As well as the technical challenges in implementing innovative tools or mechanisms for measuring impact (to which the new GDPR legislation adds another level of complexity), the sustainability and longevity of both tool and data storage need to be examined. For example, introducing a tool like Altmetrics requires us to educate all stakeholders and ensure that the level of digital literacy required is not a barrier to making the tool useful. The user interface and experience need to be robust and practical, building confidence in alternative or innovative ways of measuring impact. With new tools and platforms being created all the time there is a certain amount of churn, and in order to really build a convincing alternative there needs to be a certain level of consistency.

Scrutiny of new vs. established ways of measuring impact
The kind of scrutiny with which we are examining alternative ways of measuring impact isn’t easily applied to the established method. There is a critical discourse, for example this recent blog post on the LSE impact blog, which argues:

Many research evaluation systems continue to take a narrow view of excellence, judging the value of work based on the journal in which it is published. Recent research by Diego Chavarro, Ismael Ràfols and colleagues shows how such systems underestimate and prove detrimental to the production of research relevant to important social, economic, and environmental issues. These systems also reflect the biases of journal citation databases which focus heavily on English-language research from the USA and north and western Europe. Moreover, topics covered by these databases often relate to the interests of industrial stakeholders rather than those of local communities. More inclusive research assessments are needed to overcome the ongoing marginalisation of some peoples, languages, and disciplines and promote engagement rather than elitism.

It’s really helpful to read this kind of perspective, but in my experience there is a strong sense that institutions and senior management place much importance on the established value of the impact factor. We have decided to carry out a consultation with stakeholders, but in the absence of a convincing alternative (which in our case we simply haven’t had time to implement as yet) I am not sure what we would be asking our stakeholders to compare or comment on. There is such a range of options being implemented by Open Access publishers that we can learn a lot from their example and work towards putting in place improvements that will help establish an alternative or a complementary perspective to the traditional impact factor.

Measuring beyond impact: peer review
Through our Editorial Board, the working group has now also begun to look at platforms like Publons, which promises to ‘integrate into the reviewer workflow so academics can track and verify every review and editorial contribution on the fly and in complete compliance with journal review policies’ (read more). It’s clearly a widely-used platform and some colleagues seem to be enthusiastic users, so it’s made me consider what this kind of platform could add to the user experience alongside innovative tools to measure impact. As a journal that does not charge any APCs, the value proposition for authors is clear, but resources to improve the experience of reviewers are limited. More work is needed in this area to examine whether enhancing the experience of peer review could complement our efforts to improve the ways in which impact is measured.


Read more (with thanks to everyone who’s sent me comments or links):

Information for publishers from DOAJ: 
DOAJ does not believe in the value of impact factors, does not condone their use on journal web sites, does not recognise partial impact factors, and advocates any official, alternative measure of use, such as article level metrics.

There is only one official, universally recognised impact factor that is generated by Thomson Reuters; it is a proprietary measure run by a profit-making organisation. This runs against the ethics and principles of open access and DOAJ is impact-factor agnostic. DOAJ does not collect metadata on impact factors. Displaying impact factors on a home page is strongly discouraged and DOAJ perceives this as an attempt to lure authors in a dishonest way.

Full information here.

#femedtech #OER18 #OER17… because equality matters for all of us


If you have been following the reporting on the gender pay gap in the UK, then this has been a sobering week indeed. You can search for the reports from different employers here. I have had a look through many of the education providers and sector bodies that I work with and the scale of the ‘gaps’ highlighted in some of the reports is staggering. Not a surprise, given my day to day experience of the sector, but still – staggering.

As a chief executive I have reflected much during this week on how we can change things across the system. There are so many aspects to the problem that there is definitely no shortage of things we should tackle and there is much to do in relation to the professionalisation of Learning Technology.

But on a more personal note this has also reminded me of how important it is that we continue to work towards achieving greater equality – in all its forms. So with a large international conference on openness in education just around the corner I hope that there’ll be much to learn and discuss from different, global perspectives. I also want to help give a voice to this conversation together with colleagues, and make sure that we consider equality in the context of openness.

Powerfully, Catherine Cronin spoke of criticality, equality and social justice at OER17 in London last year. In the closing plenary we were asked to respond to a call to action… #Iwill #OER17 and many participants in the room and on social media joined in, making their voices heard and sharing their aspirations, making a commitment to taking action. I think it’s time to renew our vows to take action #OER18.

Researching alternative ways of measuring impact in Learning Technology

New altmetric donut from https://www.altmetric.com/blog/the-next-generation-of-altmetric-donut-is-here/

Last year I worked on finding a sustainable new home for the Open Access journal Research in Learning Technology. As part of my work for ALT, this was the third transition I have worked on since 2008 and during this period I have contributed to the thinking around Open Access publishing in Learning Technology, often through ALT’s contribution to initiatives such as the 2012/3 ‘Gold Open Access Project‘. This year I will be working with a new group set up by ALT to steer the future development of the journal:

A new Strategic Journal Working Group to help steer the development of the journal now being published by ALT in partnership with Open Academia has been established, and we are grateful that representatives from other scholarly bodies who are publishing in a similar model have agreed to join the group to share best practice and support each other. The group is chaired by Prof Neil Morris, who also chairs the Editorial Board, and we are delighted to welcome colleagues from ascilite, ILTA and the OLC alongside our Editors.

As well as learning from each other, the group is going to be examining alternative ways of measuring impact (alternative to the established impact factor, which the journal has not been awarded to date). This is an area I am particularly interested in for three reasons:

Knowledge exchange happens elsewhere
Firstly, much of the most cutting-edge research and practice in Learning Technology is not published in formal journals. Even the most responsive Open Access peer-review system can’t necessarily keep pace with the quickly changing technology landscape we work in, and so less formal ways of knowledge exchange on blogs, on social media or in project reports are often more important and useful.

Different media
Secondly, a lot of the most interesting ideas may be shared as videos, drawings, data visualisations and so on, in short, they may not easily fit into the traditional formats measures like an impact factor were designed for. What we cite and where we link to can be harder to track. As we use new technologies to communicate and share information the way in which we cite/link to sources needs to adapt.

Crossing boundaries
Another aspect of what makes measuring impact interesting in Learning Technology is the way we cross boundaries of disciplines in research, policy and practice. Coming from a discipline like Anthropology, which has a hugely broad frame of reference depending on what you specialise in, I am still struck by the extent to which the work of Learning Technologists crosses boundaries.

So, keeping all this in mind, here is where I am in my work to research alternative ways of measuring impact…

I started with a blog post, DOAJ LAUNCHES THE DOAJ BEST PRACTICE GUIDE, which I came across as Research in Learning Technology was recently awarded the DOAJ best practice seal. It’s a useful new guide that provides a lot of helpful information to publishers, authors and policy makers interested in Open Access publishing. One of the resources it referred me to was a tool for authors called ThinkCheckSubmit. Whilst not specifically about how the impact of a journal is measured, it does ask authors to check the publisher’s information, for example how the journal is indexed or whether the publisher is a member of OASPA or COPE.

Also in a blog post, this time on the Altmetric website, I discovered that “the next generation of Altmetric donut is here!”. If you are new to altmetrics, here is how they explain what it’s all about:

Altmetrics are metrics and qualitative data that are complementary to traditional, citation-based metrics. They can include (but are not limited to) peer reviews on Faculty of 1000, citations on Wikipedia and in public policy documents, discussions on research blogs, mainstream media coverage, bookmarks on reference managers like Mendeley, and mentions on social networks such as Twitter.

Sourced from the Web, altmetrics can tell you a lot about how often journal articles and other scholarly outputs like datasets are discussed and used around the world. For that reason, altmetrics have been incorporated into researchers’ websites, institutional repositories, journal websites, and more.
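To make this concrete, here is a minimal sketch of how per-source attention counts might be tallied for a single article. The field names and numbers are hypothetical, loosely modelled on the kind of JSON an altmetrics provider returns, not a real API response:

```python
def summarize_altmetrics(record):
    """Tally attention counts from an altmetrics-style record.

    `record` maps an attention source (hypothetical field names) to the
    number of mentions an article received from that source.
    """
    total = sum(record.values())
    # Source with the highest count, or None for an article with no attention
    top = max(record, key=record.get) if record else None
    return {"total_mentions": total, "top_source": top}

# Hypothetical record for a single article
record = {"tweets": 42, "blog_posts": 3, "wikipedia": 1, "news": 2}
print(summarize_altmetrics(record))
# {'total_mentions': 48, 'top_source': 'tweets'}
```

In practice a record like this would be fetched from a provider such as Altmetric rather than typed by hand, but the principle is the same: counting and weighting mentions from many sources, rather than citations alone.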

Whilst I have been familiar with altmetrics for some time, I hadn’t actually come across the history in much detail, and I found it really helpful to visit http://altmetrics.org/manifesto/ and read up on some of the older posts. It gave me a better insight into the thinking that informed the development of the tools and policies involved. It also reminded me of the 2014/5 HEFCE publication “The Metric Tide”, which includes an executive summary, literature review and correlation analysis. Among its recommendations, the report states:

These recommendations are underpinned by the notion of ‘responsible metrics’ as a way of framing appropriate uses of quantitative indicators in the governance, management and assessment of research. Responsible metrics can be understood in terms of the following dimensions:

  • Robustness: basing metrics on the best possible data in terms of accuracy and scope
  • Humility: recognising that quantitative evaluation should support – but not supplant – qualitative, expert assessment
  • Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results
  • Diversity: accounting for variation by field, and using a range of indicators to reflect and support a plurality of research and researcher career paths across the system
  • Reflexivity: recognising and anticipating the systemic and potential effects of indicators, and updating them in response.

The recommendations outlined in the report apply mostly to HEIs, funders and government bodies. There are some however that are directly aimed at publishers. These are:

Publishers should reduce emphasis on journal impact factors as a promotional tool, and only use them in the context of a variety of journal-based metrics that provide a richer view of performance. As suggested by DORA, this broader indicator set could include 5-year impact factor, EigenFactor, SCImago, editorial and publication times. Publishers, with the aid of Committee on Publication Ethics (COPE), should encourage responsible authorship practices and the provision of more detailed information about the specific contributions of each author. Publishers should also make available a range of article-level metrics to encourage a shift toward assessment based on the academic quality of an article rather than JIFs. (Publishers)

Publishers should mandate ORCID iDs and ISNIs and funder grant references for article submission, and retain this metadata throughout the publication lifecycle. This will facilitate exchange of information on research activity, and help deliver data and metrics at minimal burden to researchers and administrators. (Publishers and data providers)
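For context, the journal impact factor that these recommendations ask publishers to de-emphasise is, at its core, a simple two-year ratio: citations received in a given year to items published in the previous two years, divided by the number of citable items published in those two years. A minimal sketch, with hypothetical numbers:

```python
def journal_impact_factor(cites_to_prev_two_years, citable_items_prev_two_years):
    """Two-year JIF: citations received in year Y to items published in
    years Y-1 and Y-2, divided by the number of citable items published
    in those two years."""
    return cites_to_prev_two_years / citable_items_prev_two_years

# Hypothetical example: 210 citations in 2017 to articles from 2015-16,
# which comprised 140 citable items
print(journal_impact_factor(210, 140))  # 1.5
```

The simplicity of the calculation is part of the critique: a single journal-level average says little about the quality or reach of any individual article.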

Interestingly, there are a number of recommendations for HEFCE and future REF exercises that, as far as I can tell, seem not to have been picked up, given the recent closure of HEFCE. Still, it is useful to revisit this report and its recommendations within the wider context of thinking about alternative ways of measuring impact.

I also came across an ebook that is new to me, published by Altmetric and Scholastica, entitled “The Evolution of Impact Indicators”. The whole publication looks extremely useful and has a lot of references that are relevant to my work, but the chapter I am particularly interested in is called “Beyond the Impact Factor”. It discusses a number of alternatives to the impact factor, including the EigenFactor and the H-Index. The H-index is probably the one I am most familiar with, but it’s also useful to remind myself of how it is tracked:

Google Scholar: Google Scholar provides the h index for authors who have created a profile.

Publish or Perish: Publish or Perish is a software program that retrieves and analyzes academic citations from Google Scholar and provides the h index among other metrics. Publish or Perish is handy for obtaining the h index for authors who do not have a Google Scholar profile.

Scopus: Scopus provides a Citation Tracker feature that allows for generation of a Citation Overview chart to generate an h index for publications and citations from 1970 to current. The feature also allows for removal of self-citations from the overall citation counts.

Web of Science: Web of Science allows for generation of the h index for publications and citations from 1970 to current using the “Create Citation Report” feature.
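All four services derive the same underlying number, so it is worth recalling how the h index itself is computed: an author has index h if h of their papers have at least h citations each. A minimal sketch (the function name is my own):

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has at
    least h papers with at least h citations each.

    `citations` is a list of per-paper citation counts, in any order.
    """
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        # The rank-th most-cited paper must itself have >= rank citations
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Five papers cited 3, 0, 6, 1 and 5 times: three papers have >= 3 citations
print(h_index([3, 0, 6, 1, 5]))  # 3
```

Note that the services above may differ slightly in practice because they index different citation sources, not because the formula differs.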

Now that I have started to refresh my memory of some recent developments, my next step will be to take this back to my desk, do some work on the journal itself and compare notes with my colleagues from the other publishers.

Read more:

Openness in education: a call to action for policy makers (cross-posted from Wonkhe)

Article on Wonkhe

I gratefully acknowledge Lorna Campbell, with whom I wrote this article, and David Kernohan for his editorial input. Read the full article on Wonkhe.

This week is Open Education Week, a global initiative led by the Open Education Consortium to raise awareness about free and open educational opportunities.

This year it is particularly important for Higher Education, as 2017 marked the anniversary of several groundbreaking initiatives that laid the foundations for what we now recognise as the open education movement. 2017 saw the 15th anniversary of the Budapest Open Access Initiative and of the release of the first Creative Commons licence, the 10th anniversary of the Cape Town Declaration, and the 5th anniversary of the UNESCO Paris OER Declaration; it was also the year that the new UNESCO OER Action Plan was launched.

Read the full article on Wonkhe.

Policy making heaven: a look back at Pasi Sahlberg’s OEB keynote

Pasi Sahlberg at OEB17

I have been working on some articles about effective education policy this week and that prompted me to look back at Pasi Sahlberg’s contribution (slides available here) to the Opening Plenary at last December’s OEB conference. It was an inspiring 20 minutes or so that combined hard-hitting policy insight with a global perspective from the Finnish expert and culminated in a sing-along that makes the YouTube video worth watching!

In his talk there was a clear juxtaposition between making “successful education policies for the future we don’t know” (with examples from the UK, US and Australia amongst others) and “shaping the future we want by making successful policies that create equitable public education for all”. Some of the hallmarks of these kinds of policies are that they award trust-based responsibility, encourage professionalisation, reward risk taking and creativity and that they create cooperation. Not a lot to get right, but a stark contrast to the familiar examples of market driven competition we are seeing every day.

Sahlberg explained how we can get to education policy heaven by achieving the right balance between excellence and equity. Thinking about that made me go back to the call for action for openness in education ALT published late last year. It shows how we could take forward the kind of ‘heavenly’ policy making Sahlberg advocates in all education sectors in the UK in a very practical way.

With the OER18: Open to All conference only a few months away, there is a lot of work ahead to try and build on the successes of last year, the Year of Open, and make this kind of change happen on a national scale.

Sharing my approach to leadership as an open practice

It’s been nearly a year since I wrote my first post on leadership as an open practice, inspired by the 2015 OER conference. So in this post I want to reflect on how my experiment is going, what progress I have made and what’s next.

Where it all began…
In April last year, I wrote: “I’d like to try and adopt open practice in my role and connect with others who do the same. Like teachers, researchers or developers who share their practice and resources openly, I’ll try to follow their example. To make my work, which is mostly about leadership, governance and management in Learning Technology, an open practice.”

Putting the experiment to the test
Since then, I took part in the #rhizo15 course/community and the #blimage challenge, and I have shared a number of conference presentations and blog posts about CPD, policy and current issues. I have been building and sharing my CMALT portfolio (specialist area: leadership as an open practice) and reflecting on different aspects of open practice. This blog has become a really helpful tool for engaging with different aspects of the work I do, sharing my thoughts and reflecting openly. It’s certainly prompted me to do more thinking in the open and has resulted in many conversations and comments that have been helpful and stimulating (thank you!). It’s also motivated me to engage with others’ blogs and outlets, reading and commenting or contributing in turn. Sharing the template for how I built my CMALT portfolio with Google Apps is another example of this approach in action. My original aim was to share, connect and engage more openly, and I think that aspect of my open practice has definitely developed.

Difficult aspects of leadership as an open practice
Although it has been hugely rewarding, leadership as an open practice has also been quite challenging. While I have certainly started to find more like-minded professionals in similar roles, there have been many more false leads, e.g. blogs that are more marketing than sharing, open-sounding practice that leads to paywalls, and a definite reluctance to connect beyond networking for fear of losing some sense of being ahead, of having the edge over others in leadership roles. At times when political or economic turmoil threatens funding or jobs, open practice seems to become a lot more difficult and far less popular for people in roles similar to mine.

It has also been difficult at times to manage different aspects of my practice when my ‘day job’ as a CEO comes into contact with other work I do. When I contribute to a discussion or a Twitter chat I try and make it clear whether I am representing the organisation I work for or whether I am participating in a less formal capacity, but it’s not always easy to make these distinctions. On the other hand, there are real advantages to having the chance to get involved with research or practice in a more hands-on way, and it helps me be better at the work I do as a CEO.

With managing different identities also comes being a woman and a leader in Learning Technology, and this is probably where my experiment has delivered the most rewarding examples and connections. Through a wealth of media I have become more familiar with the work other women do to drive forward technology in learning and teaching, from writers and IT Directors to CEOs and teachers both younger and more experienced than me. While in my day-to-day experience there is still a long way to go to achieve equality for women decision makers in government, industry or funding bodies, my growing network makes me feel hopeful.

Takeaways
So, one year on, what are my takeaways from this experiment in leadership as an open practice? Here goes:

  1. Will I continue? Yes! It’s been such a rewarding, stimulating and challenging experience that I will definitely keep going;
  2. What’s the best bit? The freedom that an open approach helps me establish, the prompts to follow whatever I was curious about, and the generous feedback from peers;
  3. What’s the worst bit? At times, the lack of peers in comparable job roles who are interested in open practice;
  4. What’s next? On a practical front, more #rhizo16 this year, some opportunities to speak at events or contribute to other projects, making more of an effort to communicate and connect with others… and hopefully to become better at leadership as an open practice.

Your thoughts?
Over the past year I have had many comments/conversations prompted by blog posts or tweets and it’s been extremely helpful. So if you have any comments or feedback on my approach to leadership as an open practice or your own experience, share it below or tweet me @marendeepwell.

#OpenEducationWk: Openness for eternity?

Open Education Week 

It’s #OpenEducationWk and I’ve been inspired by activities and blog posts from across the community, including a special edition of the #LTHEchat (helpful intro here) and a number of webinars organised by the ALT Open Education Special Interest Group (including a preview today of the OER16 Open Culture conference coming up in April). Seeing so much commitment to and enthusiasm for scaling up open practice and resources has been a joy – but it’s also made me think about sustainability in the long term.

Many of us have plenty of problem solving to do right here, right now – and planning for the long term is not often a top priority. When it comes to creating open educational resources or sharing open practice, it can be hard enough to do in the first place without thinking about how sustainable a particular piece of work might be when someone comes across it in 10 or 20 years. One of the immediate benefits of sharing something can be the feedback from colleagues, the conversation and knowledge exchange it stimulates, and the connections we build through them. So our networks grow bigger and stronger and become more sustainable.

But what about the resources we share? What about them in the long run? In Learning Technology in particular many are fond of big sweeping statements (see Audrey Watters’ Hack Education project for an eye-opening reality check) that make it sound as if a project or initiative solves a particular problem once and for all (and for everyone). Hyperbole along the lines of “no more textbooks – EVER” or “the END of the universities” makes it sound like we operate only a stone’s throw away from #edtech nirvana over the horizon. But without the right metadata, without considering interoperability, without updating and re-sharing things, much of what we create remains useless to others. Enabling others to find and make sense of resources or assess their usefulness in their context is challenging, while looking for what you need often leads to broken links, missing licences, taxonomies that only make sense to those who designed them and repositories that have long since fallen into disuse.


Institutional structures, if they support openness, can help with some of these issues while we have access to them. But when jobs change, people move on or institutions evolve, these internal structures can become inaccessible. Like the huge aircraft boneyards that become material metaphors of the age of air travel, our open landscape has its own spaces where all the dead OERs reside. When the lifespan of open resources is so limited, the investment they represent also has limited benefit. Particularly when it comes to publicly funded resources, there is a lot more we could do to ensure that what funding there is has an impact beyond its immediate beneficiaries.

This is why policy is so important. The work Creative Commons is leading in the US, for instance (here is further info about their #GoOpen campaign), and the work of the Open Education Consortium help create robust ways to enable open practice and to create and share open resources at scale – to ensure sustainability in the long term. When institutions embrace openness, like the University of Edinburgh has done recently by adopting a new OER policy, they bring us a step closer to making openness sustainable in the long term.

Which brings me back to where I started: openness for eternity? How can we make it work in the long term? Make sharing openly sustainable, scalable and useful? From the global movement via national and institutional policies to individual practice, it is a formidable undertaking. Weeks like this, #OpenEducationWk, show that we are making progress.