All posts by scott101

Going Green: a year of supporting the HEFCE OA Policy

A year after HEFCE linked the mandatory deposit of accepted manuscripts to the allocation of QR funding, this post describes how the Library’s Scholarly Communications Team has helped academic colleagues adapt to the new ruling, resulting in an unprecedented level of Green OA deposits to our repository.

As we entered 2016, developing support services for the incoming policy was the team’s highest priority. The biggest obstacle was a very low Green OA baseline: from 2013 to early 2016, the number of Manchester-authored papers available through pure or hybrid Gold OA journals rose steadily, yet annual Green OA deposits to our repository stalled at ten per cent of the institution’s journal article output over the same period.

During a pilot the previous year, we tested a variety of support models with selected schools and found that compliance was highest when the burden on authors was minimal and the Library offered to deposit manuscripts and set embargoes on their behalf. So in February, we proposed a fully mediated service, which was accepted by the University’s senior research governance committee. The task then was to reorient our existing Gold APC service to encompass Green OA workflows that would scale, enabling our team of six staff to support the deposit of ~6,000 papers a year.

To allow authors to send us their manuscripts we created an authenticated deposit form (branded as the Open Access Gateway) with just two input fields: acceptance date and journal/proceedings title, plus a drag and drop area to attach the file. Authors could also use the form to request Gold OA payment if they acknowledged a grant from a qualifying funder.
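The deposit rule behind all of this lends itself to a short sketch. Assuming a three-month window from acceptance (as the HEFCE policy required, with some early flexibility), a minimal compliance check might look like the following – the function names and the clamping behaviour at month ends are our illustrative choices, not production code:

```python
import calendar
from datetime import date

def deposit_deadline(acceptance: date, window_months: int = 3) -> date:
    """Last compliant deposit date: `window_months` calendar months after
    acceptance, with the day clamped to the target month's length."""
    month_index = acceptance.month - 1 + window_months
    year = acceptance.year + month_index // 12
    month = month_index % 12 + 1
    day = min(acceptance.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def is_compliant(acceptance: date, deposited: date) -> bool:
    """A deposit is compliant if made on or before the deadline."""
    return deposited <= deposit_deadline(acceptance)

# A paper accepted on 15 April 2016 must be deposited by 15 July 2016.
assert deposit_deadline(date(2016, 4, 15)) == date(2016, 7, 15)
assert is_compliant(date(2016, 4, 15), date(2016, 7, 1))
assert not is_compliant(date(2016, 4, 15), date(2016, 7, 16))
```

Capturing the acceptance date at the point of deposit is what makes this check possible, which is why it is one of the form’s two required fields.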

In the months before the policy launched we worked closely with Library marketing colleagues to deliver a communications campaign which raised awareness of the policy and built momentum behind the new service. Our message to academics was, ‘Just had a paper accepted? Now make sure it’s REF eligible.’

OA A5 flyer cover

In April the policy launched, our deposit form went live, and the firehose of the University’s publications opened. Early signs were promising: in the first weeks we received roughly ten manuscripts per working day, a significant increase on our baseline. However, more persuasion was needed for those authors across campus who weren’t sending us their manuscripts. We therefore began chasing these authors by email, sending a follow-up copied to a senior administrator if we received no response.

We particularly focussed on non-compliant papers with the highest altmetric scores, which had an increased likelihood of being selected for REF. The chaser emails were effective, and many authors replied immediately with the manuscript attached. Of course, our emails also prompted some authors to ask questions or offer opinions about the policy, which required additional resourcing.

Sending chaser emails significantly raised the institution’s compliance rates, and it was clear that we would need to continue doing this systematically as awareness of the policy gradually spread across campus. This placed additional strain on the team, as searching Scopus for papers and chasing authors proved an intensive process. We explored alternatives (e.g. using the ORCID API to identify our new papers), but no automated process was as effective as the painstaking manual check of Scopus search results.
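As a sketch of the ORCID route we explored: the public ORCID API exposes a search endpoint that can be queried by affiliation, and iDs can then be pulled from the results. The URL building below targets the real public API, but the response shape shown is a simplified assumption and the field names should be checked against the current API documentation:

```python
import urllib.parse

PUB_API = "https://pub.orcid.org/v3.0/search/"

def affiliation_search_url(org_name: str, rows: int = 100) -> str:
    """Build a public ORCID search URL for records naming an affiliation."""
    query = f'affiliation-org-name:"{org_name}"'
    return PUB_API + "?" + urllib.parse.urlencode({"q": query, "rows": rows})

def extract_ids(search_response: dict) -> list:
    """Pull ORCID iDs out of a (simplified) JSON search response."""
    return [hit["orcid-identifier"]["path"]
            for hit in search_response.get("result", [])]

# A trimmed-down example of the kind of response the endpoint returns:
sample = {
    "num-found": 2,
    "result": [
        {"orcid-identifier": {"path": "0000-0002-1825-0097"}},
        {"orcid-identifier": {"path": "0000-0001-5109-3700"}},
    ],
}
```

In practice, each returned iD would still need its works checked and matched against our Scopus results, which is where the manual effort crept back in.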

By August we’d refined our reporting to the University to include school/division level compliance against projections. To achieve this we recorded the compliance status and author affiliations of every single University paper falling within the scope of the policy in a masterfile spreadsheet. We then used SciVal to calculate the average output of the REF-eligible staff from each of the 33 schools/divisions over the past five years. This enabled us to project how many accepted papers we would expect each school/division to have per month during the first twelve months of the policy. Every month we produce a compliance report, like the one below, which supports standing agenda items at research governance committee meetings across the University.
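The projection arithmetic itself is straightforward; a minimal sketch with made-up numbers (not our SciVal figures):

```python
def projected_monthly_papers(five_year_output: int) -> float:
    """Spread a school's five-year output evenly across sixty months."""
    return five_year_output / (5 * 12)

def pct_compliant_vs_projection(compliant: int, projected: float) -> float:
    """Compliant papers as a percentage of the projected total."""
    return 100 * compliant / projected

# A school that produced 600 papers over five years is projected
# 10 accepted papers per month...
assert projected_monthly_papers(600) == 10.0
# ...so 8 compliant papers in a month is 80% against projection.
assert pct_compliant_vs_projection(8, 10.0) == 80.0
```

Comparing each school’s recorded compliant papers against this projected figure is what lets the monthly report flag under-reporting schools, not just non-compliant ones.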

 

Monthly REF OA Faculty/School compliance report

As we moved into the new academic year, monthly deposits continued to rise. The team were at maximum capacity processing incoming manuscripts, so to speed up the flow of papers through our assembly line we purchased a licence for the online forms service Typeform and developed a back-office ‘If-This-Then-That’ style form. This allowed us to distil esoteric funder and publisher information into a simple workflow, enabling papers to be processed with reduced input from higher-grade staff.

Now, twelve months on from the introduction of probably the most swingeing OA policy yet seen, we can take stock of how well we have adapted to the new ruling. In the absence of a Snowball-type metric for measuring our compliance, we currently track it in three ways:

  • % top papers compliant: In January, the University ran its annual Research Review Exercise, in which research staff proposed their best post-2014 papers for internal review. This provided the first opportunity to gauge the compliance of those papers with the highest chance of being returned in the next REF. During the exercise, a total of 1,360 papers within the scope of the policy were proposed for review; of these, a very encouraging 95% were compliant with the new OA requirements.
  • % compliant against projections: Our chosen metric for reporting compliance to the University. We report the proportion of compliant papers with at least one REF-eligible author against the total number of papers we would expect all our REF-eligible staff to have had accepted. Against this measure, 68% of papers are compliant, 7% are non-compliant, and 25% are not currently recorded. Many of the unrecorded papers will not yet be published, so we will chase them once they are indexed in Scopus. A large number of our papers arising from the ATLAS/LHCb collaborations are also available via Gold OA and are compliant, but we have not yet recorded them in our masterfile.
  • % compliant overall: To date, we’ve recorded 4656 papers of which 4031 (87%) are compliant, 459 (10%) are not compliant, and 166 (3%) are being actively chased by our team.
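The overall breakdown is just a tally over the masterfile. A sketch of that tally – the record structure and status labels here are illustrative, not our actual spreadsheet columns:

```python
from collections import Counter

def compliance_breakdown(records):
    """Count masterfile-style records by status and report each status
    as (count, percentage of total)."""
    counts = Counter(r["status"] for r in records)
    total = len(records)
    return {status: (n, round(100 * n / total))
            for status, n in counts.items()}

# A toy masterfile with the same proportions as the figures above:
masterfile = ([{"status": "compliant"}] * 87
              + [{"status": "non-compliant"}] * 10
              + [{"status": "chasing"}] * 3)

assert compliance_breakdown(masterfile)["compliant"] == (87, 87)
```

Because the masterfile also records author affiliations, the same tally can be grouped by school or division to feed the monthly report.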

In total there’s a 55% Green OA/45% Gold OA split, and given that Green OA represents more inconvenience than most of our academic colleagues unfamiliar with arXiv have ever been willing to tolerate, it is very unlikely that the University would have achieved such high compliance had the Library not provided a mediated Green OA deposit service. The data suggests our approach helped make Green Open Access an organisational habit practically overnight.

The approach has come at a cost, however: over the past year, supporting the HEFCE OA policy has taken up the majority of the team’s bandwidth, with most of our 9am–5pm conversations relating in some way to a paper’s compliance with one or more funder OA policies.

Now that our current processes have bedded in, and in anticipation of the launch of the new UK Scholarly Communications Licence (UK-SCL) – for more on this, read Chris Banks’s article or watch her UKSG presentation – and further developments from Jisc, we hope that over the next 12 months we can tilt the balance away from this reductionist approach to our scholarly output and focus on other elements of the scholarly communication ecosystem. For example, we are already in discussions with Altmetric about incorporating their tools into our OA workflows to help our academics build connections with audiences, and early conversations suggest this is something academics will welcome.

Whatever lies in store, it’s sure to be another busy year for the team.

Seven tips for promoting ORCIDs to busy academics

Administrators of research are sold on the potential benefits of ORCIDs – but the ultimate success of the identifier relies on buy-in from academics too. Global ORCID take-up is rising but hasn’t yet reached the tipping point at which significant benefits start to accrue. Until then, it’s fair to say that advocacy is still needed to convince many academics of what’s in it for them.

The pipelines for ORCIDs to flow in and out of the world’s research management databases are under construction. New integrations with institutional, publisher, and vendor systems come online all the time and soon the infrastructure will be in place to enable serious improvements to the way research is administered.

Meanwhile, it’s increasingly difficult for academics to avoid claiming an ORCID while continuing to publish the findings of their funded research. At Manchester, for example, we’ve achieved high levels of uptake amongst our REF-eligible academics by making it a requirement of a recent research assessment exercise.

REF-eligible academics with ORCIDs (September 2015)

REF-eligible academics with ORCIDs (December 2015)

But whilst most of our academics now have an identifier, some we’ve spoken to are unconvinced that investing further time to link their activities to their ORCID record is worthwhile right now. To address this, a Library-led project is rolling out a communications campaign highlighting the benefits of ORCIDs and dispelling myths around their use.

With generous but finite resources, the ORCID team is attempting to solve the author name ambiguity problem for the entire world, and relies on this type of advocacy initiative at the institutional level to maintain momentum. So, based on our early experiences, here are our seven tips if you or your team are tasked with developing a communications plan to support your own institution’s ORCID strategy.

1. Protect time for desk research

Perhaps obvious, but before tapping up contacts for invitations to committee meetings clear an afternoon to immerse yourself in all things ORCID (the FAQs page on the ORCID website is a great place to start). On the face of it ORCIDs are a simple concept, but beneath the surface are more esoteric nuances. For example try answering the question, “Is the ORCID registry a single-point-of-truth for all of an academic’s research activities?” and see where this leads you.

2. Perfect your ORCID elevator pitch

When the invitations come in you might only have five minutes on a packed school board agenda so get your key messages clear in your own head first. Try your slides out on a friendly audience to get it perfect. It’s easy to get tied up in knots explaining abstract concepts like ‘round-tripping’ so the more preparation the better.

3. Tailor your message to engage your audience

You may need to spend extra time on background for a humanities audience, where (at least at Manchester) awareness levels are lower than in the STEM disciplines. Also be sure to highlight key developments of particular relevance to the audience. For example, biomedicine researchers will be interested in the Wellcome Trust mandating the inclusion of ORCIDs in grant applications; physics/maths/computer science academics may be interested in arXiv’s use of ORCIDs to replace the internal arXiv author ID; and the MLA International Bibliography integration will be of most interest to humanities researchers.

4. Make the problem real

Asking if anyone shares their name with another academic is a good way to make the name ambiguity problem real. Invariably at least one hand goes up and you sometimes get an interesting anecdote too. It’s a good way to break the ice and prompts those in the room to acknowledge the problem.

5. Be honest!

Get ahead of the accusations that this is just for the ‘bean-counters’ by being up-front about the benefits to administrators – in many ways these benefits are easier to articulate anyway. And if you can sense that you’re not winning the argument, don’t be afraid to say, “Look, this is going to be 15 minutes of your life that you’re not getting back – it’s a question of when, not if, you do this, because it isn’t going away.”

6. Emphasise the registry

The social network style ORCID profile page has created misconceptions that ORCID is just another social network. We’ve heard comments along the lines of, “I already have Academia.edu and LinkedIn – I don’t need another site to keep up to date”. It’s important to stress that ORCID is first and foremost a registry allowing data to be transferred between these types of systems, ultimately reducing the keystroke burden to the academic.

7. Anticipate the tricky questions

And finally, try to anticipate the tricky questions, whether technical (e.g. “Who has access to the API?”) or more philosophical (e.g. “This sounds a bit ‘Big Brother’ to me”), and have answers prepared. However much you prepare, you won’t be able to anticipate every question. For example, following one presentation in which we quoted the fact that in China the top three family names (Li, Wang, and Zhang) account for 21% of the population (nearly 300 million people), one academic remarked in exasperation, “What’s wrong with just using my name?!”


Altmetric for Institutions

Implementing Altmetric at our institution: Manchester

Altmetrics offer new and powerful ways to track research impact and engagement beyond traditional methods like citation counts and impact factors.

Read about our experiences as an early adopter of the ‘Altmetric for Institutions’ application on the Altmetric blog.

And if you’re on campus then check out the application by visiting Altmetric Explorer.

Northern Collaboration Conference 2014

Northern Collaboration Conference 2014

Stephen and I about to deliver our presentation.

Last week Stephen and I were very happy to be invited to talk at the Northern Collaboration Conference in Darlington about the development of our new Citation Services at the University of Manchester.

Northern Collaboration is a group of university libraries in the north of England with the aim of establishing closer collaboration in the development and delivery of library services; this was the second year the conference had been held.

There was plenty of interest in our breakout session as we talked about the work The University of Manchester Library does in providing citation benchmarking analyses, standard reporting, and training workshops. The session might not have been so well attended had there been prior warning about the test we gave attendees halfway through the presentation. Thankfully, everyone passed!

The event was a great opportunity to showcase the Library’s achievements but it was also really good to meet colleagues in other academic libraries and learn more about some of the innovative developments across the sector.

See the Northern Collaboration Conference 2014 Twitter backchannel for a flavour of the day.


Library trialling new ‘Altmetric for Institutions’ application

Logo of Altmetric LLP

The Library is working in partnership with Altmetric LLP to trial their new ‘Altmetric for Institutions’ application. We’re really excited to be one of the first universities to have access to the platform, which provides powerful insights into the online attention that our researchers’ outputs attract.

So far we’re getting lots of positive feedback, which we’ll share when we write up our experiences of the trial later in the year.

In the meantime, if you’re a member of staff or a student at The University of Manchester and would like to take part in the trial please get in touch to find out how.