Category Archives: Report

A Research Data Librarian’s experience of OpenCon2017

Photo: R2RC.org, CC-0

After following and participating in the OpenCon Librarian calls for much of the last year I was delighted to win a partial scholarship to OpenCon 2017. The monthly calls had raised my awareness of the variety of Open Access, Education and Data initiatives taking place elsewhere and I was keen to learn more about others’ advocacy efforts with students, librarians, policy makers, social entrepreneurs and researchers from around the world.

Too often when discussing Open Access and Data it seems that researchers, librarians and policy makers are at separate conferences and having separate conversations; so it is great that OpenCon brings together such a diverse group of people to work across national, disciplinary and professional boundaries. Thus I was very excited to arrive in Berlin for a long weekend working with a dedicated group of advocates on how to advance Open Research and Education.

The weekend started with a panel of inspiring early career professionals discussing the initiatives they are working on, which showed the many different levels at which it is possible to influence academic culture. These included Kholoud Al Ajarma’s work enabling refugee children to tell their stories through photography, the Bullied into Bad Science campaign which supports early career researchers in publishing ethically, Robin Champieux’s efforts to effect grassroots cultural change, and research into how open science is (or is not!) being incorporated into Review, Promotion and Tenure in American and Canadian universities. Learning about projects working at the individual, institutional and national level was a great way to get inspired about what could be achieved in the rest of the conference.

Photo: R2RC.org, CC-0

This emphasis on taking practical action was a theme of the weekend; OpenCon is not an event where you spend much time listening! After sharing how we all came to be interested in ‘Open’ during the stories of self on Saturday afternoon, we plunged into regional focus groups on Sunday, working on how we can effect cultural change as individuals in large institutions.

The workshops used design thinking, so we spent time thinking through the goals, frustrations and preoccupations of each actor. This meant that when we were coming up with strategies for cultural change they were focused on what is realistic for the people involved rather than reaching for a technical solution with no regard to context. This was a great chance to talk through the different pressures facing researchers and librarians, understand each other’s points of view and come up with ways we can work in alliance to advocate for more openness.

During the do-athon (think a more inclusive version of a hackathon) I spent much of my time working with a group led by Zoe Wake Hyde looking at Open Humanities and Social Sciences, which was born out of one of the unconference sessions on the previous day.

When discussing Open Research, and particularly Open Data, the conversation is frequently geared towards the type of research and publishing which occurs in the physical sciences and so solutions do not take account of the challenges faced by the Humanities and Social Sciences. These challenges include a lack of funding, less frequent publishing which puts more pressure on each output, and the difficulties of making monographs Open Access. Often at conferences there are only a couple of us who are interested in the Humanities and Social Sciences so it was great to be able to have in depth discussions and start planning possible actions.

During the initial unconference session we talked about the differences (and potential conflicts) between Digital Humanities and Open Humanities, the difficulties in finding language to advocate effectively for Open in the Humanities, and the difficulty of sharing qualitative social sciences data. It was reassuring to hear others are having similar difficulties in getting engagement in these disciplines and, whilst trying to avoid it turning into a therapy session, to discuss how we could get Humanities and Social Sciences to have a higher profile within the Open movement. It was by no means all discussion and, true to stereotype, several of our group spent the afternoon working on their own, getting to grips with the literature in this area.

It was inspiring to work together with an international group of early career researchers, policy makers and librarians to get from an initial discussion about the difficulties we are all facing to a draft toolkit for advocates in little over 24 hours. Our discussions have continued since leaving Berlin and we hope to have a regular webchat to share best practice and support each other.

Whilst getting involved with practical projects was a fantastic opportunity, my main takeaway from the weekend was the importance of developing a wider and more inclusive perspective on Open Research and Education. It is easy to lose sight of these broader goals when working on these issues every day, getting bogged down in funder compliance, the complications of publisher embargoes and the technical difficulties of sharing data.

The Diversity, Equity and Inclusion panel focused on the real world impact of openness and the importance of being critical in our approaches to openness. Denise Albornoz spoke powerfully on recognising the potential for Open Research to perpetuate unequal relationships across the world with wealthy scientists being the only ones able to afford to publish (as opposed to being the only ones being able to afford to read the literature) and so silencing those in developing countries. Tara Robertson highlighted the complicated consent issues exposed through opening up historic records, Thomas Mboa focused on how Open Access prioritises Western issues over those important in Africa, and Siko Bouterse spoke about the Whose Knowledge project which campaigns on ensuring knowledge from marginalised communities is represented on the Internet.

This panel, much like the whole of OpenCon, left me reflecting on how we can best advance Open Access and Open Data and re-energised to make a start with new allies from around the world.

Photo by: Rachael Ainsworth, License: CC-BY

An Astronomy Open Science Champion’s experience of OpenCon2017

Advocating for openness in research is a big part of the work we do in the Library’s Research Services team. Trying to win the hearts and minds of skeptical researchers can be a challenge, but increasingly we find that we are having conversations with researchers who are themselves advocates for open research. Facilitating the development of a network of open champions across campus is something we’re keen to do more of, and two recent examples of this work are holding an Open Research Forum in Open Access Week and funding Rachael Ainsworth, an Early Career Researcher, to attend OpenCon2017. To do our job well we also need to be involved in developments and discussions, so we were delighted that Rosie Higman, a member of our team, won a sponsored place at OpenCon2017. Read about Rachael’s experience at OpenCon here and come back to read about Rosie’s later in the week…


Hello! I am Rachael Ainsworth, a Research Associate in Radio Astronomy at the Jodrell Bank Centre for Astrophysics (JBCA) here within the University of Manchester. I am the Open Science Champion in my department where I advocate, give presentations and organise events relating to Open Science in Astronomy. I am also in the current cohort of Mozilla Open Leaders, working on the project Resources for Open Science in Astronomy (ROSA): an Open Science how-to kit for astronomers to help them research openly from proposal to publication. Are you running or starting an open project and want to grow as an open leader? Apply now for the next round of Mozilla Open Leaders! You can view my application for Round 4 on my GitHub here 🙂

Photo by: R2RC, License: CC0, Edited by: Rachael Ainsworth

I applied to attend OpenCon 2017 to be inspired by and network with other pioneers of the Open Movement. There were thousands of applicants for this year’s event from over 175 countries, but there were only a few hundred places at the conference to represent our global community. I was waitlisted to attend based on my main application (which you can read on my GitHub here along with the response from the OpenCon 2017 Organising Committee). This was pretty good considering the odds, but I was still gutted. However, I was lucky enough to see that the University of Manchester Library was holding a competition to sponsor a student or staff member to attend. I therefore remixed my main application to answer the University of Manchester-specific questions (which you can read on my GitHub here) and submitted it to the competition. I was very happy when it was announced that I won the sponsored place!  

I arrived at OpenCon ready to dive into the challenges still facing Open, collaborate and brainstorm actionable solutions – big and small. I gained a lot through the European regional workshop – How might we help individuals shape the culture around them in a university? We broke into groups to establish personas/stakeholders associated with our workshop topic, we considered their pains and gains, and brainstormed potential solutions to the challenges they face. I worked in a group focusing on the persona of a 30-something year old researcher, discouraged by toxic culture in academia and seeking allies to make it a more open and inclusive environment. You could say her challenges resonated with me 🙂

As a larger group, we voted on which problems/challenges we wanted to discuss further in the second half of the workshop. We then broke into new groups based on the topics we wanted to work on, and I chose the group addressing “How might we tackle time issues?” as many researchers perceive that open science practices will involve extra time and effort without much reward. It turns out that a how-to kit and templates could be a good solution to this problem. As a result, I have met enthusiastic people to collaborate with on my Mozilla Open Leadership project, ROSA.

Since I knew I would be writing a blog post to reflect upon my OpenCon experience, I participated in the Unconference session: “How can openness be advanced with podcasting, blogging and other DIY media?” I am not a natural when it comes to blogging, vlogging, podcasting or whatever the kids are doing these days, so I went to this session to learn from those that are. We discussed how to be more effective science communicators through Open Media, and joined together to form the OpenComm Network, a group to share resources, best practices, and openly licensed content to support science communication based on our various backgrounds and expertise.

During the Do-a-thon sessions on Day 3 of the conference, the OpenComm Network collaborated to record a podcast and write a blog post around Open Media and our OpenCon experience. We set up a mini recording studio in the cloakroom for interviews and answered prompts such as: What does Open Media mean to you? What are the challenges to communicating about Open issues? How would you describe your experience at OpenCon?

Photo by: Rachael Ainsworth, License: CC-BY

We then transcribed the interviews, edited the recordings, and re-wrote the transcriptions into content for the blog. Because we only had a few hours for the Do-a-thon, we ran out of time to complete our goal, but you can hear version 0.1 of our podcast here and read version 0.1 of our blog post here. We hope to have full version 1.0s at some point, but I quite like that this session resulted in a demonstration of Open Media and collaboration in progress! In the meantime, you can hear my interview here 🙂

Photo by: R2RC, License: CC0, Edited by: Rachael Ainsworth

The most impactful session/moment of OpenCon 2017 was hands down the Diversity, Equity and Inclusion panel. I won’t write too much about it here, because you absolutely need to watch and listen to it for yourself (skip to 7:47:00):

 

Photo by: R2RC, License: CC0, Edited by: Rachael Ainsworth

Through their stories, the panelists reminded us to stay critical, to pay careful attention to who is missing from the room and to who is writing policy/history, and to deliberately collaborate with underrepresented communities. I was moved to tears, and after three standing ovations for this session I was eager to return to Manchester to turn the insights gained into action.

Photo by: R2RC, License: CC0, Edited by: Rachael Ainsworth

I cannot thank the University of Manchester Library enough for sending me to OpenCon 2017, and I am looking forward to working closely with them to advocate for openness across our campus and encourage researchers to take advantage of the resources and training available through the Library’s services. Next up: collaborating with Research Data Management to conduct training as part of the JBCA Autumn Computing Sessions (JACS) in December, to train postgraduate students on best open data practices!

 

How effective is your RDM training?

We are involved in an international collaborative project to assess the quality of Research Data Management training across institutions. This post reports on the progress of the project so far; it originally appeared on the project blog on 6th October 2017.

When developing new training programmes, one often asks oneself a question about the quality of training. Is it good? How good is it? Trainers often develop feedback questionnaires and ask participants to evaluate their training. However, feedback gathered from participants attending courses does not answer the question of how good the training was compared with other training on similar topics available elsewhere. As a result, improvement and innovation become difficult. So how can the quality of training be assessed objectively?

In this blog post we describe how, by working collaboratively, we created tools for objective assessment of RDM training quality.

Crowdsourcing

In order to objectively assess something, objective measures need to exist. Being unaware of any objective measures for benchmarking a training programme, we asked Jisc’s Research Data Management mailing list for help. It turned out that a lot of resources with useful advice and guidance on creating informative feedback forms were readily available, and we gathered all the information received in a single document. However, none of the answers received provided us with the information we were looking for. To the contrary, several people said they would be interested in such metrics. This meant that objective metrics to assess the quality of RDM training either did not exist, or the community was not aware of them. Therefore, we decided to create RDM training evaluation metrics.

Cross-institutional and cross-national collaboration

For metrics to be objective, and to allow benchmarking and comparisons of various RDM courses, they need to be developed collaboratively by a community willing to use them. Therefore, the next question we asked Jisc’s Research Data Management mailing list was whether people would be willing to work together to develop and agree on a joint set of RDM training assessment metrics and a system which would allow cross-comparisons and training improvements. Thankfully, the RDM community tends to be very collaborative, and this was the case here too – more than 40 people were willing to take part in this exercise and a dedicated mailing list was created to facilitate collaborative working.

Agreeing on the objectives

To ensure effective working, we first needed to agree on common goals and objectives. We agreed that the purpose of creating the minimal set of questions for benchmarking was to identify what works best in RDM training. We worked with the idea that this was for ‘basic’ face-to-face RDM training for researchers or support staff, but that it could be extended to other types and formats of training session. We reasoned that the same set of questions used in feedback forms across institutions, combined with sharing of training materials and contextual information about sessions, should facilitate the exchange of good practice and ideas. As an end result, this should allow constant improvement and innovation in RDM training. We therefore had joint objectives, but how could we achieve them in practice?

Methodology

Deciding on common questions to be asked in RDM training feedback forms

In order to establish joint metrics, we first had to decide on a joint set of questions that we would all agree to use in our participant feedback forms. To do this we organised a joint catch-up call during which we discussed the various questions we were asking in our feedback forms and why we thought these were important and should be mandatory in the agreed metrics. There were lots of good ideas and valuable suggestions. However, by the end of the call, and after eliminating all the non-mandatory questions, we still had a list of thirteen questions which we thought were all important. This was too many to ask participants to fill in, especially as many institutions would need to add their own institution-specific feedback questions.

In order to bring down the number of questions to be made mandatory in feedback forms, a short survey was created and sent to all collaborators, asking respondents to judge how important each question was (on a scale of 1-5, with 1 being ‘not important at all that this question is mandatory’ and 5 being ‘this should definitely be mandatory’). Twenty people participated in the survey. The total score received from all respondents was calculated for each question, and the six questions with the highest scores were selected to be made mandatory.
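The selection step is simple enough to script. Here is a minimal Python sketch, using purely illustrative question labels and ratings rather than our real survey responses, of how the totals and the top-six cut might be calculated:

```python
# Minimal sketch: sum each question's 1-5 importance ratings across
# respondents and keep the six highest-scoring questions as mandatory.
# The question labels and ratings below are illustrative only.
from collections import defaultdict

responses = [
    {"Q1": 5, "Q2": 3, "Q3": 4, "Q4": 2},
    {"Q1": 4, "Q2": 5, "Q3": 2, "Q4": 5},
    # ... one dict per respondent
]

totals = defaultdict(int)
for response in responses:
    for question, rating in response.items():
        totals[question] += rating

# Rank questions by total score and take the top six
mandatory = sorted(totals, key=totals.get, reverse=True)[:6]
print(mandatory)
```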

Ways of sharing responses and training materials

We next had to decide on the way in which we would share feedback responses from our courses, and the training materials themselves. We unanimously decided that the Open Science Framework (OSF) supports the goals of openness, transparency and sharing, and allows collaborative working, and is therefore a good place for this. We created a dedicated space for the project on the OSF, with separate components for the joint resources developed, for sharing training materials and for sharing anonymised feedback responses.

Next steps

With the benchmarking questions agreed and the space created for sharing anonymised feedback and training materials, we were ready to start collecting the first feedback for the collective training assessment. We also thought that this was a good opportunity to re-iterate our short-, mid- and long-term goals.

Short-term goals

Our short-term goal is to revise our existing training materials and incorporate the agreed feedback questions into RDM training courses starting in autumn 2017. This would allow us to obtain the first comparative metrics at the beginning of 2018 and to evaluate whether the methodology and tools we designed are working and fit for purpose. It would also allow us to iterate over our materials and methods as needed.

Mid-term goals

Our mid-term goal is to see if the metrics, combined with shared training materials, could allow us to identify the parts of RDM training that work best and to collectively improve the quality of our training as a whole. This should be possible in mid/late 2018, allowing time to adapt training materials in light of the comparative feedback gathered at the beginning of 2018 and to assess whether those adaptations resulted in better participant feedback.

Long-term goals

Our long-term goal is to collaboratively investigate and develop metrics which could allow us to measure and monitor the long-term effects of our training. Feedback forms and satisfaction surveys immediately after training are useful and help to assess the overall quality of the sessions delivered. However, the ultimate goal of any RDM training should be the improvement of researchers’ day-to-day RDM practice. Is our training really having any effect on this? In order to assess this, different kinds of metrics are needed, which would need to be coupled with long-term follow-up with participants. We decided that any ideas developed on how best to address this will also be gathered in the OSF, and we have created a dedicated space for this work in progress.

Reflections

When reflecting on the work we did together, we all agreed that we were quite efficient. We started in June 2017, and it took us two joint catch-up calls and a couple of email exchanges to develop and agree on joint metrics for assessing RDM training. Time will show whether the resources we have created will help us meet our goals, but we all felt that during the process we had already learned a lot from each other by sharing good practice and experience. Collaboration turned out to be an excellent solution for us. Our discussions are open to everyone to join, so if you are reading this blog post and would like to collaborate with us (or to follow our conversations), simply sign up to the mailing list.

Resources

Mailing list for RDM Training Benchmarking: http://bit.ly/2uVJJ7N

Project space on the Open Science Framework: https://osf.io/nzer8/

Mandatory and optional questions: https://osf.io/pgnse/

Space for sharing training materials: https://osf.io/tu9qe/

Anonymised feedback: https://osf.io/cwkp7/

Space for developing ideas on measuring long-term effects of training: https://osf.io/zc623/

Authors (in alphabetical order by surname):

Cadwallader Lauren, Higman Rosie, Lawler Heather, Neish Peter, Peters Wayne, Schwamm Hardy, Teperek Marta, Verbakel Ellen, Williamson Laurian, Busse-Wicher Marta

Illustrated London News visualisation

Discussing digital scholarship at the second Digital Humanities Library Lab

This month I delivered the second Digital Humanities Library Lab, a hands-on showcase of digital library collections and tools created to enable innovative research using computational methods. This three-hour session followed on from a previous event I ran in March and concludes a short run of events that form part of DH@Manchester.

The aim of the workshop was to inspire researchers at all levels to gain practical experience with tools and techniques in order to go on to develop individual research projects with these or similar collections. Participants did not need any technical experience to join in, other than basic office and web browsing skills. The workshop plan and instructions are available online.

What projects and collections did we look at?

The three activities focused on image searching, analysing text and analysing colour. We looked at projects including the following.

  1. Broadside Ballads Online from the Bodleian Libraries (University of Oxford), a digital collection of English printed ballad-sheets from between the 16th and 20th centuries that includes a feature to search for an image within an image. The collection includes digitised ballad-sheets from The University of Manchester Library’s Special Collections following work by visiting researcher Dr Giles Bergel with the John Rylands Research Institute.
  2. JSTOR Text Analyzer from JSTOR Labs, a beta tool which will identify what any document you give it is about and recommend articles and chapters from JSTOR about the same topics.
  3. Robots Reading Vogue from Yale University Library’s Digital Humanities Lab, a collection of tools to interrogate the text within the entire U.S. Vogue Archive (ProQuest) and its front covers, such as a topic modeller, N-gram viewer and various colour analysis methods.

While developing this workshop, I created a project of my own to visualise the average colour used in the front covers of all full-colour issues from Illustrated London News (Gale Cengage). Just a few short Python scripts were required to extract this information from the collection and display it in an interactive web page. This allowed us to look for trends with particular hues, such as the more common use of reds on December issues.
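To give a flavour of what those scripts involved, here is a minimal sketch of the core step (not the actual project code, and assuming a hypothetical local folder of cover images): it computes the mean RGB value of each cover, which can then be plotted against issue date to reveal trends such as the December reds.

```python
# Minimal sketch: average RGB colour of each cover image in a folder.
# The "covers" folder and filenames are hypothetical; the real scripts
# worked against the licensed Illustrated London News collection.
from pathlib import Path

import numpy as np
from PIL import Image

def average_colour(path):
    """Return the mean (R, G, B) of an image, as 0-255 integers."""
    pixels = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    return tuple(int(round(c)) for c in pixels.reshape(-1, 3).mean(axis=0))

for cover in sorted(Path("covers").glob("*.jpg")):
    print(cover.name, average_colour(cover))
```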

Digital Humanities Second Library Lab summary

What did we learn?

After each activity we discussed some of the issues raised. (Incidentally, I captured key points on a Smart Kapp digital flipchart or smart whiteboard, continuing the “Digital First” principles that Library colleagues are adopting.)

  • Image analysis and computer vision have many potential applications with library collections, such as identifying where printed or handwritten text occurs in an image, facial recognition, and detecting patterns or differences between different editions or issues within a series.
  • For image analysis systems to work best, the image sets and algorithms will need to be carefully curated and trained. This is a time-consuming process.
  • The text analyser worked quite well but, as with the image search, was not perfect. It is important to find out precisely what “goes wrong” and why.
  • Other applications for the text analysis tool include checking your grant application for any gaps in topics you think should be covered, for checking your thesis development, or for lecturers to check their students’ use of references in submitted papers.
  • Being able to visualise an entire collection in one display (and then dive into the content) can give one an idea of what is there before selecting which physical item to go to the trouble of visiting and retrieving. Whitelaw (2015) suggests that such “generous interfaces” can open up the reader to a broader, less prescriptive view into a collection than the traditional web search.
  • It could be more useful to be able to compare different collections or publications against each other. This can be difficult when multiple licence holders or publishers are involved, with different technical or legal restrictions to address.
  • Programming or other technical skills would need to be learned in order to develop or apply many tools. Alternatively, technical specialists would need to work in partnership with researchers, perhaps utilising the University’s Research IT service or the Library’s Digital Technologies & Services division.

Summary

Digital or computational tools and techniques are increasingly being applied to arts, humanities and social science methods. Many of the collections at The University of Manchester Library have potential for stimulating interdisciplinary research. Such Digital Scholarship projects would often require a greater level of technical knowledge or skill than many research groups might currently possess, so further training or provision for technical support might be necessary.

References

Whitelaw, M. (2015). ‘Generous Interfaces for Digital Cultural Collections’, Digital Humanities Quarterly, 9(1) [Online]. Available at: http://www.digitalhumanities.org/dhq/vol/9/1/000205/000205.html (Accessed: 25 May 2017)

Going Green: a year of supporting the HEFCE OA Policy

A year since HEFCE linked the mandatory deposit of accepted manuscripts to the allocation of QR funding, this post describes how the Library’s Scholarly Communications Team has helped academic colleagues successfully adapt to the new ruling, which has resulted in an unprecedented level of Green OA deposits to our repository.

As we entered 2016, developing support services for the incoming policy was the team’s highest priority. The biggest obstacle was a very low Green OA baseline: from 2013 to early 2016, the number of Manchester-authored papers available through pure or hybrid Gold OA journals rose steadily; yet annual Green OA deposits to our repository stalled at ten percent of the institution’s journal article output over the same period.

During a pilot the previous year, a variety of support models were tested with selected schools and we found that the highest compliance was achieved when the responsibility on authors was minimal and the Library offered to deposit and set embargoes on their behalf. So in February we made a proposal to develop a fully mediated service, which was accepted by the University’s senior research governance committee. The task then was to reorient our existing Gold APC service to encompass Green OA workflows that scaled to enable our team of six staff to support the deposit of ~6,000 papers a year.

To allow authors to send us their manuscripts we created an authenticated deposit form (branded as the Open Access Gateway) with just two input fields: acceptance date and journal/proceedings title, plus a drag and drop area to attach the file. Authors could also use the form to request Gold OA payment if they acknowledged a grant from a qualifying funder.

In the months before the policy launched we worked closely with Library marketing colleagues to deliver a communications campaign which raised awareness of the policy and built momentum behind the new service. Our message to academics was, ‘Just had a paper accepted? Now make sure it’s REF eligible.’

OA A5 flyer cover

In April the policy launched, our deposit form went live, and the firehose to the University’s publications opened. Early signs were promising; in the first weeks, we received roughly ten manuscripts per working day which represented a significant increase against our baseline. However, more persuasion was needed for those authors across campus who weren’t sending us their manuscripts. We therefore began to chase those authors via email and sent a follow-up email, copying in a senior administrator, if we had no response.

We particularly focussed on non-compliant papers with the highest altmetric scores which had an increased likelihood of being selected for REF. The chaser emails were effective and many authors replied immediately with the manuscript attached. Of course, our emails also prompted some authors to ask questions or offer opinions about the policy which required additional resourcing. 

Sending chaser emails significantly raised the institution’s compliance rates, and it was clear that we would need to continue to do this systematically as awareness of the policy gradually spread across campus. This placed additional strain on the team as searching Scopus for papers and chasing authors proved an intensive process. We explored alternatives (e.g. using the ORCID API to identify our new papers) but no automated process was as effective as the painstakingly manual check of Scopus search results.

By August we’d refined our reporting to the University to include school/division level compliance against projections. To achieve this we recorded the compliance status and author affiliations of every single University paper falling within the scope of the policy in a masterfile spreadsheet. We then used SciVal to calculate the average output of the REF-eligible staff from each of the 33 schools/divisions over the past five years. This enabled us to project how many accepted papers we would expect each school/division to have per month during the first twelve months of the policy. Every month we produce a compliance report, like the one below, which supports standing agenda items at research governance committee meetings across the University.
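The projection arithmetic itself is simple; the sketch below uses made-up figures (not our actual SciVal or masterfile numbers) purely to illustrate the idea:

```python
# Minimal sketch of the projection logic, using illustrative figures:
# average annual output per school (from SciVal) is divided by twelve to
# give an expected number of accepted papers per month, and recorded
# compliant papers are reported against that expectation.
five_year_output = {"School A": 1200, "School B": 450}   # papers, illustrative
compliant_to_date = {"School A": 55, "School B": 20}     # from the masterfile
months_elapsed = 4                                       # months since the policy launched

for school, total in five_year_output.items():
    expected = total / 5 / 12 * months_elapsed
    compliant = compliant_to_date.get(school, 0)
    print(f"{school}: {compliant} compliant vs {expected:.0f} projected "
          f"({compliant / expected:.0%})")
```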

 

Monthly REF OA Faculty/School compliance report

 

As we moved into the new academic year, monthly deposits continued to rise. The team were at maximum capacity processing incoming manuscripts, so to speed up the flow of papers through our assembly line we purchased a license for the online forms service Typeform and developed a back office ‘If-This-Then-That’ style form. This allowed us to distil esoteric funder and publisher information into a simple workflow enabling papers to be processed with reduced input from higher grade staff.
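To illustrate the ‘If-This-Then-That’ idea, here is a rough sketch of the kind of conditional routing the form encoded; the conditions and actions are invented for the example and do not reflect our actual decision table.

```python
# Illustrative sketch only: route an accepted manuscript based on simple
# funder/publisher conditions. Thresholds and actions are hypothetical.
def deposit_route(funder_requires_gold, apc_funds_available, embargo_months):
    if funder_requires_gold and apc_funds_available:
        return "Arrange Gold OA payment and check licence requirements"
    if embargo_months is None:
        return "Escalate: embargo period unknown"
    if embargo_months <= 12:
        return "Deposit accepted manuscript and set publisher embargo"
    return "Deposit manuscript, set embargo and flag for funder policy check"

print(deposit_route(funder_requires_gold=False, apc_funds_available=False, embargo_months=6))
```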

Now, twelve months on from the introduction of probably the most swingeing OA policy yet, we can take stock of how well we have adapted to the new ruling. In the absence of a Snowball-type metric for measuring our compliance, we currently track compliance in three ways:

  • % top papers compliant: In January, the University ran its annual Research Review Exercise, involving research staff proposing their best post-2014 papers for internal review. This provided the first opportunity to gauge compliance of those papers with the highest chance of being returned in the next REF. During the exercise, a total of 1360 papers within the scope of the policy were proposed for review; of these, a very encouraging 95% were compliant with the new OA requirements.
  • % compliant against projections: Our chosen metric for reporting compliance to the University. We report the proportion of compliant papers with at least one REF-eligible author against the total number of papers we would have expected to have been accepted by all our REF-eligible staff. Against this measure, 68% of the papers are compliant, 7% are non-compliant, and 25% are not currently recorded. Many of these unrecorded papers will not yet be published, so we will chase them once they are indexed in Scopus. A large number of our papers arising from the ATLAS/LHCb collaborations are also available via Gold OA and are compliant, but we have not yet recorded them in our masterfile.
  • % compliant overall: To date, we’ve recorded 4656 papers of which 4031 (87%) are compliant, 459 (10%) are not compliant, and 166 (3%) are being actively chased by our team.

In total there’s a 55% Green OA/45% Gold OA split, and given that Green OA represents more inconvenience than most of our academic colleagues unfamiliar with arXiv have ever been willing to tolerate, it is very unlikely indeed that the University would have achieved such high compliance had the Library not provided a mediated Green OA deposit service. The data confirms our approach helped make Green Open Access an organisational habit practically overnight. 

The approach has come at a cost, however; over the past year, supporting the HEFCE OA policy has taken up the majority of the team’s bandwidth, with most of our 9am-5pm conversations being in some way related to a paper’s compliance with one or more funder OA policies.

Now that our current processes have bedded in, and in anticipation of the launch of the new UK Scholarly Communications License (UK-SCL) – for more on this read Chris Banks’s article or watch her UKSG presentation – and further developments from Jisc, we hope that over the next 12 months we can tilt the balance away from this reductionist approach to our scholarly output and focus on other elements of the scholarly communication ecosystem. For example, we are already in discussions with Altmetric about incorporating their tools into our OA workflows to help our academics build connections with audiences and are keen to roll this out soon – from early conversations with academics we think this is something they’re really going to like.

Whatever lies in store, it’s sure to be another busy year for the team.

Exploring digital collections at the first Digital Humanities Library Lab

A new pilot workshop, the first Digital Humanities Library Lab, ran on 3 March 2017. This engaging and informative cross-discipline event offered a dozen researchers the chance to explore and discuss new tools and digital text collections from The University of Manchester Library, inspiring the development of future Digital Humanities computational research methods.

Exploring digital collections

The afternoon comprised three activities.

  1. Spelling and printing variations when searching Jisc Historical Texts
  2. Visualising themes in longform scholarly outputs using the JSTOR Topicgraph tool
  3. A UK first: beginning to use an API to access previously unavailable content from Adam Matthew Digital’s Mass Observation

The workshop instructions are available online for all to view, and the Library is looking to run a similar event again in May. What would you like to see covered next? Please get in touch with DH@Manchester or the Library’s DH Project Officer Phil Reed directly, or leave a comment below.

Support and seedcorn funding for Faculty of Humanities researchers

The Digital Humanities Project Call 2016-2017 has just been announced. This year DH@Manchester are focusing on developing new projects in two specific areas:

  • innovative projects arising out of the Library’s extensive electronic collections
  • cutting-edge research which can be developed in partnership with colleagues in the School of Computer Science (including text mining, linked data, image processing, and data visualization).

The closing date is Wednesday, 22 March 2017. View the Project Call page for more information.

Supporting student publishing: perspectives from the University of Manchester and beyond


The Manchester perspective, Part 1

For the past couple of years we’ve been giving some thought to the role of university libraries in publishing, in common with other libraries. However, the University of Manchester is home to Manchester University Press (MUP), one of the largest university presses in the UK, so we’ve had to think carefully about how to work collaboratively to make best use of our respective expertise and resources in order to meet the University’s strategic objectives. Our initial thinking and work started in 2014 as part of the Library’s strategic programme, with follow-on projects funded by the University’s Centre for Higher Education Research, Innovation and Learning (CHERIL).

When we started our thinking, we expected that the outcome would likely be some kind of publishing support service, using Open Journal Systems (OJS) for hosting. To develop a tangible offer, we had many discussions about which parts of the support service would naturally sit with the Press and which in the Library, and even more about funding and sustainability. To date, our collaboration has resulted in:

  • development of Manchester Open Library as an imprint of MUP,
  • launch of the James Baldwin Review,
  • development of a student journal for the Manchester Medical School, and
  • development of 3 online learning resources on ‘publishing’,

but not in the publishing support service we originally envisaged. Instead we most recently considered offering a model that we believed would be sustainable with a low level of support, a multi-disciplinary undergraduate journal managed by a postgraduate editorial team. However, when we ran this idea past senior staff from our Humanities faculty and with responsibility for postgraduate researchers (PGRs), there was little appetite for supporting any type of student journal, and since the Library and the Press aim to support the University in achieving its strategic goals we have parked this idea, for now. That said, we do still see value in students experiencing publishing either as authors or as part of an editorial team, which is why we decided to harness the expertise of our Press in the development of online learning modules which anyone on campus with an interest in publishing can access and learn from.

From what we hear about other institutions it seems that our experience is at odds with current trends in support for student publishing, ie, there appear to be many examples of libraries, academics and university presses launching student journals. We’ve been keen to understand if the challenges that have limited our service development are unique to Manchester and to learn more about how other institutions are providing support for student journals. So, as part of our latest CHERIL-funded project (Publishing Research and Learning for Students – PuRLS), we recently held a one day conference on student publishing. We wanted to bring together institutions with experience of student publishing or an interest in student publishing so that we could all learn from each other. The event, held on 16th January 2017, brought together a mixture of librarians, publishers, academic staff, administrative staff and students.

Libraries supporting student journals

Our contributors from the universities of Surrey, Warwick and Edinburgh, and Leeds Beckett University described their involvement with student journals. In all cases journals are run on OJS. At Edinburgh and Warwick, the libraries offer journal hosting services which publish both student and academic-level journals.

Although Edinburgh has a university press, the Library developed the hosting service independently. Angela Laurins, Library Learning Services Manager, explained that the service developed organically and is now well established, providing only set-up support for new journals; thereafter, journals are managed by their own editorial teams. Angela confirmed that this model works well, with minimal resource requirement. In fact, it works so well that she no longer requires a named academic champion for established journals if the previous champion moves on.

Warwick’s service is a more recent development, building on two journals already developed within academic departments, further interest from other areas in new journals, and the skills and resource available within the Library to develop and manage journals using OJS’s externally hosted option. Yvonne Budden, Head of Scholarly Communications, talked about two multi-disciplinary journals, Reinvention and Exchanges.

Reinvention, an international journal, is student-led and student-run, with academic support. The main resource requirement is in maintaining high quality. Academic staff carry out peer review and help students improve the standard of their work. Reinvention has received over 460 submissions and published approximately 130 articles. Submissions are split fairly evenly between disciplines and also come from a number of different countries. Yvonne explained that the value the library can bring to publishing is in part “things libraries are known for being good at”, eg, advising on open access, ISSNs, copyright, DOAJ registration, digital preservation, analytics. [presentation-slides-warwick-jan-2017]

Charlotte Barton, an Information Literacy Librarian, talked about her role in supporting the Surrey Undergraduate Research Journal (SURJ). The interdisciplinary journal is published by the Learning Development Team, which comprises librarians and learning advisors, and accepts work marked as 2.1 or higher, as well as reflective accounts, conference reviews and literature reviews. The editorial team is made up of academic staff and PGRs – PGRs stay in this role for a maximum of one year (two journal issues) and carry out peer review as well as other editorial tasks.

Charlotte explained that supporting prospective authors is time-intensive (1-1 support is provided by the SURJ team) but as submission rates are currently low (10 per issue) further work needs to be done on promoting the journal to academic colleagues. Future plans also include working with academic staff to develop training materials, eg, to improve writing skills. [presentation-slides-surrey-jan-2017]

Kirsty Bower, an Academic Librarian at Leeds Beckett University, described how the interest in setting up journals at her institution resulted from Open Access (OA) requirements for the next Research Excellence Framework (REF) and likely requirements of the Teaching Excellence Framework (TEF). An existing Sociology UG journal, Critical Reflections, was moved onto OJS in 2016 following a discussion with the lead academic, who was keen to increase visibility after producing a number of print issues. The journal publishes pieces produced in a third-year module, in which students apply their sociological knowledge to real-life situations, and students are involved in the editorial process. Kirsty reported that, despite limited promotion, downloads have surpassed expectations, although she acknowledged that it isn’t clear who the readers are. Although the Leeds Beckett team face similar challenges to other institutions (eg, limited staffing resource, limited funding for promotion), they are considering developing a multi-disciplinary journal. [presentation-slides-leedsbeckett-jan-2017]

Presses supporting student publishing

Our speakers, from UCL Press and White Rose University Press (WRUP), are at very different stages of developing their services for students.

Publishing Manager Lara Speicher explained that at UCL Press student journals are hosted on OJS but run by their own teams, as long as they have support from their academic department. Proposals for new journals are not considered without agreement of faculty support – this commitment is vital as UCL Press is too small to provide high levels of support to students. Lara highlighted that it can be difficult to explain the difference between a hosting service and a publishing service, and explained that one journal had expected more ‘hand holding’ from the Press. Providing support for students ties in with UCL’s Connected Curriculum, which brings research into learning. UCL Press have recently appointed a new journals manager who has plans for further support, eg, creating a forum for journal teams to meet and share experiences and delivering workshops on the publishing process. [presentation-slides-uclpress-jan-2017]

Tom Grady, Acting Press Manager, told us that WRUP launched in 2016 with the aim of publishing academic journals and books, so when the first journal proposal received was for a student journal there were some concerns. These included whether publishing a student journal would undermine the Press’s aspiration to become a reputable academic publisher, how sustainable a student journal would be, and who would read a student journal. Having since overcome these concerns, the Press has recently launched the Undergraduate Journal of Politics and International Relations, which has an academic lead and funding sources, represents a gap in the market, and gives students the opportunity to be published authors or to be part of the editorial team. [presentation-slides-wrup-jan-2017]

The Manchester perspective, Part 2

We invited a number of speakers connected with the University of Manchester to contribute to the event, to increase awareness of potential challenges or opportunities for institutions considering dissemination of student research as a means to enhance the student experience.

The key driver when we were considering supporting a student journal came from the Manchester Medical School, and particularly from a group of students, including Josh Burke. Josh explained that one reason for wanting to set up a journal was that medical students get points, which count in applications for their first post, for publishing work in journals indexed in PubMed. The group believed that they could set up a journal themselves but sought support from academic staff, who put them in touch with us. We provided access to OJS and publishing expertise from MUP; the students developed a staged peer review system and brought a lot of energy to the initiative, which resulted in the launch of Manchester Medical Journal (MMJ) in late 2016. MMJ is student-led and student-run. Josh admitted that using OJS was a pain point, as the peer review system developed doesn’t work easily within the OJS workflows, and that the student group had been naïve about the complexity of setting up and running a journal, needing academic support, publishing guidance and financial support. With the backing of the Medical School and continued investment of the group of students who initially set up the journal, MMJ seems likely to have a future. However, the main challenge is convincing students to publish in a new journal that isn’t indexed in PubMed. [presentation-slides-burke-jan-2017]

A similar view is shared by senior academic and administrative staff at Manchester, particularly in relation to PGRs. We asked Professor Maja Zehfuss, Associate Dean for PGR in the Faculty of Humanities, to outline this position at the event. The key points she made were that at Manchester institutional journals are not considered to be right for PGR publications, that PGRs should be seeking to publish papers of at least 3* ranking in ‘grown-up’ journals, that submitting papers to established journals provides a tough learning experience for PGRs which develops resilience and skills, and she queried what student journals are for and who reads them.

Of course, journals are only one means of scholarly communication, and at Manchester academic staff are incorporating different forms within their modules. Dr John Zavos, a course leader from Religions and Theology, explained that he was keen on openness in research and wanted to develop resources that would put his students’ work in the public domain, eg, ‘Poppy Hijab’, an exhibit on the Museum of the South Asian Diaspora blog. John is now leading a CHERIL-funded project exploring impactful public-facing platforms and hopes to incorporate editorial management of a blog into his Level Two course to provide further opportunities for publishing experience.

To conclude the event Simon Bains, our Deputy Librarian and Head of Research Support, and Meredith Carroll, Journals Manager from MUP, described our experience, which is summarised in the first part of this piece.  [presentation-slides-manchester-jan-2017] For now, our support for student publishing takes the form of a recently-launched blog, The Publishing Exchange, to encourage reflection and learning, and My Research Essentials online resources, all available under the CC-BY-NC licence:

A new professional’s view: UKSG Annual Conference 2015

Glasgow was the venue for the 2015 UKSG Conference and Helen Dobson and I headed north hoping for a repeat of the glorious weather in 2012. We were welcomed by torrential downpours and strong winds, but still had a great time. The UKSG conference is a good opportunity to hear current debate and learn about innovative practice on the issues affecting libraries and scholarly communications, and a valuable forum for meeting publishers. I was lucky enough to win a sponsored place to attend this year – thanks, Springer and Sage!

Once again, Open Access (OA) was a key topic so there was much to interest us in the plenary and breakout sessions. We were stimulated by Geoffrey Bilder’s introduction, which recognised the pressure to publish facing authors, and interested in the notion that universities might get better results from their researchers by measuring less and demanding fewer papers. Bilder commented that ‘the primary motivation [for publishing research] is to get credit for stuff, not to document it or provide evidence’, a situation he described as ‘dispiriting’, comparing citations to ‘a scholarly form of the Like button’.

“Citations are a scholarly form of the Like button.”

Throughout the conference, Helen and I were keen to attend events addressing issues and problems which we are all facing, including OA workflows, offsetting deals, and how to build trust in new metrics. The University of Manchester Library is already engaging with these issues on a local and national level so we were interested in the experiences and views shared, and particularly in hearing more about longer-term solutions such as JiscMonitor and the experiences of the institutions involved in the Jisc-ARMA ORCID project.

Counting the costs of open access

One of the most useful breakout sessions was the panel discussion on ‘Engaging researchers on stakeholder perspectives,’ which prompted some lively debate and much follow-up thought and questions. Summarising the recent Research Consulting report on Counting the costs of Open Access, UCL’s Paul Ayris spoke of the high up-front costs faced by institutions until Open Access workflows and tools become more sophisticated, with smaller or less research-intensive institutions bearing a disproportionate cost burden.

Robert Kiley provided an update on The Wellcome Trust’s stance on Gold OA, emphasising that the Trust remains happy to fund Article Processing Charges (APCs) but only if publishers will honour agreements. We know from our own experiences that papers we have paid APCs for aren’t always OA or published under licences required by funders. Kiley reported that 34% of Wellcome Trust-funded APCs do not have the required CC-BY licence, and 13% have not been deposited into PubMed Central. This means that £0.5 million of OA funding has been spent on papers that are not compliant with Wellcome Trust’s OA policy.

Elsevier’s Alicia Wise was also on the panel and explained that the cost of APCs has been reduced for some Elsevier titles. We confess we haven’t really noticed this to date, as the average cost of our Elsevier APCs for 2014-2015 stands at around £1,969, higher than our 2013-2014 non-prepayment deal figure of £1,909. Alicia stated that Elsevier don’t double-dip but explained that it is difficult to evidence this due to confidentiality clauses, and difficult to find a cost-based pricing model acceptable to both librarians (who favour calculations based on quantity of content) and publishers (who base calculations on journal value).

The interactive session on OAWAL, Open Access Workflows in Academic Libraries, offered reassurance that different institutions experience similar problems to those we face. Speakers Graham Stone from Huddersfield and Jill Emery from Portland State emphasised that the principle of OAWAL is to gather a wide range of methods and experiences; not to enforce one ‘right’ approach but to allow librarians to tailor workflows to suit their institution. However, I felt that the group activity of identifying challenges, ideal scenarios and possible solutions suggested that a collective approach can highlight what needs to change around OA to streamline processes. The Jisc opeNWorks project we are leading has already highlighted to us the need for community solutions at a regional and national level. As we prepare for the next major challenge of HEFCE compliance, we see that the issues we face are at a scale greater than an individual institution, meaning solutions will be found within the community, not in isolation.

Networking with publishers at UKSG

It was useful (and fun – excellent cupcakes, Oxford University Press!) to catch up with publishers with whom we have existing institutional deals, and with whom we may consider arranging a deal in the future, asking questions about how these function and clarifying workflows. We discussed an institutional deal being developed by OUP and saw the pilot dashboard designed to simplify processes for our authors, and we dropped by the Taylor & Francis stand to ask for clarification on how this publisher’s offsetting deal will work for authors. It was also really positive to hear publishers such as BMJ and IEEE considering how they can help institutions with HEFCE requirements, for example, by supporting Jisc’s Publication Router or providing authors with the AAM in acceptance emails; we would love to see more examples of this.

Invisible Humanities?

The conference ended strongly, with its closing talks as engaging as its opening salvo. As always, it was a joy to hear the articulate and passionate Open Access advocate Martin Eve speak on how ‘mega journals’ might be funded to address the risk of the Humanities becoming invisible to the public through competition with the Sciences for funding (true of subscription costs as well as APCs). The University of Manchester Library supports the Open Library of Humanities and we look forward to seeing further development.

We left Glasgow reflecting on how far we’ve come in addressing OA challenges at an institutional level to date and feeling excited that we’re involved in tackling remaining problems via the Jisc Pathfinder projects, an RLUK group focusing on publisher OA workflows, and by participating in a UKSG seminar on offsetting schemes in late May.

Liquid Lunch Business Data Service

Lifting the curtain on specialist business and financial databases

Liquid lunches

Several times each year, The University of Manchester Library staff are invited to take part in a “Liquid Lunch”.  Held at lunchtime, as you’d expect, this is a chance for staff to hear from speakers from around or outside the University on a variety of different perspectives and new ideas.  Those attending bring their own sandwiches and drinks are available.

Most speakers are external to the Library, but in March my colleagues in the Business Data Service (BDS) and I had a chance to present to the group.  The topic?  Specialist business and financial databases: Behind the curtain.

Specialist financial databases visualised as a tube map

With 30 interested colleagues from across the Library in attendance, we were happy to share some introductory information about how we help users of business databases get access to the data they need.

Xia Hong described the different coverage areas of BDS, including identification of existing Library resources, investigation of new resources, management of databases, and advice and best practice on working with large datasets.

Phil Reed showed the topological tube map he’d made, explaining that some databases can only be accessed from certain computers in the Library, some can be accessed anywhere on campus at Manchester, and still others can be accessed from anywhere with an Internet connection. With over 50 databases covering business and management, choosing the right one is not always straightforward, so we work with users to identify the most appropriate database for their needs.
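To make that triage concrete, here is a minimal sketch in Python of how databases might be tagged by access level so that a user’s circumstances narrow the choice. This is our illustration rather than the real BDS catalogue or Phil’s tube map, and the database names and access tiers shown are assumptions for the example only.

    # A minimal sketch: tag databases by access level and filter by what a user can reach.
    # The catalogue entries below are illustrative, not the real BDS holdings.
    from enum import Enum

    class Access(Enum):
        LIBRARY_PCS_ONLY = "specific computers in the Library"
        ON_CAMPUS = "any on-campus computer"
        ANYWHERE = "anywhere with an Internet connection"

    DATABASES = {
        "Bloomberg": Access.LIBRARY_PCS_ONLY,            # assumed: dedicated terminals only
        "Example campus-licensed database": Access.ON_CAMPUS,
        "Example web-based database": Access.ANYWHERE,
    }

    def databases_for(available_tiers):
        """Return the databases reachable from the access tiers available to a user."""
        return sorted(name for name, tier in DATABASES.items() if tier in available_tiers)

    # A student working from home only has the 'anywhere' tier available.
    print(databases_for({Access.ANYWHERE}))

The real service involves many more databases and much more nuanced advice, but even a simple mapping like this captures the question we answer most often: which databases can this user actually reach from where they are?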

We were joined by our colleague Jane Marshall, Academic Engagement Librarian for Manchester Business School (MBS). Jane is the first point of contact for our academic colleagues, and she described her work arranging curriculum-linked library and database training.

Earthquakes and equity

Brian Hollingsworth gave a demonstration of Bloomberg. He showed some of the most-used areas, such as share price data, alongside some unexpected areas of information, such as earthquake data. Brian holds Equity Essentials certification in the Bloomberg Essentials Training Program, and it was easy to see the depth of his knowledge about what Bloomberg offers and how to access it. Students who pursue certification will find that it builds their skills in working with Bloomberg and may also make them more attractive to prospective employers.

As the newest member of the team, having joined in January, what did I do?  I posed example questions from students, academic staff and non-academic staff; these questions were answered by the others in the team.  Knowing the kinds of questions that come to us helped our colleagues understand more about what we do.

Engaging with our colleagues

Initial feedback on the presentation was positive. Some of the questions posed, such as how we evaluate our services and how the MBS redevelopment will enhance them, give us good food for further thinking and development. Our colleagues were enthusiastic about Bloomberg and all the information it holds; we may see many new users coming to our Bloomberg terminals!

Planning and delivering the presentation gave our team a chance to consider what we would like our colleagues to know that they might not already. We would recommend this type of presentation to other teams at Manchester and beyond. What might be hiding behind a curtain near you?


Open Access Advocacy

Off we go, Lucy and I, out into the Yorkshire cold to attend an Open Access (OA) advocacy event held at The University of Bradford. We are warmly welcomed to a great afternoon with engaging speakers, and a fun exercise from the man behind the OA innovations at the University of Huddersfield, Graham Stone.

Deadly diseases and healthy profits

Our first speaker was Professor Charles Oppenheim, who shared an overview of OA and its importance for academic libraries. He opened with some punchy headlines (which had all the delegates murmuring in their seats) about the monopoly of publishers and their reluctance to share scholarly work for free, using the Ebola crisis as an example: some publishers had been withholding vital research on the subject unless subscriptions and fees were paid, stating that even with a terrible crisis developing there was still the need for a ‘healthy’ profit margin. This led very nicely to his second headline, that Elsevier had made a bigger profit margin in the past 12 months than Google! Despite these alarming headlines, he did emphasise that he was not anti-publisher, but that there were still some important guidelines that needed to be solidified, with the help of government mandates and institutional gumption!

Slide: To return to the role of libraries

Professor Oppenheim described how the uptake of OA has been slow and hesitant over the past few years, with only 20-30% of research output in the country being OA. He highlighted the responsibility of funders and institutions to encourage authors to engage with OA. This could be through incentives, as well as a few sticks, or, to quote our second speaker Nick Sheppard directly:

‘Carrots don’t work, please give me a stick.’

Slide: What I learned

Nick Sheppard and Jennie Wilson from Leeds Beckett summarised their technical challenges and new workflows. Nick took us through the difficulties of advocating green OA in response to the HEFCE announcement, and the leap institutions must make to embrace new software and repository infrastructure. He went on to highlight the importance of social media and altmetrics in drawing attention to OA within the academic world, and its impact on citations.

Slide: Other tools & network effects

Jennie gave the audience a glimpse into the new pressures she has dealt with. Leeds Beckett didn’t receive any RCUK funding, so her team had to come up with innovative strategies to encourage authors to engage with OA; she explained their use of social media to promote services and encourage authors to deposit in their repository. Their use of LibGuides really had the room buzzing. Jennie explained how they used Twitter feeds about hot topics, such as World Diabetes Day, to capture articles relevant to the discussions, and ran a rolling feed on their LibGuides site. This turned out to be an effective incentive for authors to deposit their papers, as well as a way to showcase research taking place at their institution, a ‘win-win’ all round!

Slide: Using Symplectic & LibGuides

Our final speaker was the enthusiastic Graham Stone. We were introduced to the OAWAL project (pronounced like the bird!), a new initiative sourcing workflows and best practices for the OA community, which aspires to become the ‘go to’ place for managing OA in institutions.

Slide: What is OAWAL?

Graham then led an exercise to highlight the negatives and positives we face in the OA world. In groups we worked out ways of resolving the issues and picked out our top three priorities. In our group the negatives were things like a lack of consistency among publishers, and limited staffing and money. We did find some good positives, such as strong mandates, buy-in and enthusiasm, and high-profile support and advocacy. We came up with a few solutions, such as a more collaborative approach, more mandates and the use of ‘sticks’. Our top three priorities were: 1) a more collaborative approach supported by mandates; 2) publisher consistency; and 3) encouraging academics to refuse to carry out peer review for publishers that don’t allow authors to comply with funder policies.

This exercise was useful as it highlighted that everyone seems to be dealing with the same issues and the same pain points, and that there is a community out there who can provide advice, personal experience and, hopefully, a network for best practice and standards. Developing communities and sharing experience is also a focus for Manchester as the lead institution on the opeNWorks Jisc Pathfinder project.


Lucy and I really enjoyed the session and thought the choice of speakers was well thought out and varied. There were a few questions and answers and an opportunity to network with colleagues in similar roles, so all in all it was a very useful afternoon.