Awareness of Open Access (OA) and Open Data has increased substantially over the last few years, with new mandates and funder policies raising the level of OA at The University of Manchester to 75% for 2016-17. Whilst this is a huge improvement on historic levels of approximately 10% Green OA, the emphasis on compliance with funder requirements has meant that many of the underlying reasons for working openly can be forgotten, presenting a risk that OA starts to be seen as just another box to tick. For Open Research to become the norm across academia, major cultural change is required, and most researchers require strong incentives to make that change. To help counter the focus on compliance, the Library is hosting an Open Research Forum at the Contact Theatre on Thursday 26 October, as part of Open Access Week 2017.
In Classical times the forum was a place where news was exchanged and ideas thrashed out, and it is that spirit of open debate which we are hoping to capture through the Open Research Forum. We have a great selection of researchers lined up from across the University who will be speaking about the issues, challenges and benefits of openness, and what it means to be an ‘open researcher’. In keeping with Open Access Week 2017, the theme for the event is ‘Open in order to…’, focusing on the practical outcomes of working openly. Topics include preprints, OA as part of wider public engagement, and newly emerging data labs which actively re-use data created by other researchers.
The Library as a Broker
Whilst the Library is coordinating the event, it will be researcher-led and -focused, with a series of slide-free, story-based talks from academics complemented by interactive activities and discussion. Our speakers represent a range of disciplines and we hope to capitalise on the Library being a ‘neutral’ space on campus to encourage exchange from across the Schools. Speakers and participants are encouraged to be honest about their experiences with, and ideas about the future of, open research. We hope that by bringing researchers together to focus on open research without reference to mandates or policies we can help facilitate a more inspiring and substantive discussion on the opportunities and consequences created by researching in an open manner.
Learning from each other
As service providers in a central cultural institution, it’s easy, in our enthusiasm for this new mode of scholarly communication, to get lost in the mechanics of how to make research open and to lose sight of how these changes affect researchers’ day-to-day lives. As organisers, then, we are hoping to learn a great deal from our speakers so we can make our services more relevant. The speakers are all actively ‘open researchers’ in different ways, so we hope that other researchers can learn from their example and be inspired.
We are involved in an international collaborative project to assess the quality of Research Data Management (RDM) training across institutions. This post reports on the progress of the project so far; it originally appeared on the project blog on 6th October 2017.
When developing new training programmes, one often asks oneself about the quality of the training. Is it good? How good is it? Trainers often develop feedback questionnaires and ask participants to evaluate their training. However, feedback gathered from participants attending courses does not answer the question of how good the training was compared with similar training available elsewhere. As a result, improvement and innovation become difficult. So how can the quality of training be objectively assessed?
In this blog post we describe how, by working collaboratively, we created tools for objective assessment of RDM training quality.
In order to assess something objectively, objective measures need to exist. Being unaware of any objective measures for benchmarking a training programme, we asked Jisc’s Research Data Management mailing list for help. It turned out that many resources with useful advice and guidance on creating informative feedback forms were readily available, and we gathered all the information received in a single document. However, none of the answers provided the information we were looking for. On the contrary, several people said they would be interested in such metrics. This meant that objective metrics to assess the quality of RDM training either did not exist, or the community was not aware of them. Therefore, we decided to create RDM training evaluation metrics.
Cross-institutional and cross-national collaboration
For metrics to be objective, and to allow benchmarking and comparison of various RDM courses, they need to be developed collaboratively by a community willing to use them. Therefore, the next question we asked Jisc’s Research Data Management mailing list was whether people would be willing to work together to develop and agree on a joint set of RDM training assessment metrics and a system allowing cross-comparison and training improvement. Thankfully, the RDM community tends to be very collaborative, and this proved to be the case here too – more than 40 people were willing to take part, and a dedicated mailing list was created to facilitate collaborative working.
Agreeing on the objectives
To ensure effective working, we first needed to agree on common goals and objectives. We agreed that the purpose of creating the minimal set of questions for benchmarking was to identify what works best in RDM training. We worked on the basis that this was for ‘basic’ face-to-face RDM training for researchers or support staff, but it can be extended to other types and formats of training session. We reasoned that the same set of questions used in feedback forms across institutions, combined with the sharing of training materials and contextual information about sessions, should facilitate the exchange of good practice and ideas. As an end result, this should allow constant improvement and innovation in RDM training. We therefore had joint objectives, but how could we achieve them in practice?
Deciding on common questions to be asked in RDM training feedback forms
In order to establish joint metrics, we first had to decide on a joint set of questions that we would all agree to use in our participant feedback forms. To do this we organised a joint catch-up call during which we discussed the various questions we were asking in our feedback forms, why we thought they were important, and which should be mandatory in the agreed metrics. There were lots of good ideas and valuable suggestions. However, by the end of the call, after eliminating all the non-mandatory questions, we still had a list of thirteen questions which we all thought were important. This was too many to ask participants to fill in, especially as many institutions would need to add their own institution-specific feedback questions.
In order to reduce the number of questions to be made mandatory in feedback forms, a short survey was created and sent to all collaborators, asking respondents to judge how important each question was (on a scale of 1-5, 1 being ‘not at all important that this question is mandatory’ and 5 being ‘this should definitely be mandatory’). Twenty people participated in the survey. The total score received from all respondents for each question was calculated, and the six questions with the highest scores were selected to be made mandatory.
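The selection step described here is simple enough to sketch in a few lines of Python. The question labels and ratings below are invented for illustration; they are not the survey’s real questions or data.

```python
def select_mandatory(ratings, top_n=6):
    """Sum each question's 1-5 ratings across respondents and
    return the top_n questions with the highest total scores.
    ratings: dict mapping question label -> list of scores."""
    totals = {q: sum(scores) for q, scores in ratings.items()}
    return sorted(totals, key=totals.get, reverse=True)[:top_n]

# Illustrative ratings from three (hypothetical) respondents:
example = {
    "Q1 overall satisfaction": [5, 4, 5],
    "Q2 relevance to role":    [3, 4, 4],
    "Q3 venue quality":        [2, 1, 2],
}
print(select_mandatory(example, top_n=2))
# → ['Q1 overall satisfaction', 'Q2 relevance to role']
```

With the real survey, `ratings` would hold twenty scores per question for each of the thirteen candidates, and `top_n=6` would yield the mandatory set.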
Ways of sharing responses and training materials
We next had to decide how we would share the feedback responses from our courses, and the training materials themselves. We unanimously chose the Open Science Framework (OSF): it supports the goals of openness, transparency and sharing, and it allows collaborative working. We therefore created a dedicated space for the project on the OSF, with a component for the joint resources developed, a component for sharing training materials and a component for sharing anonymised feedback responses.
With the benchmarking questions agreed and the space created for sharing anonymised feedback and training materials, we were ready to start collecting the first feedback for the collective training assessment. This was also a good opportunity to re-iterate our short-, mid- and long-term goals.
Our short-term goal is to revise our existing training materials to incorporate the agreed feedback questions into RDM training courses starting in Autumn 2017. This will give us the first comparative metrics at the beginning of 2018 and allow us to evaluate whether our methodology and tools are fit for purpose, iterating over our materials and methods as needed.
Our mid-term goal is to see if the metrics, combined with shared training materials, can help us identify the parts of RDM training that work best and collectively improve the quality of our training as a whole. This should be possible in mid/late 2018, allowing time to adapt training materials as a result of the comparative feedback gathered at the beginning of 2018 and to assess whether those adaptations result in better participant feedback.
Our long-term goal is to collaboratively investigate and develop metrics which could allow us to measure and monitor the long-term effects of our training. Feedback forms and satisfaction surveys completed immediately after training are useful and help to assess the overall quality of sessions delivered. However, the ultimate goal of any RDM training should be the improvement of researchers’ day-to-day RDM practice. Is our training really having any effect on this? Assessing that requires different kinds of metrics, coupled with long-term follow-up with participants. Any ideas on how best to address this will also be gathered on the OSF, where we have created a dedicated space for the work in progress.
When reflecting on the work we did together, we all agreed that we were quite efficient. We started in June 2017, and it took us two joint catch-up calls and a couple of email exchanges to develop and agree on joint metrics for the assessment of RDM training. Time will tell whether the resources we have created will help us meet our goals, but we all felt that we had already learned a lot from each other by sharing good practice and experience. Collaboration turned out to be an excellent solution for us. Our discussions are open to everyone, so if you are reading this blog post and would like to collaborate with us (or just follow our conversations), simply sign up to the mailing list.
We’ve now assessed all applications for our sponsored OpenCon 2017 place and are pleased to announce that the successful applicant is Rachael Ainsworth. Rachael is a Research Associate in the School of Physics and Astronomy and the Open Science Champion for the Interferometry Centre of Excellence at the Jodrell Bank Centre for Astrophysics. In this role she promotes and advocates for open science in astronomy and organises related events, but she is also behind the creation of the Manchester branch of XX+Data – a networking community for women who work with or love data – and has been selected as a Mozilla Open Leader, receiving mentorship and training through the Mozilla Network on a project designed to advance open research.
Her project, ‘Resources for Open Science in Astronomy’ (ROSA), aims to collaboratively compile and tailor open science best practices from around the web into a kit enabling astronomers to work openly from proposal to publication, equipping senior researchers with a single resource from which they can mentor the next generation of open science practitioners. The project will also produce a general open science resource toolkit to encourage adaptation and reuse in any field of research, which will benefit all departments of the University.
Rachael was keen to attend OpenCon because she believes that open and reproducible research is fundamental to the scientific method and that attendance will aid her development: “OpenCon will make me a more confident advocate and allow me to disseminate these tools more effectively within my department and throughout the University in order to empower other researchers with the skills to work openly.”
We look forward to hearing more from Rachael as part of our Open Access Week activities (on which, more soon!) and when she shares her OpenCon experience in a blog post later in the year. We also look forward to engaging further with the applicants who weren’t successful on this occasion, facilitating more opportunities to bring advocates of open research together. We were excited by the energy and passion we sensed in all our applicants, and we expect them all to make progress in the quest for open!
We’re excited to be sponsoring a University of Manchester PhD student or early career researcher with a passion for Open Research to attend OpenCon 2017 in Berlin, from 11th-13th November.
OpenCon is organised by SPARC, the Right to Research Coalition and a global conference committee. The event brings together early career researchers and scholars from around the world in a positive and supportive environment (see Code of Conduct) to showcase projects, discuss issues and explore ways to advance Open Access, Open Data and Open Education.
Attendees learn more about Open Research issues, develop critical skills, contribute to collaborative projects and meet members of a growing global community advocating for a more open system of sharing the world’s information.
The travel scholarship covers the cost of the registration fee, flight and shared accommodation. The University Library will reimburse the cost of sundries not covered by the scholarship. In return we’ll ask the successful applicant to contribute to one of the Library’s upcoming Open Research events and write up their conference experience in a short report for our blog.
To apply, please submit answers to the following questions by email, using the Subject header ‘OpenCon Application’, to email@example.com. The deadline for submissions is 5pm on Monday 25th September 2017.
Why are you interested in OpenCon?
What are your ideas for advancing Open Research?
How will attending OpenCon help you advance Open Research at the University of Manchester?
We’ll review applications and contact all candidates by the end of September.
The beginning of April marked the end of the fourth year of RCUK’s Open Access (OA) policy. We submitted our finance and compliance report in May and have made our 2016-17 APC data available via the University’s institutional repository, Pure.
The headlines for us from this period are:
We have estimated 75% compliance for 2016-17 (54% Gold OA and 21% Green OA).
This is a significant increase in Green OA. In part this is due to the launch of HEFCE’s OA policy but it is also a consequence of the constraints of the block grant, ie, we have been unable to meet demand for Gold OA during the reporting period.
Despite the increase in Green OA, expenditure on Gold OA has not decreased. This is partly due to publishers that do not provide a compliant Green OA option but increased APC unit level costs are also a factor.
We have reported an 18% increase in the average APC cost in 2016/17 (£1869) against the 2015/16 average (£1578). To some extent this increase can be accounted for by foreign exchange rate differences.
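The percentage increase quoted here follows directly from the two averages:

```python
# Recomputing the APC increase from the figures reported above:
# 2015/16 average £1578, 2016/17 average £1869.
prev_avg, curr_avg = 1578, 1869
increase = (curr_avg - prev_avg) / prev_avg * 100
print(f"{increase:.1f}% increase")  # ~18%, as reported
```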
Although we operate a ‘first come, first served’ model for allocating the block grant, it was necessary to impose restrictions for 3 months of this period. We limited expenditure to Pure Gold OA journals, non-OA publication fees and hybrid journals that do not provide a compliant Green OA option.
The level of Gold OA achieved has only been possible due to continued investment from the University (£0.2m) and credits/discounts received from publishers relating to subscription packages and offsetting deals (£0.1m).
We arranged Gold OA with 60 different publishers. Of these, we managed offsetting schemes and memberships with 11 and arranged Gold OA for only one paper with 20.
We continued to assess publisher deals to obtain best value from the block grant but are committed to engaging only with publishers that offer a reasonable discount and overall fair OA offer.
As in previous years, most APCs were paid to Elsevier (139), almost double the number paid to the next publisher, Wiley (75).
As in previous years, our highest cost APC (£4679) was paid to Elsevier. The lowest cost APC (£196) was paid to the Electrochemical Society.
We reported expenditure of £72,297 on ‘other costs’. This amount includes colour and page charges as well as publication fees associated with Green OA papers.
Despite reminders to authors that papers must be published as CC-BY, 8 papers were published under non-compliant licences and we were unable to identify licences for a further 16 papers. We contact publishers to correct licences when we are aware of a non-compliant licence.
We continued to see engagement with Gold OA from Humanities researchers who produce outputs other than journal articles. We have supported Gold OA for one monograph and one book chapter during the reporting period, at a cost of £11,340 from the block grant. A further monograph has been paid for from an institutional OA fund.
Despite a concerted effort on our part we continued to see inconsistency in the inclusion of grant acknowledgements on papers. We act in good faith when approving payment from the block grant, but believe a joined-up approach from RCUK, institutions and publishers is needed to ensure all researchers are aware of this requirement and fulfil it consistently.
This month I delivered the second Digital Humanities Library Lab, a hands-on showcase of digital library collections and tools created to support innovative research using computational methods. This three-hour session followed on from a previous event I ran in March and concludes a short run of events forming part of DH@Manchester.
The aim of the workshop was to inspire researchers at all levels to gain practical experience with tools and techniques in order to go on to develop individual research projects with these or similar collections. Participants did not need any technical experience to join in, other than basic office and web browsing skills. The workshop plan and instructions are available online.
What projects and collections did we look at?
The three activities focused on image searching, analysing text and analysing colour. We looked at projects including the following.
JSTOR Text Analyzer from JSTOR Labs, a beta tool which will identify what any document you give it is about and recommend articles and chapters from JSTOR about the same topics.
Robots Reading Vogue from Yale University Library’s Digital Humanities Lab, a collection of tools to interrogate the text of the entire U.S. Vogue Archive (ProQuest) and its front covers, such as a topic modeller, an N-gram viewer and various colour analysis methods.
While developing this workshop, I created a project of my own to visualise the average colour used in the front covers of all full-colour issues from Illustrated London News (Gale Cengage). Just a few short Python scripts were required to extract this information from the collection and display it in an interactive web page. This allowed us to look for trends with particular hues, such as the more common use of reds on December issues.
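The averaging step at the heart of that project is straightforward. The sketch below is not the actual workshop code (those scripts also handled extracting pixel data from the collection); it only illustrates reducing a cover’s pixels to a single mean colour, formatted as a hex string for display in a web page.

```python
def average_colour(pixels):
    """Compute the mean colour of a list of (R, G, B) tuples
    and return it as a CSS-style hex string."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) // n
    g = sum(p[1] for p in pixels) // n
    b = sum(p[2] for p in pixels) // n
    return f"#{r:02x}{g:02x}{b:02x}"

# e.g. a cover dominated by reds (illustrative values):
print(average_colour([(200, 40, 40), (220, 60, 20), (180, 20, 60)]))
# → "#c82828"
```

Applied to every full-colour cover in a run, output like this can be plotted on a timeline to reveal seasonal trends such as the December reds mentioned above.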
What did we learn?
After each activity we discussed some of the issues raised. (Incidentally, I captured key points on a Smart Kapp digital flipchart or smart whiteboard, continuing the “Digital First” principles that Library colleagues are adopting.)
Image analysis and computer vision have many potential applications for library collections, such as identifying where printed or handwritten text occurs in an image, facial recognition, and detecting patterns or differences between editions or issues within a series.
For image analysis systems to work best, the image sets and algorithms will need to be carefully curated and trained. This is a time-consuming process.
The text analyser worked quite well but, as with the image search, was not perfect. It is important to find out precisely what “goes wrong” and why.
Other applications for the text analysis tool include checking a grant application for gaps in the topics you think should be covered, checking the development of a thesis, or enabling lecturers to check their students’ use of references in submitted papers.
Being able to visualise an entire collection in one display (and then dive into the content) can give one an idea of what is there before selecting which physical item to go to the trouble of visiting and retrieving. Whitelaw (2015) suggests that such “generous interfaces” can open up the reader to a broader, less prescriptive view into a collection than the traditional web search.
It could be more useful to be able to compare different collections or publications against each other. This can be difficult when multiple licence holders or publishers are involved, with different technical or legal restrictions to address.
Programming or other technical skills would need to be learned in order to develop or apply many tools. Alternatively, technical specialists would need to work in partnership with researchers, perhaps utilising the University’s Research IT service or the Library’s Digital Technologies & Services division.
Digital or computational tools and techniques are increasingly being applied to arts, humanities and social science methods. Many of the collections at The University of Manchester Library have potential for stimulating interdisciplinary research. Such Digital Scholarship projects would often require a greater level of technical knowledge or skill than many research groups might currently possess, so further training or provision for technical support might be necessary.
A year since HEFCE linked the mandatory deposit of accepted manuscripts to the allocation of QR funding, this post describes how the Library’s Scholarly Communications Team has helped academic colleagues adapt to the new ruling, resulting in an unprecedented level of Green OA deposits to our repository.
As we entered 2016, developing support services for the incoming policy was the team’s highest priority. The biggest obstacle was a very low Green OA baseline: from 2013 to early 2016, the number of Manchester-authored papers available through pure or hybrid Gold OA journals rose steadily; yet annual Green OA deposits to our repository stalled at ten percent of the institution’s journal article output over the same period.
During a pilot the previous year, a variety of support models were tested with selected schools, and we found that the highest compliance was achieved when the responsibility placed on authors was minimal and the Library offered to deposit and set embargoes on their behalf. So in February we proposed developing a fully mediated service, which was accepted by the University’s senior research governance committee. The task then was to reorient our existing Gold APC service to encompass Green OA workflows that would scale, enabling our team of six staff to support the deposit of ~6000 papers a year.
To allow authors to send us their manuscripts we created an authenticated deposit form (branded as the Open Access Gateway) with just two input fields: acceptance date and journal/proceedings title, plus a drag and drop area to attach the file. Authors could also use the form to request Gold OA payment if they acknowledged a grant from a qualifying funder.
In the months before the policy launched we worked closely with Library marketing colleagues to deliver a communications campaign which raised awareness of the policy and built momentum behind the new service. Our message to academics was, ‘Just had a paper accepted? Now make sure it’s REF eligible.’
In April the policy launched, our deposit form went live, and the firehose of the University’s publications opened. Early signs were promising; in the first weeks we received roughly ten manuscripts per working day, a significant increase against our baseline. However, more persuasion was needed for those authors across campus who weren’t sending us their manuscripts. We therefore began to chase those authors via email, sending a follow-up email, copying in a senior administrator, if we had no response.
We particularly focussed on non-compliant papers with the highest altmetric scores which had an increased likelihood of being selected for REF. The chaser emails were effective and many authors replied immediately with the manuscript attached. Of course, our emails also prompted some authors to ask questions or offer opinions about the policy which required additional resourcing.
Sending chaser emails significantly raised the institution’s compliance rate, and it was clear that we would need to continue doing this systematically as awareness of the policy gradually spread across campus. This placed additional strain on the team, as searching Scopus for papers and chasing authors proved an intensive process. We explored alternatives (e.g. using the ORCID API to identify our new papers) but no automated process was as effective as a painstaking manual check of Scopus search results.
By August we’d refined our reporting to the University to include school/division level compliance against projections. To achieve this we recorded the compliance status and author affiliations of every single University paper falling within the scope of the policy in a masterfile spreadsheet. We then used SciVal to calculate the average output of the REF-eligible staff from each of the 33 schools/divisions over the past five years. This enabled us to project how many accepted papers we would expect each school/division to have per month during the first twelve months of the policy. Every month we produce a compliance report, like the one below, which supports standing agenda items at research governance committee meetings across the University.
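The projection arithmetic can be sketched as follows; the school names and five-year output figures below are invented for illustration, not taken from the SciVal data.

```python
# Illustrative projection step: given a school's total journal output by
# REF-eligible staff over the past five years, estimate the number of
# accepted papers expected per month during the policy's first year.
five_year_output = {"School A": 3000, "School B": 600}  # invented figures

def monthly_projection(total_over_five_years):
    # Average annual output, spread evenly over twelve months.
    return total_over_five_years / 5 / 12

for school, total in five_year_output.items():
    print(f"{school}: {monthly_projection(total):.1f} papers/month expected")
```

Comparing each school’s recorded compliant papers against these expectations gives the school/division-level compliance-against-projections figure reported to committees.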
As we moved into the new academic year, monthly deposits continued to rise. The team were at maximum capacity processing incoming manuscripts, so to speed up the flow of papers through our assembly line we purchased a licence for the online forms service Typeform and developed a back-office ‘If-This-Then-That’ style form. This allowed us to distil esoteric funder and publisher information into a simple workflow, enabling papers to be processed with reduced input from higher-grade staff.
Now, twelve months on from the introduction of probably the most swingeing OA policy yet, we can take stock of how well we have adapted to the new ruling. In the absence of a Snowball-type metric for measuring compliance, we currently track it in three ways:
% top papers compliant: In January, the University ran its annual Research Review Exercise, in which research staff proposed their best post-2014 papers for internal review. This provided the first opportunity to gauge the compliance of those papers with the highest chance of being returned in the next REF. During the exercise, a total of 1360 papers within the scope of the policy were proposed for review; of these, a very encouraging 95% were compliant with the new OA requirements.
% compliant against projections: Our chosen metric for reporting compliance to the University. We report the proportion of compliant papers with at least one REF-eligible author against the total number of papers we would have expected to have been accepted by all our REF eligible staff. Against this measure, 68% of the papers are compliant, 7% are non-compliant, and 25% are not currently recorded. Many of these unrecorded papers will not yet be published so we will chase them once they are indexed in Scopus. A large number of our papers arising from the ATLAS/LHCb collaborations are also available via Gold OA and are compliant but we have not yet recorded them in our masterfile.
% compliant overall: To date, we’ve recorded 4656 papers of which 4031 (87%) are compliant, 459 (10%) are not compliant, and 166 (3%) are being actively chased by our team.
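The overall figures can be recomputed directly from the counts:

```python
# Recomputing the overall compliance percentages from the counts above:
# 4656 recorded papers split into compliant, non-compliant, and chased.
recorded = {"compliant": 4031, "non-compliant": 459, "being chased": 166}
total = sum(recorded.values())
for status, n in recorded.items():
    print(f"{status}: {100 * n / total:.1f}%")
# compliant (~86.6%) and non-compliant (~9.9%) round to the reported
# 87% and 10%; 166/4656 is ~3.6%, reported above as 3%.
```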
In total there is a 55% Green OA / 45% Gold OA split. Given that Green OA represents more inconvenience than most of our academic colleagues unfamiliar with arXiv have ever been willing to tolerate, it is very unlikely that the University would have achieved such high compliance had the Library not provided a mediated Green OA deposit service. The data confirms that our approach helped make Green Open Access an organisational habit practically overnight.
The approach has come at a cost, however; over the past year, supporting the HEFCE OA policy has taken up the majority of the team’s bandwidth, with most of our 9am-5pm conversations being in some way related to a paper’s compliance with one or more funder OA policies.
Now that our current processes have bedded in, and in anticipation of the launch of the new UK Scholarly Communications License (UK-SCL) – for more on this read Chris Banks’s article or watch her UKSG presentation – and further developments from Jisc, we hope that over the next 12 months we can tilt the balance away from this reductionist approach to our scholarly output and focus on other elements of the scholarly communication ecosystem. For example, we are already in discussions with Altmetric about incorporating their tools into our OA workflows to help our academics build connections with audiences and are keen to roll this out soon – from early conversations with academics we think this is something they’re really going to like.
Whatever lies in store, it’s sure to be another busy year for the team.
A new pilot workshop, the first Digital Humanities Library Lab, ran on 3 March 2017. This engaging and informative cross-discipline event offered a dozen researchers the chance to explore and discuss new tools and digital text collections from The University of Manchester Library, inspiring the development of future Digital Humanities computational research methods.
For the past couple of years we’ve been giving some thought to the role of university libraries in publishing, in common with other libraries. However, the University of Manchester is home to Manchester University Press (MUP), one of the largest university presses in the UK, so we’ve had to think carefully about how to work collaboratively to make best use of our respective expertise and resources in order to meet the University’s strategic objectives. Our initial thinking and work started in 2014 as part of the Library’s strategic programme, with follow-on projects funded by the University’s Centre for Higher Education Research, Innovation and Learning (CHERIL).
When we started our thinking, we expected that the outcome would likely be some kind of publishing support service, using Open Journal Systems (OJS) for hosting. To develop a tangible offer, we had many discussions about which parts of the support service would naturally sit with the Press and which in the Library, and even more about funding and sustainability. To date, our collaboration has resulted in:
development of Manchester Open Library as an imprint of MUP,
development of a student journal for the Manchester Medical School, and
development of 3 online learning resources on ‘publishing’,
but not in the publishing support service we originally envisaged. Instead, we most recently considered offering a model we believed would be sustainable with a low level of support: a multi-disciplinary undergraduate journal managed by a postgraduate editorial team. However, when we ran this idea past senior staff from our Humanities faculty with responsibility for postgraduate researchers (PGRs), there was little appetite for supporting any type of student journal, and since the Library and the Press aim to support the University in achieving its strategic goals, we have parked this idea for now. That said, we still see value in students experiencing publishing, either as authors or as part of an editorial team, which is why we decided to harness the expertise of our Press in developing online learning modules which anyone on campus with an interest in publishing can access and learn from.
From what we hear about other institutions, our experience seems at odds with current trends in support for student publishing: there appear to be many examples of libraries, academics and university presses launching student journals. We’ve been keen to understand whether the challenges that have limited our service development are unique to Manchester, and to learn more about how other institutions support student journals. So, as part of our latest CHERIL-funded project (Publishing Research and Learning for Students – PuRLS), we recently held a one-day conference on student publishing. We wanted to bring together institutions with experience of, or an interest in, student publishing so that we could all learn from each other. The event, held on 16th January 2017, brought together a mixture of librarians, publishers, academic staff, administrative staff and students.
Libraries supporting student journals
Our contributors from the universities of Surrey, Warwick and Edinburgh, and Leeds Beckett University described their involvement with student journals. In all cases journals are run on OJS. At Edinburgh and Warwick, the libraries offer journal hosting services which publish both student and academic-level journals.
Although Edinburgh has a university press, the Library developed the hosting service independently. Angela Laurins, Library Learning Services Manager, explained that the service developed organically and is now well established, providing only set-up support for new journals; thereafter, journals are managed by their own editorial teams. Angela confirmed that this model works well, with minimal resource requirement. In fact, it works so well that she no longer requires a named academic champion for established journals if the previous champion moves on.
Warwick’s service is a more recent development, building on two journals already established within academic departments, interest from other areas in creating new journals, and the skills and resource available within the Library to develop and manage journals using OJS’s externally hosted option. Yvonne Budden, Head of Scholarly Communications, talked about two multi-disciplinary journals, Reinvention and Exchanges.
Reinvention, an international journal, is student-led and student-run, with academic support. The main resource requirement lies in maintaining high quality: academic staff carry out peer review and help students improve the standard of their work. Reinvention has received over 460 submissions and published approximately 130 articles. Submissions are split fairly evenly between disciplines and come from a number of different countries. Yvonne explained that part of the value the library brings to publishing lies in “things libraries are known for being good at”, e.g., advising on open access, ISSNs, copyright, DOAJ registration, digital preservation and analytics.[presentation-slides-warwick-jan-2017]
Charlotte Barton, an Information Literacy Librarian, talked about her role in supporting the Surrey Undergraduate Research Journal (SURJ). The interdisciplinary journal is published by the Learning Development Team, which comprises librarians and learning advisors, and accepts work marked as 2.1 or higher, as well as reflective accounts, conference reviews and literature reviews. The editorial team is made up of academic staff and PGRs – PGRs stay in this role for a maximum of one year (two journal issues) and carry out peer review as well as other editorial tasks.
Charlotte explained that supporting prospective authors is time-intensive (one-to-one support is provided by the SURJ team), but as submission rates are currently low (10 per issue) further work is needed to promote the journal to academic colleagues. Future plans also include working with academic staff to develop training materials, e.g., to improve writing skills. [presentation-slides-surrey-jan-2017]
Kirsty Bower, an Academic Librarian at Leeds Beckett University, described how interest in setting up journals at her institution resulted from Open Access (OA) requirements for the next Research Excellence Framework (REF) and the likely requirements of the Teaching Excellence Framework (TEF). An existing Sociology undergraduate journal, Critical Reflections, was moved onto OJS in 2016 following a discussion with the lead academic, who was keen to increase visibility after producing a number of print issues. The journal publishes pieces produced in a third-year module, in which students apply their sociological knowledge to real-life situations, and students are involved in the editorial process. Kirsty reported that despite limited promotion, downloads have surpassed expectations, although she acknowledged that it isn’t clear who the readers are. Although the Leeds Beckett team face similar challenges to other institutions (e.g., limited staffing resource, limited funding for promotion), they are considering developing a multi-disciplinary journal. [presentation-slides-leedsbeckett-jan-2017]
Publishing Manager Lara Speicher explained that at UCL Press student journals are hosted on OJS but run themselves, provided they have support from their academic department. Proposals for new journals are not considered without confirmed faculty support – this commitment is vital as UCL Press is too small to provide high levels of support to students. Lara highlighted that it can be difficult to explain the difference between a hosting service and a publishing service, noting that one journal had expected more ‘hand holding’ from the Press. Providing support for students ties in with UCL’s Connected Curriculum, which brings research into learning. UCL Press have recently appointed a new journals manager who has plans for further support, e.g., creating a forum for journal teams to meet and share experiences, and delivering workshops on the publishing process. [presentation-slides-uclpress-jan-2017]
Tom Grady, Acting Press Manager at White Rose University Press (WRUP), told us that the Press launched in 2016 with the aim of publishing academic journals and books, so when the first journal proposal it received was for a student journal there were some concerns: whether publishing a student journal would undermine the Press’s aspiration to become a reputable academic publisher, how sustainable a student journal would be, and who would read it. Having overcome these concerns, the Press has recently launched the Undergraduate Journal of Politics and International Relations, which has an academic lead and funding sources, fills a gap in the market, and gives students the opportunity to be published authors or to be part of the editorial team. [presentation-slides-wrup-jan-2017]
The Manchester perspective, Part 2
We invited a number of speakers connected with the University of Manchester to contribute to the event, to increase awareness of potential challenges or opportunities for institutions considering dissemination of student research as a means to enhance the student experience.
The key driver when we were considering supporting student journals came from the Manchester Medical School, and particularly from a group of students including Josh Burke. Josh explained that one reason for wanting to set up a journal was that medical students earn points for publishing in PubMed-indexed journals, which count towards applications for their first post. The group believed they could set up a journal themselves but sought support from academic staff, who put them in touch with us. We provided access to OJS and publishing expertise from MUP; the students developed a staged peer-review system and brought a lot of energy to the initiative, which resulted in the launch of the Manchester Medical Journal (MMJ) in late 2016. MMJ is student-led and student-run. Josh admitted that using OJS was a pain point, as the peer-review system they developed doesn’t fit easily within OJS workflows, and that the student group had been naïve about the complexity of setting up and running a journal, needing academic support, publishing guidance and financial support. With the backing of the Medical School and the continued investment of the students who initially set it up, MMJ seems likely to have a future. However, the main challenge is convincing students to publish in a new journal that isn’t indexed in PubMed. [presentation-slides-burke-jan-2017]
A similar view is shared by senior academic and administrative staff at Manchester, particularly in relation to PGRs. We asked Professor Maja Zehfuss, Associate Dean for PGR in the Faculty of Humanities, to outline this position at the event. Her key points were that institutional journals are not considered right for PGR publications at Manchester; that PGRs should be seeking to publish papers of at least 3* ranking in ‘grown-up’ journals; and that submitting papers to established journals provides a tough learning experience for PGRs which develops resilience and skills. She also queried what student journals are for and who reads them.
Of course, journals are only one means of scholarly communication, and at Manchester academic staff are incorporating different forms within their modules. Dr John Zavos, a course leader from Religions and Theology, explained that he was keen on openness in research and wanted to develop resources that would put his students’ work in the public domain, eg, ‘Poppy Hijab’, an exhibit on the Museum of the South Asian Diaspora blog. John is now leading a CHERIL-funded project exploring impactful public-facing platforms and hopes to incorporate editorial management of a blog into his Level Two course to provide further opportunities for publishing experience.
To conclude the event, Simon Bains, our Deputy Librarian and Head of Research Support, and Meredith Carroll, Journals Manager at MUP, described our experience, which is summarised in the first part of this piece. [presentation-slides-manchester-jan-2017] For now, our support for student publishing takes the form of a recently launched blog, The Publishing Exchange, to encourage reflection and learning, and My Research Essentials online resources, all available under the CC-BY-NC licence.
This week is Peer Review Week – what better time to announce the launch of a peer review elearning resource we’ve recently developed at Manchester?
At the University of Manchester Library, we work closely with our colleagues at Manchester University Press in support of a number of the University’s strategic goals. One benefit of our collaboration is that we can provide scholarly communication development opportunities for researchers and students.
Currently we are working together on a project funded by the University’s Centre for Higher Education Research, Innovation and Learning (CHERIL). The Publishing and Research Learning for Students (PuRLS) project aims to provide opportunities and resources to help students and early career researchers develop an awareness of the publishing process and the skills to participate as an author, editor and peer reviewer.
We believe that the resources will support students and postgraduate researchers who want to set up and manage their own journal or simply learn about academic publishing, and also enhance their employability within academia or the publishing sector. Feedback from medical students involved in our previous CHERIL project (SOAR – Student Open Access Research) has informed the focus of the resources we’re creating, and we’re finalising further usability testing at the moment.
The online modules have been created by drawing on expertise from the Library, the Press and the wider University community. Meredith Carroll, Journals Manager at Manchester University Press, prepared text content which the Library’s elearning team has turned into interactive resources, using Articulate Storyline 2 software.
The peer review module takes approximately 30 minutes to complete and includes activities which allow users to take on the role of a reviewer, eg, responding to scenarios and critiquing real peer reviews. Naturally our peer review resource has been peer reviewed – for this we asked a number of our academic colleagues for their expert input.
The peer review online module is available via the Library’s My Research Essentials webpage and is licensed as CC-BY.