Tag Archives: Training

How effective is your RDM training?

We are involved in an international collaborative project to assess the quality of Research Data Management (RDM) training across institutions. This post reports on the progress of the project so far; it originally appeared on the project blog on 6th October 2017.

When developing new training programmes, one often asks about the quality of the training. Is it good? How good is it? Trainers often develop feedback questionnaires and ask participants to evaluate their training. However, feedback gathered from participants attending courses does not answer the question of how good the training was compared with other training on similar topics available elsewhere. As a result, improvement and innovation become difficult. So how can the quality of training be assessed objectively?

In this blog post we describe how, by working collaboratively, we created tools for objective assessment of RDM training quality.

Crowdsourcing

In order to assess something objectively, objective measures need to exist. Being unaware of any objective measures for benchmarking a training programme, we asked Jisc’s Research Data Management mailing list for help. It turned out that plenty of resources with useful advice and guidance on creating informative feedback forms were readily available, and we gathered all the information received in a single document. However, none of the answers provided the information we were looking for. On the contrary, several people said they would be interested in such metrics. This meant that objective metrics for assessing the quality of RDM training either did not exist, or the community was not aware of them. Therefore, we decided to create RDM training evaluation metrics ourselves.

Cross-institutional and cross-national collaboration

For metrics to be objective, and to allow benchmarking and comparison of various RDM courses, they need to be developed collaboratively by a community willing to use them. Therefore, the next question we asked Jisc’s Research Data Management mailing list was whether people would be willing to work together to develop and agree on a joint set of RDM training assessment metrics, and a system which would allow cross-comparison and training improvement. Thankfully, the RDM community tends to be very collaborative, and this time was no exception – more than 40 people were willing to take part in the exercise, and a dedicated mailing list was created to facilitate collaborative working.

Agreeing on the objectives

To ensure effective working, we first needed to agree on common goals and objectives. We agreed that the purpose of creating a minimal set of benchmarking questions was to identify what works best in RDM training. We worked on the assumption that this was for ‘basic’ face-to-face RDM training for researchers or support staff, but that it could be extended to other types and formats of training session. We reasoned that the same set of questions used in feedback forms across institutions, combined with sharing of training materials and contextual information about sessions, should facilitate the exchange of good practice and ideas. As an end result, this should allow constant improvement and innovation in RDM training. We therefore had joint objectives, but how could we achieve them in practice?

Methodology

Deciding on common questions to be asked in RDM training feedback forms

In order to establish joint metrics, we first had to decide on a joint set of questions that we would all agree to use in our participant feedback forms. To do this we organised a joint catch-up call, during which we discussed the various questions we were asking in our feedback forms, why we thought they were important and whether they should be mandatory in the agreed metrics. There were lots of good ideas and valuable suggestions. However, by the end of the call, and after eliminating all the non-mandatory questions, we still had a list of thirteen questions which we thought were all important. This was too many to ask participants to fill in, especially as many institutions would need to add their own institution-specific feedback questions.

In order to bring down the number of questions to be made mandatory in feedback forms, a short survey was created and sent to all collaborators, asking respondents to judge how important each question was on a scale of 1-5 (1 being ‘not important at all that this question is mandatory’ and 5 being ‘this should definitely be mandatory’). Twenty people participated in the survey. The total score received from all respondents for each question was calculated, and the six questions with the highest totals were selected to be made mandatory.
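For illustration, the scoring described above amounts to summing the ratings each candidate question received and keeping the six questions with the highest totals. The sketch below is a hypothetical reconstruction in Python, not the project’s actual analysis; the question labels and ratings shown are made up.

```python
from collections import defaultdict

# Hypothetical responses: one dict per respondent, mapping each of the
# thirteen candidate questions to a 1-5 importance rating.
responses = [
    {"Q1": 5, "Q2": 3, "Q3": 4},  # respondent 1 (truncated example)
    {"Q1": 4, "Q2": 5, "Q3": 2},  # respondent 2
    # ... one entry per respondent, covering all candidate questions
]

# Sum the ratings for each question across all respondents.
totals = defaultdict(int)
for respondent in responses:
    for question, rating in respondent.items():
        totals[question] += rating

# Keep the six questions with the highest total scores as the mandatory set.
mandatory = sorted(totals, key=totals.get, reverse=True)[:6]
print(mandatory)
```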

Ways of sharing responses and training materials

We next had to decide on the way in which we would share feedback responses from our courses, as well as the training materials themselves. We unanimously decided that the Open Science Framework (OSF) supports the goals of openness, transparency and sharing, and allows collaborative working, and was therefore a good home for the project. We created a dedicated project space on the OSF, with separate components for the joint resources developed, for sharing training materials and for sharing anonymised feedback responses.

Next steps

With the benchmarking questions agreed and the space created for sharing anonymised feedback and training materials, we were ready to start collecting the first feedback for the collective training assessment. We also thought that this was a good opportunity to reiterate our short-, mid- and long-term goals.

Short-term goals

Our short-term goal is to revise our existing training materials and incorporate the agreed feedback questions into RDM training courses starting in autumn 2017. This would allow us to obtain the first comparative metrics at the beginning of 2018, and to evaluate whether the methodology and tools we designed are working and fit for purpose. It would also allow us to iterate over our materials and methods as needed.

Mid-term goals

Our mid-term goal is to see whether the metrics, combined with shared training materials, could allow us to identify the parts of RDM training that work best and to collectively improve the quality of our training as a whole. This should be possible in mid- to late 2018, allowing time to adapt training materials as a result of the comparative feedback gathered at the beginning of 2018 and to assess whether those adaptations result in better participant feedback.

Long-term goals

Our long-term goal is to collaboratively investigate and develop metrics which could allow us to measure and monitor the long-term effects of our training. Feedback forms and satisfaction surveys completed immediately after training are useful and help to assess the overall quality of the sessions delivered. However, the ultimate goal of any RDM training should be the improvement of researchers’ day-to-day RDM practice. Is our training really having any effect on this? In order to assess this, different kinds of metrics are needed, coupled with long-term follow-up with participants. We decided that any ideas on how best to address this will also be gathered on the OSF, and we have created a dedicated space for this work in progress.

Reflections

When reflecting on the work we did together, we all agreed that we were quite efficient. We started in June 2017, and it took us two joint catch-up calls and a couple of email exchanges to develop and agree on joint metrics for the assessment of RDM training. Time will show whether the resources we create help us meet our goals, but we all felt that during the process we had already learned a lot from each other by sharing good practice and experience. Collaboration turned out to be an excellent solution for us. Our discussions are open to everyone, so if you are reading this blog post and would like to collaborate with us (or simply follow our conversations), sign up to the mailing list.

Resources

Mailing list for RDM Training Benchmarking: http://bit.ly/2uVJJ7N

Project space on the Open Science Framework: https://osf.io/nzer8/

Mandatory and optional questions: https://osf.io/pgnse/

Space for sharing training materials: https://osf.io/tu9qe/

Anonymised feedback: https://osf.io/cwkp7/

Space for developing ideas on measuring long-term effects of training: https://osf.io/zc623/

Authors (in alphabetical order by surname):

Cadwallader Lauren, Higman Rosie, Lawler Heather, Neish Peter, Peters Wayne, Schwamm Hardy, Teperek Marta, Verbakel Ellen, Williamson Laurian, Busse-Wicher Marta

Supporting student publishing: perspectives from the University of Manchester and beyond


The Manchester perspective, Part 1

In common with other libraries, we’ve been giving some thought over the past couple of years to the role of university libraries in publishing. However, the University of Manchester is home to Manchester University Press (MUP), one of the largest university presses in the UK, so we’ve had to think carefully about how to work collaboratively, making the best use of our respective expertise and resources in order to meet the University’s strategic objectives. Our initial thinking and work started in 2014 as part of the Library’s strategic programme, with follow-on projects funded by the University’s Centre for Higher Education Research, Innovation and Learning (CHERIL).

When we started our thinking, we expected that the outcome would likely be some kind of publishing support service, using Open Journal Systems (OJS) for hosting. To develop a tangible offer, we had many discussions about which parts of the support service would naturally sit with the Press and which in the Library, and even more about funding and sustainability. To date, our collaboration has resulted in:

  • development of Manchester Open Library as an imprint of MUP,
  • launch of the James Baldwin Review,
  • development of a student journal for the Manchester Medical School, and
  • development of 3 online learning resources on ‘publishing’,

but not in the publishing support service we originally envisaged. Instead, we most recently considered offering a model we believed would be sustainable with a low level of support: a multi-disciplinary undergraduate journal managed by a postgraduate editorial team. However, when we ran this idea past senior staff from our Humanities faculty and staff with responsibility for postgraduate researchers (PGRs), there was little appetite for supporting any type of student journal, and since the Library and the Press aim to support the University in achieving its strategic goals, we have parked the idea for now. That said, we still see value in students experiencing publishing, either as authors or as part of an editorial team, which is why we decided to harness the expertise of our Press to develop online learning modules that anyone on campus with an interest in publishing can access and learn from.

From what we hear from other institutions, our experience seems to be at odds with current trends in support for student publishing: there appear to be many examples of libraries, academics and university presses launching student journals. We’ve been keen to understand whether the challenges that have limited our service development are unique to Manchester, and to learn more about how other institutions provide support for student journals. So, as part of our latest CHERIL-funded project (Publishing Research and Learning for Students – PuRLS), we recently held a one-day conference on student publishing. We wanted to bring together institutions with experience of, or an interest in, student publishing so that we could all learn from each other. The event, held on 16th January 2017, brought together a mixture of librarians, publishers, academic staff, administrative staff and students.

Libraries supporting student journals

Our contributors from the universities of Surrey, Warwick and Edinburgh, and Leeds Beckett University described their involvement with student journals. In all cases journals are run on OJS. At Edinburgh and Warwick, the libraries offer journal hosting services which publish both student and academic-level journals.

Although Edinburgh has a university press, the Library developed the hosting service independently. Angela Laurins, Library Learning Services Manager, explained that the service developed organically and is now well established, providing only set-up support for new journals; thereafter, journals are managed by their own editorial teams. Angela confirmed that this model works well, with minimal resource requirement. In fact, it works so well that she no longer requires a named academic champion for established journals if the previous champion moves on.

Warwick’s service is a more recent development. It builds on two journals already established within academic departments, interest from other areas in further new journals, and the skills and resource available within the Library to develop and manage journals using OJS’s externally hosted option. Yvonne Budden, Head of Scholarly Communications, talked about two multi-disciplinary journals, Reinvention and Exchanges.

Reinvention, an international journal, is student-led and student-run, with academic support. The main resource requirement is in maintaining high quality: academic staff carry out peer review and help students improve the standard of their work. Reinvention has received over 460 submissions and published approximately 130 articles. Submissions are split fairly evenly between disciplines and come from a number of different countries. Yvonne explained that part of the value the library can bring to publishing lies in “things libraries are known for being good at”, eg, advising on open access, ISSNs, copyright, DOAJ registration, digital preservation and analytics. [Presentation slides: Warwick, January 2017]

Charlotte Barton, an Information Literacy Librarian, talked about her role in supporting the Surrey Undergraduate Research Journal (SURJ). The interdisciplinary journal is published by the Learning Development Team, which comprises librarians and learning advisors, and accepts work marked as 2.1 or higher, as well as reflective accounts, conference reviews and literature reviews. The editorial team is made up of academic staff and PGRs – PGRs stay in this role for a maximum of one year (two journal issues) and carry out peer review as well as other editorial tasks.

Charlotte explained that supporting prospective authors is time-intensive (one-to-one support is provided by the SURJ team), but as submission rates are currently low (10 per issue), further work needs to be done on promoting the journal to academic colleagues. Future plans also include working with academic staff to develop training materials, eg, to improve writing skills. [Presentation slides: Surrey, January 2017]

Kirsty Bower, an Academic Librarian at Leeds Beckett University, described how interest in setting up journals at her institution resulted from Open Access (OA) requirements for the next Research Excellence Framework (REF) and likely requirements of the Teaching Excellence Framework (TEF). An existing undergraduate Sociology journal, Critical Reflections, was moved onto OJS in 2016 following a discussion with the lead academic, who was keen to increase its visibility after producing a number of print issues. The journal publishes pieces produced in a third-year module in which students apply their sociological knowledge to real-life situations, and students are involved in the editorial process. Kirsty reported that, despite limited promotion, downloads have surpassed expectations, although she acknowledged that it isn’t clear who the readers are. Although the Leeds Beckett team face similar challenges to other institutions (eg, limited staffing resource and limited funding for promotion), they are considering developing a multi-disciplinary journal. [Presentation slides: Leeds Beckett, January 2017]

Presses supporting student publishing

Our speakers, from UCL Press and White Rose University Press (WRUP), are at very different stages of developing their services for students.

Publishing Manager Lara Speicher explained that at UCL Press student journals are hosted on OJS but run themselves, as long as they have support from their academic department. Proposals for new journals are not considered without confirmed faculty support – this commitment is vital, as UCL Press is too small to provide a high level of support to students. Lara highlighted that it can be difficult to explain the difference between a hosting service and a publishing service, and noted that one journal had expected more ‘hand-holding’ from the Press. Providing support for students ties in with UCL’s Connected Curriculum, which brings research into learning. UCL Press has recently appointed a new journals manager who has plans for further support, eg, creating a forum for journal teams to meet and share experiences and delivering workshops on the publishing process. [Presentation slides: UCL Press, January 2017]

Tom Grady, Acting Press Manager, told us that WRUP launched in 2016 with the aim of publishing academic journals and books, so when the first journal proposal received was for a student journal there were some concerns: whether publishing a student journal would undermine the Press’s aspiration to become a reputable academic publisher, how sustainable a student journal would be, and who would read it. Having since overcome these concerns, the Press has recently launched the Undergraduate Journal of Politics and International Relations, which has an academic lead and funding sources, fills a gap in the market, and gives students the opportunity to be published authors or to be part of the editorial team. [Presentation slides: WRUP, January 2017]

The Manchester perspective, Part 2

We invited a number of speakers connected with the University of Manchester to contribute to the event, to increase awareness of potential challenges or opportunities for institutions considering dissemination of student research as a means to enhance the student experience.

The key driver when we were considering support for student journals came from the Manchester Medical School, and particularly from a group of students including Josh Burke. Josh explained that one reason for wanting to set up a journal was that medical students gain points, which count towards applications for their first post, for publishing work in journals indexed in PubMed. The group believed they could set up a journal themselves, but sought support from academic staff, who put them in touch with us. We provided access to OJS and publishing expertise from MUP; the students developed a staged peer-review system and brought a lot of energy to the initiative, which resulted in the launch of the Manchester Medical Journal (MMJ) in late 2016. MMJ is student-led and student-run. Josh admitted that using OJS was a pain point, as the peer-review system they developed doesn’t fit easily within the OJS workflows, and that the student group had been naïve about the complexity of setting up and running a journal, needing academic support, publishing guidance and financial support. With the backing of the Medical School and the continued investment of the students who initially set up the journal, MMJ seems likely to have a future. However, the main challenge is convincing students to publish in a new journal that isn’t indexed in PubMed. [Presentation slides: Josh Burke, January 2017]

A similar view is shared by senior academic and administrative staff at Manchester, particularly in relation to PGRs. We asked Professor Maja Zehfuss, Associate Dean for PGR in the Faculty of Humanities, to outline this position at the event. Her key points were that, at Manchester, institutional journals are not considered the right venue for PGR publications; that PGRs should be seeking to publish papers of at least 3* ranking in ‘grown-up’ journals; and that submitting papers to established journals provides a tough learning experience for PGRs which develops resilience and skills. She also queried what student journals are for and who reads them.

Of course, journals are only one means of scholarly communication, and at Manchester academic staff are incorporating different forms within their modules. Dr John Zavos, a course leader from Religions and Theology, explained that he was keen on openness in research and wanted to develop resources that would put his students’ work in the public domain, eg, ‘Poppy Hijab’, an exhibit on the Museum of the South Asian Diaspora blog. John is now leading a CHERIL-funded project exploring impactful public-facing platforms and hopes to incorporate editorial management of a blog into his Level Two course to provide further opportunities for publishing experience.

To conclude the event, Simon Bains, our Deputy Librarian and Head of Research Support, and Meredith Carroll, Journals Manager at MUP, described our experience, which is summarised in the first part of this piece. [Presentation slides: Manchester, January 2017] For now, our support for student publishing takes the form of a recently launched blog, The Publishing Exchange, to encourage reflection and learning, and the My Research Essentials online resources, all available under the CC-BY-NC licence: