Following the introduction of GDPR last May, the Research
Services team has been receiving more and more enquiries about how to handle
sensitive data, so we invited Dr Scott
Summers from the UK Data Service (UKDS)
to visit us and deliver a one-day workshop on ‘Managing and sharing research
data from human participants’. My colleague, Chris Gibson, worked with Scott to
develop and arrange the session. It was a thoroughly engaging and informative
day, with lots of opportunity for discussion.
The workshop attracted a group of 30, who came along to learn
more about best practice for managing personal data. We invited colleagues from
across all faculties and ensured that there was a mix of established and early
career researchers, postgraduate researchers and professional services staff who
support research data management. As well as getting advice to help with data
management, the aim was to gather feedback from attendees to help us to shape
sessions that can be delivered as part of the Library’s My
Research Essentials programme by staff from across the University including
Research Services, Information Governance and Research IT.
As a fairly new addition to the Research Services team, I
was keen to attend this workshop. The management of research data from human
participants is a complex issue so any opportunity to work with the experts in
this field is very valuable. My job involves working with data management plans
for projects which often include personal data so gaining a deeper
understanding of the issues involved will help me to provide more detailed
advice and guidance.
The workshop began by looking at the ethical and legal
context around gathering data. This is something that has been brought sharply
into focus with the introduction of GDPR. We use ‘public
task’ as our lawful basis for processing data but it was interesting to hear
that ‘consent’ may be more prevalent as the preferred grounds in some EU
countries. Using public task as a basis provides our participants with
reassurance that the research is being undertaken in the public interest and
means researchers are not bound by the requirement to refresh consent.
The session on informed consent led to a lively discussion
about how to be clear and specific about what data will be collected and how it
will be used when research may change throughout a project. One solution for longitudinal studies
may be process consent – including multiple points of consent in the study
design to reflect potential changing attitudes of participants. Staged consent
is an option for those wanting to share data while still giving participants choices. The
main point that arose from this session is that we should aim to give
participants as much control over their data as possible without making the
research project so complicated as to be unworkable.
The final session generated debate around whether we can
ever truly anonymise personal data. We worked through exercises in anonymising
data. It quickly became apparent that when dealing with information relating to
people, there are many aspects that could be identifying and in combination
even seemingly generic descriptors can quickly narrow down to a small subset of
participants. For example, ‘Research Officer’ is a term that could apply to a
large group of people but mention this in relation to ‘University of Manchester
Library’ and it quickly reduces to a subset of 3 people! The general consensus
was that referring to data as ‘de-identified’ or ‘de-personalised’ would be
more accurate, but these descriptions may not be as reassuring to
participants, so it is imperative that consent forms are clear and unambiguous
about how data will be used.
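The narrowing effect we saw in the anonymisation exercises can be sketched in a few lines of code. The records and attribute names below are entirely hypothetical, purely to illustrate how combining seemingly generic quasi-identifiers shrinks the pool of matching participants:

```python
# Hypothetical records: each describes a participant using only
# 'generic' attributes, with no names or direct identifiers.
records = [
    {"job": "Research Officer", "employer": "University A", "city": "Leeds"},
    {"job": "Research Officer", "employer": "University B", "city": "Manchester"},
    {"job": "Research Officer", "employer": "University B", "city": "Manchester"},
    {"job": "Lecturer",         "employer": "University B", "city": "Manchester"},
    {"job": "Research Officer", "employer": "University A", "city": "Manchester"},
]

def matches(records, **attrs):
    """Return the records consistent with a set of known attributes."""
    return [r for r in records if all(r[k] == v for k, v in attrs.items())]

# Knowing only a job title leaves a relatively large candidate pool...
print(len(matches(records, job="Research Officer")))  # 4

# ...but each additional 'generic' attribute narrows it sharply.
print(len(matches(records, job="Research Officer",
                  employer="University B", city="Manchester")))  # 2
```

This is the same logic as the ‘Research Officer’ example above: each attribute on its own seems harmless, but their intersection can identify a handful of people.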
At the end of the session it was great to hear lots of
positive feedback from researchers across many disciplines that the workshop
took what could be quite a dry topic and made it engaging with numerous
opportunities for discussion.
Our second workshop with Scott Summers is due to take place on 26th February and we are looking forward to gaining more feedback and insights into how we can enhance the support we deliver to researchers who are managing research data from human participants – so, watch this space!
I was delighted to win a tuition scholarship to attend this year’s Force11 Scholarly Communications Summer School in San Diego, California. The demanding pace of our work in the Library’s Research Services division means it’s tricky to take time out to consider our work in broader contexts. I was therefore grateful for the opportunity to spend a whole week debating pressing issues and potential innovations in the scholarly ecosystem with researchers, fellow librarians and thought leaders, especially in such a beautiful location with the chance of a trip to the beach!
Force11’s Summer School (#FSCI) is more intense than a conference, both in duration and active participation: I signed up for classes, not talks, taking place over 5 days. I chose classes on the nature of collaboration in research; alternative approaches to peer review; and exploring what we mean by public Humanities – topics that I hoped would allow me to both apply and stretch my existing practical experience of supporting research dissemination. On the whole, classes were well-structured with a combination of expert insight, stimulating practical activities, and thought-provoking discussions. These were my highlights from each class:
I loved hearing from Daniel O’Donnell and Maryann Martone on the concept of the Scholarly Commons, developed from considering what our system of scholarship would look like if we started it from scratch nowadays, with access to the internet and public funding. The Commons is ‘an extension of the Open Science concept,’ a ‘conceptual space or spaces onto which we can map principles, best practices, service and standards that govern the production and dissemination of scholarly and research works so that they are maximally useful to all who need or want them.’ Its underlying principles are still being developed, and we were encouraged to contribute our suggestions. We considered how implementation of the Commons principles could disrupt the scholarly ecosystem, with my group envisioning a dynamic system of research communication centred on the connections between research objects, allowing continuous, versioned peer review rather than final, formal publication. Our instructors likened this approach to Cameron Neylon’s aggregation model of scholarly communication.
This provocative class unapologetically generated more questions than answers. We’re increasingly comfortable thinking about Digital Humanities, but what about Open or Public Humanities? Can we have a Humanities which is Open or Public without being Digital? Can the Humanities be Public without being Open? (I think not). What do we even mean when we talk about the Humanities? Faced with instructor Samantha Wallace’s challenging question, ‘Can the University be removed from the Humanities?’, I was forced to confront my proprietorial stance, realising for the first time my assumption that expression or culture only become ‘the Humanities’ when the academy gets involved. I also recognised unpleasantly cynical and paternalistic notes to my thinking about Public Humanities, with assumptions about outreach work or community engagement as impact-demonstrating add-ons to research projects. I was grateful for UCSD Library’s Erin Glass’ insight that it’s unhelpful to refer blandly to ‘the public’ – this is an anonymous signifier for what are in reality distinct, identifiable communities with whom we in academic institutions should seek to build real relationships. Despite the often abstract discussion, I left this class with a practical takeaway. Prompted by Sidonie Smith’s comment on platforms and tools that ‘Just trying to stay abreast of what’s out there becomes a dizzying affair,’ I want to explore the Library’s role in supporting Humanist researchers interested in working more publicly and openly, perhaps through developing expertise with relevant platforms, tools and methods and sharing this with Humanities researchers through personal consultation. I’ll also be considering the class reflections of Micah Vandergrift, one of our instructors, for further thinking and ideas.
Aside from deepening my understanding of scholarly communication, especially problematic aspects of the traditional research publishing ecosystem and emerging challenges to them, the most valuable aspect of the Summer School was the opportunity to meet colleagues from around the world. Delegates represented six continents (no applicants from Antarctica sadly!), and it was amazing to share experiences of managing Open Access funds with a librarian from Canada; discuss our Library’s DMP service with a research student from Chile; hear the plans of one of the first Scholarly Communication librarians to be appointed in Nigeria; and consider new theories of research collaboration developed by sociologists from Russia. Everyone I spoke to was passionate about open scholarship, generous with their insights and unafraid to challenge assumptions with nuanced arguments. The people – instructors, delegates and organisers – made FSCI a stimulating and inspiring event, and I left California with a refreshed sense of purpose and creativity which I hope to channel into enhancing our scholarly communication support at Manchester.
We are involved in an international collaborative project to assess the quality of Research Data Management (RDM) training across institutions. This post reports on the project’s progress so far; it originally appeared on the project blog on 6th October 2017.
When developing new training programmes, one often asks oneself about the quality of the training. Is it good? How good is it? Trainers often develop feedback questionnaires and ask participants to evaluate their training. However, feedback gathered from participants attending courses does not answer the question of how good the training was compared with similar training available elsewhere. As a result, improvement and innovation become difficult. So how can the quality of training be assessed objectively?
In this blog post we describe how, by working collaboratively, we created tools for objective assessment of RDM training quality.
In order to assess something objectively, objective measures need to exist. Being unaware of any objective measures for benchmarking a training programme, we asked Jisc’s Research Data Management mailing list for help. It turned out that many resources with useful advice and guidance on creating informative feedback forms were readily available, and we gathered all the information received in a single document. However, none of the answers provided the information we were looking for. On the contrary, several people said they would be interested in such metrics. This meant that objective metrics to assess the quality of RDM training either did not exist, or the community was not aware of them. Therefore, we decided to create RDM training evaluation metrics.
Cross-institutional and cross-national collaboration
For metrics to be objective, and to allow benchmarking and comparison of various RDM courses, they need to be developed collaboratively by a community willing to use them. Therefore, the next question we asked Jisc’s Research Data Management mailing list was whether people would be willing to work together to develop and agree on a joint set of RDM training assessment metrics and a system that would allow cross-comparisons and training improvements. Thankfully, the RDM community tends to be very collaborative, and this time was no exception – more than 40 people were willing to take part in this exercise, and a dedicated mailing list was created to facilitate collaborative working.
Agreeing on the objectives
To ensure effective working, we first needed to agree on common goals and objectives. We agreed that the purpose of creating a minimal set of questions for benchmarking was to identify what works best in RDM training. We worked with the idea that this was for ‘basic’ face-to-face RDM training for researchers or support staff, but that it could be extended to other types and formats of training session. We reasoned that the same set of questions used in feedback forms across institutions, combined with sharing of training materials and contextual information about sessions, should facilitate the exchange of good practice and ideas. Ultimately, this should allow constant improvement and innovation in RDM training. We therefore had joint objectives, but how could we achieve them in practice?
Deciding on common questions to be asked in RDM training feedback forms
In order to establish joint metrics, we first had to decide on a joint set of questions that we would all agree to use in our participant feedback forms. To do this we organised a joint catch-up call during which we discussed the various questions we were asking in our feedback forms and why we thought they were important and should be mandatory in the agreed metrics. There were lots of good ideas and valuable suggestions. However, by the end of the call, even after eliminating all the non-mandatory questions, we still had a list of thirteen questions which we thought were all important. This was too many to ask participants to fill in, especially as many institutions would need to add their own institution-specific feedback questions.
In order to bring down the number of questions to be made mandatory in feedback forms, a short survey was created and sent to all collaborators, asking respondents to judge how important each question was (on a scale of 1-5, 1 being ‘not important at all that this question is mandatory’ and 5 being ‘this should definitely be mandatory’). Twenty people participated in the survey. The total score received from all respondents for each question was calculated, and the six questions with the highest scores were selected to be made mandatory.
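The selection step described above amounts to a simple total-and-rank calculation. The scores below are hypothetical (the real survey responses are in our shared project space), but this sketch shows the method of totalling the 1-5 ratings per question and keeping the six highest-scoring questions as the mandatory set:

```python
# Hypothetical ratings: question -> list of 1-5 importance scores
# from respondents (the real survey had 20 respondents).
ratings = {
    "Q1": [5, 4, 5], "Q2": [3, 2, 4], "Q3": [5, 5, 5],
    "Q4": [2, 3, 2], "Q5": [4, 4, 3], "Q6": [1, 2, 2],
    "Q7": [5, 4, 4], "Q8": [3, 3, 3], "Q9": [4, 5, 5],
    "Q10": [2, 2, 3], "Q11": [4, 3, 4], "Q12": [1, 1, 2],
    "Q13": [3, 4, 3],
}

# Total score per question across all respondents.
totals = {q: sum(scores) for q, scores in ratings.items()}

# Rank by total and keep the top six as the mandatory questions.
mandatory = sorted(totals, key=totals.get, reverse=True)[:6]
print(mandatory)
```

With real data, ties at the cut-off would need a decision rule (for example, discussion on the mailing list), which the simple top-six slice glosses over.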
Ways of sharing responses and training materials
We next had to decide how we would share the feedback responses from our courses and the training materials themselves. We unanimously agreed that the Open Science Framework (OSF) supports the goals of openness, transparency and sharing, and allows collaborative working, and is therefore a good home for the project. We created a dedicated space for the project on the OSF, with separate components for the joint resources developed, for sharing training materials, and for sharing anonymised feedback responses.
With the benchmarking questions agreed and a space created for sharing anonymised feedback and training materials, we were ready to start collecting the first feedback for the collective training assessment. We thought this was also a good opportunity to re-iterate our short-, mid- and long-term goals.
Our short-term goal is to revise our existing training materials to incorporate the agreed feedback questions into RDM training courses starting in autumn 2017. This would allow us to obtain the first comparative metrics at the beginning of 2018 and to evaluate whether our methodology and tools are working and fit for purpose. It would also allow us to iterate over our materials and methods as needed.
Our mid-term goal is to see if the metrics, combined with shared training materials, could allow us to identify the parts of RDM training that work best and to collectively improve the quality of our training as a whole. This should be possible in mid/late 2018, allowing time to adapt training materials as a result of the comparative feedback gathered at the beginning of 2018 and to assess whether those adaptations resulted in better participant feedback.
Our long-term goal is to collaboratively investigate and develop metrics which could allow us to measure and monitor the long-term effects of our training. Feedback forms and satisfaction surveys immediately after training are useful and help to assess the overall quality of sessions delivered. However, the ultimate goal of any RDM training should be the improvement of researchers’ day-to-day RDM practice. Is our training really having any effect on this? In order to assess this, different kinds of metrics are needed, coupled with long-term follow-up with participants. We decided that any ideas developed on how best to address this will also be gathered in the OSF, and we have created a dedicated space for the work in progress.
When reflecting on the work we did together, we all agreed that we were quite efficient. We started in June 2017, and it took us two joint catch-up calls and a couple of email exchanges to develop and agree on joint metrics for the assessment of RDM training. Time will tell whether the resources we create will help us meet our goals, but we all felt that during the process we had already learned a lot from each other by sharing good practice and experience. Collaboration turned out to be an excellent solution for us. Finally, our discussions are open to everyone to join, so if you are reading this blog post and would like to collaborate with us (or simply follow our conversations), sign up to the mailing list.
For the past couple of years we’ve been giving some thought to the role of university libraries in publishing, in common with other libraries. However, the University of Manchester is home to Manchester University Press (MUP), one of the largest university presses in the UK, so we’ve had to think carefully about how to work collaboratively to make best use of our respective expertise and resources in order to meet the University’s strategic objectives. Our initial thinking and work started in 2014 as part of the Library’s strategic programme, with follow-on projects funded by the University’s Centre for Higher Education Research, Innovation and Learning (CHERIL).
When we started our thinking, we expected that the outcome would likely be some kind of publishing support service, using Open Journal Systems (OJS) for hosting. To develop a tangible offer, we had many discussions about which parts of the support service would naturally sit with the Press and which in the Library, and even more about funding and sustainability. To date, our collaboration has resulted in:
development of Manchester Open Library as an imprint of MUP,
development of a student journal for the Manchester Medical School, and
development of 3 online learning resources on ‘publishing’,
but not in the publishing support service we originally envisaged. Instead, we most recently considered offering a model we believed would be sustainable with a low level of support: a multi-disciplinary undergraduate journal managed by a postgraduate editorial team. However, when we ran this idea past senior staff from our Humanities faculty with responsibility for postgraduate researchers (PGRs), there was little appetite for supporting any type of student journal, and since the Library and the Press aim to support the University in achieving its strategic goals, we have parked this idea for now. That said, we still see value in students experiencing publishing, either as authors or as part of an editorial team, which is why we decided to harness the expertise of our Press in developing online learning modules that anyone on campus with an interest in publishing can access and learn from.
From what we hear about other institutions it seems that our experience is at odds with current trends in support for student publishing, ie, there appear to be many examples of libraries, academics and university presses launching student journals. We’ve been keen to understand whether the challenges that have limited our service development are unique to Manchester and to learn more about how other institutions are providing support for student journals. So, as part of our latest CHERIL-funded project (Publishing Research and Learning for Students – PuRLS), we recently held a one-day conference on student publishing. We wanted to bring together institutions with experience of, or an interest in, student publishing so that we could all learn from each other. The event, held on 16th January 2017, brought together a mixture of librarians, publishers, academic staff, administrative staff and students.
Libraries supporting student journals
Our contributors from the universities of Surrey, Warwick and Edinburgh, and Leeds Beckett University described their involvement with student journals. In all cases journals are run on OJS. At Edinburgh and Warwick, the libraries offer journal hosting services which publish both student and academic-level journals.
Although Edinburgh has a university press, the Library developed the hosting service independently. Angela Laurins, Library Learning Services Manager, explained that the service developed organically and is now well established, providing only set-up support for new journals; thereafter, journals are managed by their own editorial teams. Angela confirmed that this model works well, with minimal resource requirement. In fact, it works so well that she no longer requires a named academic champion for established journals if the previous champion moves on.
Warwick’s service is a more recent development, building on two journals already developed within academic departments, further interest in new journals from other areas, and the skills and resources available within the Library to develop and manage journals using OJS’s externally hosted option. Yvonne Budden, Head of Scholarly Communications, talked about two multi-disciplinary journals, Reinvention and Exchanges.
Reinvention, an international journal, is student-led and student-run, with academic support. The main resource requirement is in maintaining high quality. Academic staff carry out peer review and help students improve the standard of their work. Reinvention has received over 460 submissions and published approximately 130 articles. Submissions are split fairly evenly between disciplines and also come from a number of different countries. Yvonne explained the value that the library can bring to publishing is in part “things libraries are known for being good at”, eg, advising on open access, ISSNs, copyright, DOAJ registration, digital preservation, analytics.[presentation-slides-warwick-jan-2017]
Charlotte Barton, an Information Literacy Librarian, talked about her role in supporting the Surrey Undergraduate Research Journal (SURJ). The interdisciplinary journal is published by the Learning Development Team, which comprises librarians and learning advisors, and accepts work marked as 2.1 or higher, as well as reflective accounts, conference reviews and literature reviews. The editorial team is made up of academic staff and PGRs – PGRs stay in this role for a maximum of one year (two journal issues) and carry out peer review as well as other editorial tasks.
Charlotte explained that supporting prospective authors is time-intensive (1-1 support is provided by the SURJ team) but as submission rates are currently low (10 per issue) further work needs to be done on promoting the journal to academic colleagues. Future plans also include working with academic staff to develop training materials, eg, to improve writing skills. [presentation-slides-surrey-jan-2017]
Kirsty Bower, an Academic Librarian at Leeds Beckett University, described how the interest in setting up journals at her institution resulted from Open Access (OA) requirements for the next Research Excellence Framework (REF) and likely requirements of the Teaching Excellence Framework (TEF). An existing Sociology UG journal, Critical Reflections, was moved onto OJS in 2016 following a discussion with the lead academic, who was keen to increase visibility after producing a number of print issues. The journal publishes pieces produced in a third-year module in which students apply their sociological knowledge to real-life situations, and students are involved in the editorial process. Kirsty reported that despite limited promotion downloads have surpassed expectations, although she acknowledged that it isn’t clear who the readers are. Although the Leeds Beckett team face similar challenges to other institutions (eg, limited staffing resource, limited funding for promotion), they are considering developing a multi-disciplinary journal. [presentation-slides-leedsbeckett-jan-2017]
Publishing Manager Lara Speicher explained that at UCL Press student journals are hosted on OJS but run by their own teams, provided they have support from their academic department. Proposals for new journals are not considered without agreement of faculty support – this commitment is vital as UCL Press is too small to provide high levels of support to students. Lara highlighted that it can be difficult to explain the difference between a hosting service and a publishing service, and that one journal had expected more ‘hand holding’ from the Press. Providing support for students ties in with UCL’s Connected Curriculum, which brings research into learning. UCL Press have recently appointed a new journals manager who has plans for further support, eg, creating a forum for journal teams to meet and share experiences and delivering workshops on the publishing process. [presentation-slides-uclpress-jan-2017]
Tom Grady, Acting Press Manager, told us that WRUP launched in 2016 with the aim of publishing academic journals and books, so when the first journal proposal received was for a student journal there were some concerns. These included whether publishing a student journal would undermine the Press’s aspiration to become a reputable academic publisher, how sustainable a student journal would be, and who would read a student journal. Having since overcome these concerns the Press has recently launched the Undergraduate Journal of Politics and International Relations, which has an academic lead and funding sources, represents a gap in the market, and gives students the opportunity to be published authors or to be part of the editorial team. [presentation-slides-wrup-jan-2017]
The Manchester perspective, Part 2
We invited a number of speakers connected with the University of Manchester to contribute to the event, to increase awareness of potential challenges or opportunities for institutions considering dissemination of student research as a means to enhance the student experience.
The key driver when we were considering supporting a student journal came from the Manchester Medical School, and particularly from a group of students including Josh Burke. Josh explained that one reason for wanting to set up a journal was that medical students earn points for publishing work in PubMed-indexed journals, which count in applications for their first post. The group believed they could set up a journal themselves but sought support from academic staff, who put them in touch with us. We provided access to OJS and publishing expertise from MUP; the students developed a staged peer review system and brought a lot of energy to the initiative, which resulted in the launch of the Manchester Medical Journal (MMJ) in late 2016. MMJ is student-led and student-run. Josh admitted that using OJS was a pain point, as the peer review system they developed doesn’t fit easily within the OJS workflows, and that the student group had been naïve about the complexity of setting up and running a journal, needing academic support, publishing guidance and financial support. With the backing of the Medical School and the continued investment of the group of students who initially set up the journal, MMJ seems likely to have a future. However, the main challenge is convincing students to publish in a new journal that isn’t indexed in PubMed. [presentation-slides-burke-jan-2017]
A similar view is shared by senior academic and administrative staff at Manchester, particularly in relation to PGRs. We asked Professor Maja Zehfuss, Associate Dean for PGR in the Faculty of Humanities, to outline this position at the event. The key points she made were that at Manchester institutional journals are not considered to be right for PGR publications, that PGRs should be seeking to publish papers of at least 3* ranking in ‘grown-up’ journals, that submitting papers to established journals provides a tough learning experience for PGRs which develops resilience and skills, and she queried what student journals are for and who reads them.
Of course, journals are only one means of scholarly communication, and at Manchester academic staff are incorporating different forms within their modules. Dr John Zavos, a course leader from Religions and Theology, explained that he was keen on openness in research and wanted to develop resources that would put his students’ work in the public domain, eg, ‘Poppy Hijab’, an exhibit on the Museum of the South Asian Diaspora blog. John is now leading a CHERIL-funded project exploring impactful public-facing platforms and hopes to incorporate editorial management of a blog into his Level Two course to provide further opportunities for publishing experience.
To conclude the event Simon Bains, our Deputy Librarian and Head of Research Support, and Meredith Carroll, Journals Manager from MUP, described our experience, which is summarised in the first part of this piece. [presentation-slides-manchester-jan-2017] For now, our support for student publishing takes the form of a recently-launched blog, The Publishing Exchange, to encourage reflection and learning, and My Research Essentials online resources, all available under the CC-BY-NC licence: