
Open Access Week 2018 musings: engaging authors, new staff and publishers


When we’re planning for Open Access (OA) Week we reflect on where we’ve got to with our services, both in their delivery and in the level of researcher engagement with OA.

It’s always rewarding for us to remember how well established our service now is and the important part we play in increasing access to the University’s research and, of course, in funder compliance. This year we worked with colleagues in the University’s Global Development Institute to showcase their OA research, which aligns with the theme of OA Week 2018, and we highlighted our top 5 most accessed theses.


It’s also rewarding to be out and about on campus, talking to researchers about OA. This year librarians from our Academic Engagement team held OA pop-up events in various buildings, away from the Library, and a screening of Paywall: the Business of Scholarship in a lecture theatre.

Levels of engagement with OA at the University are high – while it’s undoubtedly true that this is related to funder policies, it’s also partly because our services are designed to make the process easy for authors. OA isn’t always easy for researchers to understand but our process is, and it prompts conversations with us about what to do, and the reasons why, all year round. Our webpages underpin our current processes but now – we’ve just launched new-look webpages – they also look ahead, encouraging and supporting engagement with Open Research more broadly.

What I’ve been reminded of as we’ve been preparing for OA Week is that however well we’re doing at the moment, there are still challenges to tackle. And I’m not referring to Plan S.

Working in an Open Access service

OA teams have formed and grown over the past 5 years. Most of us learned on the job and we’re now training new colleagues the same way. I’m part of a national group considering how best to prepare the next generation of people for work in this area. One way we’re doing this is by inviting staff already working in this area to share their experiences.

We often receive applications for our vacancies that suggest a lack of understanding about the nature of the roles, so I’ve asked Lucy May and Olivia Rye from our team to talk about what it means to work in a role with a strong focus on Gold OA at a large research-intensive university.


See Cambridge’s Unlocking Research blog for examples of other types of Scholarly Communication roles.

OA monographs and book chapters

A further challenge is OA monographs and book chapters. We really need greater clarity on publisher processes as they relate to OA for these output types. Over the past week we’ve been reviewing the status of 14 payments we arranged for our School of Arts, Languages and Cultures from our 2017-18 institutional OA fund (last payment made in early September), totalling just over £61,000. Of these, 6 outputs are not yet OA. Another output, a monograph, is not flagged as OA on the publisher’s page. This may be an oversight, but it’s telling of the developments still needed – the publisher of this book told the author that they don’t yet have processes in place for this.

Of the 6 outputs, 2 were book chapters from a commercial publisher that I assume has a process, because they have a webpage offering OA for chapters as an option; although I’ve had an apology, I’ve not yet had confirmation of when the chapters will be OA. One was an article from a US university press – I had a fast response and an apology, but have been told it will take at least a week for the article to be made OA.

The 3 remaining outputs are monographs. From the responses I’ve had, I understand that there’s a delay in converting a monograph to OA once a Book Processing Charge has been paid – what I’ve yet to learn is how long that delay is likely to be. We can’t have meaningful discussions with authors without this kind of information, and the lack of publisher procedures affects confidence in engagement with OA.

So, this is now on my To Do list both here at Manchester and for the RLUK Open Access Publisher Processes group. By the time we’re planning OA Week activities next year, and reflecting on how far we’ve come, I’m determined we’ll have answers.


Database Interface Design – washing powder optional

As a librarian supporting students and academic staff over the last 20 years, I have substantial experience of working with digital content and currently work with around 30 electronic databases. My ability to work productively and assist students in gathering information or data is influenced significantly by interface design – a critical component of the user experience (‘UX’) of these databases.

The following represent my personal views on interface design.

 

Why is interface design important?

The ability of a user to work well with an interface influences their productivity. Design changes to the front end of any database system pose challenges comparable to those of moving to a new computer operating system – for example, from Windows 3.1 to 95 to 98 to 2000 to XP to 7 to 8 to 10. The familiar is no longer familiar, and productivity drops, at least initially.

 

Changing search interface / results display

I am not advocating that interface design be set in stone, and I appreciate that carefully considered changes can generate substantial improvements. What is important is that those changes are driven by user needs and then tested to ensure they have met those needs. Otherwise they can cause major difficulties for students. The Passport database is one such example, where a simple and intuitive process appears to have been replaced by one that is far less clear, and valuable functionality has disappeared altogether. Previously, when selecting the ‘Menu Search’ option, a three-step process was revealed:

Topics →  Geography →  Results

Passport. Menu search. Logical search steps.
Passport. Geography search stage (old version).

A helpful feature was the ‘predefined selections’, where many country groupings (e.g. EU, G20) were clearly visible and easy to select.

The ‘Results List’ screen displayed a variety of categories available to select (e.g. Industry Overview, Local Company Profile).

Passport. Results screen (old version).

 

The look and feel of the interface has now changed in a way that reduces its effectiveness. I find it very confusing, and the results screen no longer shows the options available in the old version.

Passport. Search/results (new version).

I’m not alone, if the comments on the Business Librarians Association (BLA) bulletins are a guide. So if I and my professional colleagues are having difficulty, students must be in an even worse position.

It is hard to see these changes as being driven, as they should be, by attention to the user and how they work. Instead they feel to me more like change for the sake of change. Just as washing powder might be branded ‘New and Improved!’, so these changes to the look and functionality of a database must be a step forward over what went before. Mustn’t they? Well, no.

 

The user experience (UX)

The Japanese concept of Kaizen, which translates roughly as ‘continuous improvement’, is pertinent here. (Image: Kaizen in Japanese characters, by Majo statt Senf, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=38767688)

 

This principle runs through effective UX development. With the emergence of ‘the perpetual beta’ and Web 2.0, users can be seen as co-developers, able to feed into service changes. By monitoring the take-up (or not) of new features, service providers can respond to user needs.

Such an approach drives a user-centred evolution of a system interface, introducing incremental and necessary improvements over what has gone before. This is far preferable to a revolutionary approach, resulting in drastic changes to the interface design which risk causing significant disruption to its users.

This reflects the position taken by Joe Hardenbrook: a state of permanent beta is OK when feedback is received and adapted to, enabling improvements to flow from it.

For example, in undertaking a review process, the provider of the Factiva database has reassuringly put the user experience at the heart of the exercise, conducting interviews with the goal of improving productivity through ‘compact, clear and consistent design’.

 

Why do effective design features matter?

Stability

Does the database have a look which persists over time? Some good examples, in my opinion, include Thomson Research and Factiva. The search interfaces of both databases have remained similar for over five years. The search screen is a single page, with no scrolling down required and all options clearly visible. These can be selected by typing in a free-text box or by using a ‘look-up’ option from a magnifying glass icon. Other selection methods are equally clear, from clicking to select options (e.g. Report Type: [Company, Industry] in Thomson Research) to using a drop-down menu (Date in Factiva [e.g. In the last week]). In Factiva, search options flow from top to bottom in a single column on the left, culminating in a blue colour-coded ‘search’ button at the lower right of the screen. Thomson Research has two columns of search options and, again, a blue colour-coded ‘search’ button at the bottom left of the screen.

Familiarity contributes to ease of use. If a database has reached a clear and effective design, there is value in maintaining this appearance. Why is this important? Because students (and library staff) have many demands on their time. When faced with a radically altered search interface, time is wasted trying to get familiar with the new ‘look’. So I would urge our suppliers to ensure such changes respond to user needs and behaviours rather than simply to refresh the brand so that it looks ‘new and improved’.

Some suppliers are seeking insights through user engagement to improve their products. For example, EBSCO Information Services seeks insight into today’s student as an ‘information seeker’. Kate Lawrence, Vice President for User Experience Research, noted that students have an ever-increasing desire for efficiency, approaching their studies in terms of priority (which deadline is first) and return on investment (getting the best results for the amount of time invested). [1] Being aware of such requirements allows a provider to meet these user needs.

 

Clear search options

When options are clearly visible on the search screen, alternatives can be selected without confusion. A prime example of this is the Factiva database (Business News). Selecting the ‘Search – Search Builder’ option from the drop-down menu reveals many alternatives on the left of the screen – for example Date, Source and Subject – which can easily be explored and used as part of a search. Should the search need to be refined after viewing the results, clicking the ‘Modify Search’ button retrieves the original search options.

I sought comments on database design from a colleague who works in a different area of the library. He noted that a lack of familiarity makes usage more troublesome – in his case with the financial databases that I often work with. He also cited Compendex, IEEE Xplore, Scopus and Web of Science as being ‘easy to use’, with good basic and advanced search options. I think this again reflects a point made earlier – being able to proceed with a search without confusion.

 

Enhancements

‘Evolution rather than revolution’ can be taken to mean small improvements which leave the basic search design unchanged. Again in Factiva, to indicate a phrase search in the text box at the top of the screen, two or more words previously had to be enclosed in quotation marks – for example, “Tesla Model S”, which searches for these terms in this exact format rather than as keywords scattered throughout the documents being searched. Now, however, this is done automatically: if you attempt to enter quotation marks, a message appears on screen noting ‘Please check your query, we have identified unbalanced quotes’. This is a good example of an enhancement which maintains the basic search interface.

Factiva: search screen.
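To illustrate the distinction that the old quotation-mark syntax expressed – an exact, contiguous phrase match versus independent keywords that may be scattered through a document – here is a minimal, generic sketch in Python. It is not Factiva’s implementation; the sample documents and function names are invented purely for illustration.

```python
import re

# Invented sample "documents", for illustration only.
DOCUMENTS = [
    "Reviewers praised the Tesla Model S for its range.",            # contains the exact phrase
    "Tesla shares rose; a new model of the S platform is rumoured.", # contains the words, scattered
]

def keyword_search(query, docs):
    """Return docs containing every query term as a word, in any order."""
    terms = set(query.lower().split())
    return [d for d in docs if terms <= set(re.findall(r"\w+", d.lower()))]

def phrase_search(query, docs):
    """Return docs containing the query terms as one exact, contiguous phrase."""
    pattern = re.compile(re.escape(query), re.IGNORECASE)
    return [d for d in docs if pattern.search(d)]

print(keyword_search("Tesla Model S", DOCUMENTS))  # matches both documents
print(phrase_search("Tesla Model S", DOCUMENTS))   # matches only the first
```

The point of the enhancement described above is that the interface now applies the stricter, phrase-style interpretation without requiring the user to remember any special syntax.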

 

 

Additional challenges faced

Name changes

Name changes are a hindrance: students no longer recognise a particular title, or know where to find it, when accessing databases from an A to Z listing. Some examples include:

Global Market Information Database  →  GMID  →  Passport.

Global Insight  →  IHS Connect.

Removing content / features

KeyNote: this market research database previously showed a comprehensive listing of sections for an individual report in the left-hand menu (e.g. Market Size, Competitor Analysis), which I found really helpful. This no longer appears, which I believe is a backward step.

Mintel: splitting up and scattering the constituent sections of a market research report across a results page, rather than offering a PDF link to display it as a single report (as is available for KeyNote reports), makes it far less user-friendly. I have witnessed students and library staff finding the layout confusing.

 

Summary

I appreciate excellent database design when faced with a substantial enquiry and a tight deadline – reflected in stability and a clear, intuitive search interface.

Incremental improvements to a database are preferable to radical changes when seeking to support students and academic staff in their research. This process can be enhanced through testing and should be driven by user needs, as in the Factiva review noted above.

In conclusion, I can do without the washing powder mentality – ‘new and improved’ – as it isn’t always so.

[1] Lawlor, B. An Overview of the NFAIS 2015 Annual Conference: Anticipating Demand: The User Experience as Driver. Information Services & Use, 2015, 35(2), p. 15.

 


Open Access and Academic Journal Markets: a Manchester View


 

In February, a thought piece was issued jointly by Jisc, RLUK, SCONUL and ARMA which aimed to start a conversation about academic journal markets and progress in the UK towards Open Access. This blog post represents the combined thoughts of two leaders in Open Access publishing at the University of Manchester Library. The post does not represent an official position at Manchester, but illustrates some of the thinking that informs the development of our policies and services.

The thought piece makes a number of statements, and we have chosen to respond to a selection of them:

Academic journals play an important role in the work of universities

In our view, one might argue instead that academic research papers play an important role, and that the correlation is between availability of that research and university research performance.  The journals just happen to be the containers for the research.  The same is true of student satisfaction and access to journals.  Students want access to the ‘stuff’; whether it’s in journals is largely immaterial, and may not even be noticeable via modern library discovery systems, or Google.  The question is whether the journal remains the best container in a networked digital environment.

Two issues in particular occur to us in the context of this part of the thought piece:

i) We wonder how true it is that journals ‘allow researchers the freedom to choose appropriate channels to publish their work’.  It could be argued that researchers are, in fact, constrained by a system in which they are expected to publish in certain titles if they are to develop their careers;

ii) It’s true that journal articles are measurable, insofar as citations are a reliable indicator, but there’s growing support for a campaign to eliminate journal title-based metrics, with over 600 organisations now signed up to the San Francisco Declaration on Research Assessment.

The markets are changing

The thought piece describes a market that has been split in two, where the options now are Hybrid Gold and Pure Gold. We would suggest there are other ways to think about the way the journal market is evolving.  Although many of the ‘wholly Open Access journals’ levy APCs, we should keep in mind – and OA advocates often remind us – that many Gold OA journals do not require them.  Nobody is suggesting that publishing is free, but charging the author at time of acceptance is not the only approach.  It’s interesting to see the Open Library of the Humanities adopt a Library-funded model, something which Knowledge Unlatched has shown can succeed, at least at the pilot stage, for OA monographs.

Our second point is that another way of splitting the market would be into a) commercial publishers and b) university presses.  Open Access has stimulated renewed interest in the concept of the University Press, as universities begin to consider how they could bring their publishing operations back in house.  UCL Press is a significant example of a new and wholly Open Access press, and more recently we have also seen a consortial approach emerge in the form of White Rose University Press.  Bringing scholarly publishing back into the academy allows us to present an alternative OA model in which prices do not need to be determined by shareholders and their demand for profits. The recent University Press Redux conference at Liverpool identified this as a key theme. We are also seeing innovation with repositories, such as the arXiv overlay journal Discrete Analysis, launched in February.

Performance of the legacy/hybrid journals market

The anticipated transition to OA, post-Finch, still seems depressingly distant.  Instead, we continue to pay above-inflation subscription prices while simultaneously paying the same publishers APCs.  Despite the average cost of hybrid APCs being higher than those for Pure Gold, the power of the journal brand means that most of the funds we have available for Gold OA are going to hybrid publishers.  We are seeing some offsetting models emerge, but we are aware that some institutions find these complicated to manage, and while publishers serve a global market which is not, on the whole, moving to Gold OA, there is little prospect of the transition we hoped for.  Pure Gold journals offer lower prices and no scope for ‘double-dipping’ but are yet to be well established beyond a few disciplines.

On the point in the thought piece about the service we might expect for our APC payments, much certainly needs to be done.  We are both members of the RLUK Open Access Publisher Processes Group, which focuses on this, and we welcome feedback from colleagues who are dissatisfied with publisher systems and procedures that authors struggle to navigate, or with the level of service and support received in return for APC payments.

Shortcomings in the legacy journal market

Given that we have limited funds available to pay publishing costs, it is attractive to consider using them only to support publishers who are not also taking subscription payments from us.  It is increasingly so when we see that Pure Gold APCs tend to be lower than those charged by hybrid journals.  The issue we face is the power of the brand, as our researchers know they need their papers to be in the ‘right’ journals in order to gain the esteem they require to progress in their careers.  It is depressing that this remains the case in a digital world in which the concept of the journal is so outdated.  In a print environment, bundling the latest research papers up in this way was a sensible approach to their dissemination.  Today, new models like PeerJ can work quite differently, and the only barrier to their adoption is an academic culture which holds fast to the power of the journal title, even at a time when so many organisations are turning away from the notion that the impact of a journal says anything about the individual article. Hybrid, despite the arguments of the Publishers Association, is not providing what we need. As the Wellcome Trust reports, “hybrid open access continues to be significantly more expensive than fully open access journals and that as a whole the level of service provided by hybrid publishers is poor and is not delivering what we are paying for”.

Given the complexity of offsetting, the profit margins of the commercial publishers and the lack of a substantial transition from subscription to OA, it is time to consider using the available funding for Pure Gold rather than for Hybrid, and to invest in those initiatives that are emerging from academia, and which focus on providing the widest access to our research rather than the returns expected by company shareholders.

This post was jointly authored by Simon Bains and Helen Dobson 

Image: Patrick Hochstenbach, CC-BY. Open Access Belgium

 


Predatory publishers: who CAN you trust?

One of our responsibilities as OA advisors at Manchester is to keep track of so-called predatory publishers and to advise our researchers on publishers they should avoid.  It can be hard to separate the wheat from the chaff, so we rely, where possible, on others helping to do this.  Until recently, we recommended Jeffrey Beall’s list, a well-known directory maintained in the US.  However, we have now removed the link, and will no longer advise our colleagues to use it.  Here’s why …

Some of the work we do extends beyond Manchester and is about sharing our experience.  We are currently lead institution on opeNWorks, a Jisc-funded Pathfinder project which aims to share best practice with colleagues from the North West region who have limited experience of providing OA support for researchers and to develop a community of good OA practice.  The purpose of the community is to ensure that trusted advice and resources are easily accessible to institutions that are unable to fund a full-time OA support post.

If the resources and systems we have created are seen as examples of good practice, then we’d like them to be representative of our views on OA – and it is clear that our views are not aligned with Beall’s.

On the basis of what we’ve read – the Berger and Cirasella article recently posted on the LSE Impact Blog provides a good overview and entry points – and what Beall seems to have said in his recent presentation at the US STM conference, here are a few points on which we differ.

Publishing costs

Let’s start simply.  There is a cost to publishing scholarly works.  We know this and we’ve had frank conversations with publishing colleagues on this issue.  In the subscription model authors (who may also be editors) tend to be unaware of the costs, and librarians are aware only of the costs to their own institutions.  What’s ethical about this lack of transparency?  It’s practically the OED definition of predatory (“unfairly competitive or exploitative”).

We’ve taken the recommendation of the Finch Group to heart and have shared the costs of publishing with University of Manchester authors as a first attempt at remedying this problem, telling them that the University spent a total of ~£5 million on journal subscriptions in 2013-14 and informing individual authors of the cost of article processing charges (APCs) – to which there may also be page, colour or submission charges, let’s not forget – paid on their behalf.  Most of the charges we’ve paid have been to publishers of subscription journals offering a hybrid gold option, as have most of the UK universities in receipt of OA grants from RCUK and COAF.  With even more money flowing from university libraries to large commercial publishers there’s a new chapter in the Serials Crisis – and an urgent need for offsetting schemes to address the issue of double-dipping.  This work has already begun and we’re feeding into these discussions.  However, the models we’ve seen so far are early experiments that need further refinement to be truly ethical.

Tailoring advocacy

The Gold Open Access team at The University of Manchester Library

OA advocacy is at the heart of our interactions with researchers and we tailor our message to audiences at a disciplinary level and to individual authors as required.  This is necessary to win the hearts and minds of researchers for whom subscription publishing is the cultural norm, or to encourage a new generation of researchers to confidently challenge the advice of their senior colleagues, who frequently fall into that first category.  And while we might repeat core messages, the effectiveness of our advocacy depends on the nuance, which requires the thinking that Beall sees as unnecessary.  We tell researchers about the OA publishing model, explain why they need to know (and as funded researchers and/or employees of a UK HEI they do need to know) and why they should care. The most effective message to some authors might be pragmatic (“you might jeopardise your chances of securing funding with a particular funding body if you don’t publish OA”) but we always include positive messages about extending readership and the public good.  I often relate the experience of researchers in other parts of the world with severely limited access to academic journals, based on the inspiring presentation I heard Erin McKiernan deliver at the 2014 SPARC OA meeting.

We find our advocacy activities most successful when we engage researchers in discussion based on our experiences of providing OA support, and this is as important for us as it is for the researchers because it allows us to understand the barriers to OA.  Mostly these come down to the complexity of publisher workflows – those of traditional publishers, that is – and to remembering to choose an OA option.  We hear these concerns often, much more so than concerns about the questionable publishers Beall focuses on, and we respond by participating in RLUK-led initiatives to engage publishers in discussions on the simplification of OA procedures or, at a local level, by reminding authors to make new papers OA; we know that traditional publishers are also helping with this culture change.  This doesn’t mean that we are enemies of traditional publishers, as Beall might suggest, but rather that their systems and workflows aren’t as intuitive for authors as they might believe, and the scale of support we provide to authors dealing with problems relating to these publishers makes this a priority for us.

Supporting innovation

Support for OA in Word but not in Deed
Bizarre accusation of hypocrisy

But that’s not to say that we are simply reactionary in our approach to OA.  We do, of course, react to new funder policies and new publisher workflows, but we are also hugely supportive of new developments in scholarly communications, e.g. JiscMonitor, ORCID and Altmetrics, and we are always interested in the emergence of new publishing models.  We have responded to requests from Manchester researchers who wish to publish RCUK-funded papers with PeerJ by setting up an institutional membership plan.  We are working in partnership with our colleagues at Manchester University Press, developing the Manchester Open Library imprint.  The latest journal in development is student-led and will operate a form of peer review that MUP CEO Frances Pinter considers worthy of patenting.  We are also supporters of Knowledge Unlatched and the Open Library of Humanities, and are encouraged to see traditional publishers experimenting with OA monographs as the sector seeks a sustainable business model.  OA has created opportunities for experimentation and innovation in publishing, driven by energetic and passionate individuals.  There are too many to name, but Martin Eve certainly deserves a mention after bizarrely being charged with hypocrisy in Beall’s STM presentation last week.

We don’t disagree with Beall on everything – for example, we don’t dispute the existence of questionable OA journals and publishers.  As fund managers for the University’s OA grants from RCUK and COAF we take our duty of care to authors and funders seriously.  Requests for APC payments prompt an extra Quality Assurance check in the publication process at Manchester, which allows us to alert School Research Directors to submissions to journals of questionable reputation.  Our website advice also provides a checklist for authors to consider as part of their publication strategy, and we’ll now focus on this type of guidance until a community-driven alternative to Beall’s list emerges.


Why are there so many business databases?

Browse The University of Manchester Library website and you will find a large number of business databases. Researchers have to choose which of these is best for their research, and that choice can be influenced by various factors. To guide this decision it is important to remember the factors that lead academic libraries to hold many business databases.

Commercial products

There are many specialist financial and business databases available to researchers at The University of Manchester.

The best-known business databases are commercial products: for example, Bloomberg Professional, Datastream (Thomson Reuters) and Capital IQ (Standard & Poor’s). They are available to universities for non-commercial use, but their main market is finance professionals. These systems are similar but not equivalent. The companies developing them are constantly trying to improve them to maintain or increase their market share, and this often includes providing data that is not available from competitors.

The big advantage of commercial products is that they give students the chance to gain experience of the very systems they will be using when they get a job in the financial sector.

There is some information freely available on finance websites, but it does not have the quality, range and history needed by researchers. Free websites have no incentive to hold accurate historical information on companies that are inactive (taken over or gone bust), yet this is vital for research. Researchers therefore depend on commercial products as an essential source of research data.

Research interests

Researchers are always looking for new opportunities – being able to investigate new questions or test out theories in a wider context. Recently there has been increasing interest in executive compensation and related corporate governance data. In most cases there are specialist databases that provide data in this sort of area and these are the first choice of researchers as they offer fertile ground for novel analyses.

There are established research databases that cover only the American market. These are well known to reviewers for the better journals, so they are the preferred choice for researchers studying this market. If you wish to study global or emerging markets there is no single best choice: some alternatives are global and others more local, and each has its own strengths and weaknesses. This means that research which looks at extending existing results from one market to another often involves adapting the research methods to suit different databases.

The buyers of the best-known commercial databases are banks and investment companies interested in recent information on current public companies, or on ones that they think might become public in the near future. They spend large amounts of money on getting the quickest access to the latest information, not on getting the highest-quality information for the longest historical time period. This has led to the creation of specialist databases that cater for researchers who want to study markets over a 20, 30 or 50 year time span.

In summary, researchers who want to do novel research and get it published in a good journal have different criteria for choosing databases from those of an investment bank. As a result almost every university has a slightly different collection of business databases. There is significant overlap, but the full list will reflect the research interests of the staff and, as a result, can change over time.

Easy to add – hard to remove

Partial, approximate timeline of the acquisition of business databases.

There are inevitable pressures that make it much easier to add a database to the offering than to remove one.

Research papers can take years to publish in a top journal, and if academics have found an interesting topic they usually want to publish more than a single paper. It is difficult to remove a database if there are any current or prospective papers in progress since researchers need a current licence to allow them to use the data and submit their results for publication.

If a database has become established as part of a taught course then staff build up a wealth of experience in guiding students to the specific information they need and steering clear of potential difficulties. As a result it is a big step to move to another database even if it offers access to the same information.

The main reason databases are removed is that the companies supplying them want to replace them for commercial reasons. It is not uncommon for universities to ask for continued use of a database which is no longer being actively developed, either because there are “in progress” research publications or because time is needed to check that its replacement has been tested for student use.

Selecting a database can also be challenging, but the Business Data Service is available for consultation and advice. Another blog post in the near future will look at the methods and support available for deciding which databases are best, and how the Research Services team is looking at providing innovative ways of presenting such information.