Possible improvements to Octopus to enable open research

Publication type: Real World Application
Language: English
Licence: CC BY 4.0

Cultural change can be difficult, even when initiated by a research funder. One interviewee represented a medical research funding body, and as described in our Analysis, their attempt to promote pre-registration to funded projects "fell flat" as it was not an ingrained part of research culture in the discipline they fund.

That said, Octopus represents a potential step change on the path to more open research by providing the needed infrastructure ("make it possible") and user experience ("make it easy") as described by the Center for Open Science (Nosek 2019).

The interviews and focus groups we conducted provide actionable insights for Octopus in four areas: how the platform is communicated, technical improvements, user experience updates, and organisational changes.

Communicating Octopus

A frequently asked question during the interviews and focus groups was what differentiates Octopus from other platforms for publishing open research, such as Zenodo (https://www.zenodo.org/) or the Open Science Framework (OSF: https://osf.io/). As one participant asked: "Why would you choose this over something else?"

Currently, the Octopus homepage presents the platform as the "primary research record where researchers publish their work in full detail". Based on the feedback we received, this may not adequately convey key differentiators of Octopus, such as its eight publication types, the benefits this structure confers, or the precise meaning of "version of record". Confusion about the latter is evident in the way participants compared Octopus to tools they are familiar with.

For example, several researchers compared Octopus to digital and computational notebooks, or software code repositories hosted on GitHub (https://www.github.com/). While these tools can indeed provide an open research record, the comparisons suggest a misunderstanding of how the version of record that Octopus seeks to create is different.

Similarly, some suggested using Octopus Peer Review publications as reviews of traditional papers, or integrating Octopus with open peer review initiatives. As discussed previously, this feedback implies a misunderstanding of the purposes of Peer Reviews published on Octopus.

With this in mind, communication about Octopus should include not only a description of what it is, but also what it means. A tangible change to the Octopus website could be a bullet point summary of its key features and what they mean for researchers.

It may be a challenge to communicate Octopus's differentiators both concisely and with nuance. For instance, while comparisons to managing code on GitHub may be inaccurate, analogies to specific GitHub features may help with understanding Octopus. The concept of "forking" code, for example, is analogous to the branching chains of interconnected Octopus publications. Similarly, GitHub workflows allow a tagged snapshot of code to be taken and published as the version of record on Zenodo; that snapshot could alternatively be published to Octopus if interoperability between the platforms is achieved.

Some participants appreciated the granular recognition of research labour enabled by Octopus publications. However, they feared that the practice of listing powerful researchers as authors despite negligible contributions could still occur for Octopus publications. This could be an opportunity for the Octopus team to highlight its publication types, since it would be more difficult to justify including such authors on, for example, Methods, Results, or Analysis publications.

Octopus could also highlight how its constituent publication types would ease meta-analyses, a benefit highlighted by focus group participants. One way to implement a meta-analysis would be a single Analysis publication that links to multiple Results publications by other research groups. Another would be a new chain of research, starting with a Research Problem, that collates and analyses the methodologies (perhaps published as Methods) employed to study a particular topic.
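
As an illustration only, the sketch below shows how such a meta-analysis could be expressed as a single Analysis publication linked to multiple Results publications from different research groups. The identifiers and field names are hypothetical assumptions, not the actual Octopus data model.

```python
# Illustrative sketch only: identifiers and field names are hypothetical,
# not the actual Octopus data model.

meta_analysis = {
    "type": "Analysis",
    "title": "Pooled analysis of results reported for intervention X",
    "linked_publications": [
        # Results publications from independent research groups
        {"id": "results-0001", "type": "Results", "group": "Group A"},
        {"id": "results-0002", "type": "Results", "group": "Group B"},
        {"id": "results-0003", "type": "Results", "group": "Group C"},
    ],
}

# The analysis can be traced back to each contributing group
# through the links in the publication chain.
print([pub["id"] for pub in meta_analysis["linked_publications"]])
```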

Importantly, communication about Octopus should be sensitive to the diversity of possible users, especially those from non-STEM (science, technology, engineering, and mathematics) fields of research or those not based in academic institutions. Existing publications on Octopus represent a slice of this diversity, such as those on archaeology (e.g. Chi 2023), and could be used as examples during outreach. Care should also be taken not to inadvertently use language that excludes epistemological diversity. For example, as stressed by the historian in one focus group, the presentation of Octopus should not come across as pushing an experimental science-based paradigm onto other research fields.

In any case, continued user feedback and targeted engagement may be needed to determine how best to communicate the usefulness of Octopus while representing diverse fields of research.

Technical changes

Several participants suggested allowing more machine-readable structured metadata to be attached to Octopus publications. This could include United Nations Sustainable Development Goals (SDGs) or assessment categories from the United Kingdom (UK) Research Excellence Framework (REF), though the latter could inadvertently create the impression that Octopus is only for research based in the UK.

Metadata could also express finer details about author contributions. Adopting existing standards such as the Contributor Roles Taxonomy (CRediT) may be a simple first step for this purpose, but there is substantial overlap between CRediT and the authorship contributions implied by the Octopus publication types. Instead, future work could develop a skills taxonomy that represents the contributions of authors who publish on Octopus, and allow publications to be tagged with those skills.
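
A minimal sketch of what such machine-readable metadata could look like is shown below, assuming hypothetical field names, a placeholder skills vocabulary, and JSON as the export format; none of this reflects Octopus's actual schema.

```python
import json

# Hypothetical metadata record: field names, identifiers, and vocabulary
# are assumptions for illustration, not the Octopus schema.
publication_metadata = {
    "publication_id": "methods-0042",             # placeholder identifier
    "type": "Method",
    "sdg_tags": ["SDG 3: Good Health and Well-being"],
    "ref_unit_of_assessment": "Allied Health Professions",  # UK-specific, optional
    "contributors": [
        {
            "orcid": "0000-0000-0000-0000",       # placeholder ORCID iD
            "skills": ["survey design", "qualitative coding"],  # skills-taxonomy tags
        },
    ],
}

# Machine-readable export that other services could consume.
print(json.dumps(publication_metadata, indent=2))
```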

One interviewee touted the potential benefits of this metadata for career advancement. For example, a researcher could allow their profile on the professional networking platform LinkedIn (https://www.linkedin.com/) to extract information about their skills from Octopus publications. Such skills could be recorded in the metadata of, for example, an Analysis publication that describes the specific statistical methods or laboratory techniques the researcher used.

In addition to metadata, there is a desire among researchers in computation-intensive fields for programmatic ways to publish with Octopus. These researchers often conduct their work in computational notebooks such as Jupyter (https://jupyter.org). Rather than manually uploading outputs through the Octopus web interface, an Octopus application programming interface (API) would allow researchers to write code that automates the publication of computational outputs directly from these notebooks. In addition to convenience, being able to script interactions with Octopus improves reproducibility and enables interoperability with other platforms.
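
As a rough sketch of what such a workflow could look like from a notebook, the example below assumes a hypothetical REST endpoint, payload fields, and token-based authentication; none of these are drawn from a documented Octopus API.

```python
# Hypothetical example: the endpoint URL, payload fields, and auth scheme
# are assumptions for illustration, not a documented Octopus API.
import os

import requests

API_BASE = "https://api.octopus.example.org/v1"   # placeholder base URL
TOKEN = os.environ["OCTOPUS_API_TOKEN"]            # hypothetical access token


def publish_results(title: str, content: str, linked_to: list[str]) -> str:
    """Create a Results publication from a notebook run and return its id."""
    response = requests.post(
        f"{API_BASE}/publications",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={
            "type": "Results",
            "title": title,
            "content": content,
            "linked_publications": linked_to,  # e.g. the Method these results follow
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["id"]


# Called from the final cell of a notebook, for example:
# publish_results("Simulation outputs, run 42", results_summary, ["methods-0042"])
```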

Interoperability would not only "make it easy" for individual researchers, but also for research institutions. As previously described, one interviewee suggested that the grant management systems used by funders are inadequate for tracking outputs from their funded research. If those systems could easily interoperate with Octopus, it may also become easier for these funders to encourage open research among their grantees.

Interoperability could also extend to self-hosted instances of Octopus. Here, an instance is defined as a deployment of the entire Octopus infrastructure, from backend to frontend (with possible cosmetic changes to branding), on an independent server.

In practice, this could mean universities hosting their own Octopus instances, each holding only the publications from their institution. Interoperability would allow one to use any instance to access and retrieve information from any other instance in the connected network. This topology would not only decentralise storage and maintenance, but also reduce reliance on a single provider.
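
To make this topology concrete, the sketch below assumes that each self-hosted instance exposes a search endpoint of the same (hypothetical) shape, so that a client could aggregate results across the network. The instance URLs, endpoint, and response fields are illustrative assumptions, not an existing Octopus feature.

```python
# Illustrative federation sketch: instance URLs and the /api/search endpoint
# are assumptions, not an existing Octopus feature.
import requests

INSTANCES = [
    "https://octopus.university-a.example",   # placeholder instance URLs
    "https://octopus.university-b.example",
]


def federated_search(query: str) -> list[dict]:
    """Query every known instance and merge their results into one list."""
    merged = []
    for base_url in INSTANCES:
        try:
            response = requests.get(
                f"{base_url}/api/search", params={"q": query}, timeout=10
            )
            response.raise_for_status()
            merged.extend(response.json().get("results", []))
        except requests.RequestException:
            # An unreachable instance should not break the whole search.
            continue
    return merged
```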

User experience

In contrast to programmatic access, several researchers identified friction in the user experience of the web-based Octopus publishing process.

For example, the author approval process was confusing to a librarian who has published on Octopus. Starting with the automated approval email, they struggled to understand whether a co-author was asking them to verify their identity, to confirm their connection with the publication, or to approve it for publication.

Other participants stressed the need for Octopus to be easier to use and less restrictive than the systems currently used by academic journals. This includes the ease with which Octopus publications are drafted and submitted, and advanced search functionality that minimises the mental effort and time required for a visitor to find relevant publications.

There was insufficient time during the interviews and focus groups to comprehensively elucidate specific pain points, and future work should include wider user testing to inform user experience improvements.

Organisational strategies

Our results suggest that the growth and sustainability of Octopus rely on coordination with other organisations, from research institutions and funding bodies to traditional academic publishers.

Several participants based in the UK observed that Octopus is funded by United Kingdom Research and Innovation (UKRI), and asked what UKRI is doing to incentivise researchers to publish on the platform. For the goals of Octopus to be realised, such as reducing publication bias, publishing diverse outputs, and recognising different roles, our findings suggest that reforms to assessment by funding bodies are needed to ensure positive career outcomes from conducting open research.

Even though Octopus Peer Reviews are different from open peer reviews of traditional academic articles, there remains potential to learn from how the latter are organised. For example, the Journal of Open Source Software (JOSS: https://joss.theoj.org) and pyOpenSci (https://www.pyopensci.org/about-peer-review/index.html) conduct open peer review in discussion threads hosted on GitHub repositories. As mentioned in our Interpretation article, comments in these review threads may come from multiple readers, and might not be individually comprehensive. However, this collective process can provide a comprehensive review in aggregate. Octopus could learn from this success when designing interventions to encourage the publication of Peer Review items.

Furthermore, coordination with academic journals could take the form of Octopus acting as the designated repository for the "supplementary material" often associated with traditional papers. Octopus could also be treated as a preprint server, hosting material that will eventually be submitted to a journal. This coordination should ensure that submissions to traditional journals are not treated as plagiarism if their constituent material was previously published on Octopus. Implementing the above could at least partly alleviate the fear of being scooped, as expressed by study participants.

Octopus could also coordinate with open research infrastructure providers, such as interoperability services (e.g. OpenAIRE: https://www.openaire.eu/) or discovery services (e.g. Europe PMC: https://europepmc.org/). Interoperability with these established platforms may reinforce the legitimacy of Octopus while increasing its visibility to researchers.

While some focus group participants expressed concern that researchers in fields such as mathematics or the humanities may feel resentment towards how Octopus is currently presented, this is also an opportunity for deeper engagement with those groups.

For example, the Octopus team could identify and engage with a few research groups working in disciplines currently under-represented on the platform. The team would work closely with these groups, providing expert guidance throughout the research lifecycle on publishing their work to Octopus. This may result not just in a set of new Octopus publications, but also in a complete case study for use in future outreach, and in ethnographic insight to inform further improvements to the platform.

Finally, certain technical requirements for publishing on Octopus may be off-putting. For instance, one focus group participant noted that Octopus requires an ORCID iD (https://orcid.org/) when creating an account, and that organisations listed in a publication appear to require Research Organization Registry (ROR) identifiers (https://ror.org/) (although this is not in fact a requirement). While these identifiers are good practice for open research, their uptake is uneven across fields of research and institutions. The participant observed that certain non-profit or government research organisations do not have ROR identifiers, and that the use of ORCID iDs is lacking in some fields of research. For Octopus, managing the trade-off between adopting open research technical standards and creating a welcoming experience for diverse researchers may be challenging.

Funders

This Real World Application has the following sources of funding:

Conflict of interest

This Real World Application does not have any specified conflicts of interest.