
Interpretation of focus group and interview feedback on Octopus

Publication type: Interpretation
Language: English
Licence: CC BY 4.0

Insights into research culture

This round of interviews provided insights into academic research culture, some of which are comparable to findings from our baseline evaluation. For example, there is a desire for more granular recognition of research labour, which participants perceive to be inadequate in traditional peer-reviewed journals. Participants appreciated being able to publish elements of research as it is being conducted, and compared publishing on Octopus to pre-registration.

However, many expressed frustration at existing barriers within academic institutions that prevent more open sharing of research processes and products. Several participants were sceptical that they or their colleagues will have time to change established ways of working, especially when doing so may be detrimental to their career prospects. Many of these concerns stem from the strong pressure (perceived to come from employers and funders) to publish in high-profile academic journals.

Another commonly expressed fear is that of scooping. In general, this fear concerns direct plagiarism, where others may pass off published work as their own without proper attribution. However, plagiarism is not limited to open research or publications on Octopus; it can also occur in traditional forms of academic publishing.

Fear of scooping also manifests as concern that, if research is published too early, competing researchers could take someone's work and "beat" them to winning grants or publishing a high-profile paper, even if attribution is provided. It is perhaps remarkable that "helping them get there quicker", as one focus group participant described it, acts as a disincentive for sharing research under the current incentive structure, and that there is a perceived lack of a collaborative approach for the benefit of society and of the individuals affected by the research.

The current study involved librarians and researchers in the humanities, groups that were not represented in our baseline evaluation of research culture (Hsing et al. 2023). They shared a perception that the needs, workflows, and research culture of non-STEM (science, technology, engineering, and mathematics) fields may not have received adequate consideration in the design of Octopus. They also suggested that this bias goes beyond Octopus and is present in open research discourse more generally.

Perceptions of Octopus

The interviews and focus groups revealed a variety of perceptions of Octopus, which can inform future development of the platform, communications about it, and engagement with researchers.

Some participant input also revealed possible misconceptions about the key aims of Octopus or its features.

For example, several participants suggested integrating Octopus publications with infrastructures for measuring reach and impact, ranging from quantified metrics such as Altmetrics to assessments of public engagement or outreach. However, Octopus publications are not meant to serve these goals: as the "version of record", they are specifically designed to provide a detailed record rather than readability for a general audience.

Similarly, a common reaction amongst participants was that Octopus reminds them of GitHub (https://www.github.com/), a popular online platform for publishing and managing software code. While some of the comparisons they drew were accurate, such as likening the forking of repositories to the branching chains of Octopus publications, others were not. Specifically, code repositories on GitHub (or similar platforms) are continuously developed, and do not act as versions of record in the way that Octopus publications do.

These misconceptions also illustrate how platforms used for open research may serve multiple roles, and that researchers may be confused about the delineation between them. Continuing the GitHub example, users of repositories hosted on that platform can choose to create tagged code snapshots that are archived on Zenodo. In this sense, those users are creating a version of record, similar to the functionality provided by Octopus. However, participants in the current study did not show a finer-grained understanding of how Octopus seeks to serve some goals (e.g. acting as a version of record) but not others (e.g. providing continuous tracking of research work). This further suggests an opportunity for Octopus to highlight its key differentiators from other popular platforms used for open research.

These misconceptions are also evident in the feedback received regarding Octopus peer reviews. Several researchers compared them to reviews published by communities such as Peer Community In (PCI). However, while some of these efforts publish the contents of their peer reviews, those reviews are of preprints or traditional journal articles. In contrast, Octopus peer reviews are of other Octopus publications, and are themselves considered full publications in their own right.

Interestingly, the comparisons to GitHub and to peer review communities converged in one comment, in which a researcher noted that open source code published on GitHub sometimes receives comments in the form of discussion threads. These threads can act as a question-and-answer session with the code maintainer(s), highlighting bugs or suggesting improvements. In other words, they often serve as post-publication peer review of open source code. While an individual discussion might focus on one aspect of the code, popular repositories accumulate a large number of discussions that may, in aggregate, provide a more comprehensive, collective review of that work. It may be possible for Octopus peer reviews to collectively behave in a similar way. One potential challenge is that readers might find it difficult to ascertain how comprehensively a particular Octopus publication has been peer reviewed, especially if they have to read a large number of reviews.

Policy implications

As reported here and in our Analysis, participants highlighted several barriers to open research and publishing on Octopus, from pressure to publish in traditional academic journals to fears of scooping. Some also discussed the role of research policy in shaping these tensions.

For example, one librarian described a government initiative to create new academic journals in the primary language of a Southeast Asian country. The literature review in our baseline evaluation of research culture suggests that the dominance of English-language academic journals creates inequalities in the dissemination of research. The creation of these new journals may alleviate that problem. However, such journals do not alter the underlying incentive structure in which traditional peer-reviewed articles are valued above other outputs. In contrast, Octopus seeks to fundamentally change the research publishing system while providing multilingual support.

Critically, both librarians and researchers noted that Octopus is currently supported by UK Research and Innovation (UKRI) through Research England. They stressed that this support is inadequate on its own and asked: "What does UKRI want to get out of this?" and "...[has UKRI] planned anything to incentivize people to use it?"

Funders

This Interpretation has the following sources of funding:

Conflict of interest

This Interpretation does not have any specified conflicts of interest.