
Bootstrap open research practices by inviting trainers to contribute to open, participatory, evaluation of training

Publication type: Rationale / Hypothesis
Language: English
Licence: CC BY 4.0
Alongside a train-the-trainer (T3) model for dissemination of open research methods, UKRN’s Open Research Project (ORP) will also bootstrap open research practices by inviting T3 attendees to contribute to open, participatory evaluation of the training that they deliver. This process of evaluation and refinement is expected to result in a distilled set of best practices for open research training. We propose that this process can be run effectively on Octopus.

Running the evaluation process as a participatory open research project is expected to give attendees a direct experience of open research. Complementary evaluation through focus groups and interviews will explore the roles that the activities performed by T3 attendees play in encouraging behaviour change. In particular, we will seek to understand the role that contributing to open evaluation plays in participants' developing practice.

What is reasonable to expect from T3 attendees?

Several levels of involvement can be anticipated.

  1. Given that this activity takes place in conjunction with a funded project, it is reasonable for project staff to collect data from T3 attendees, and for our training partners to give feedback on their plans for T1 delivery. — Indeed, this can be assumed to happen on the day, within the T3 training itself.

  2. T3 attendees might get more involved in giving feedback through facilitated focus groups or other methods. — Focus group participation has had reasonable uptake in current pilots; it leaves the central project staff with the task of distilling and scaffolding best practices. In any case, training evaluation and feedback is assumed to be part of the project by default, whilst the exact method remains to be determined.

  3. T3 attendees get still more involved, with hands-on review of materials & feedback for other trainers. — This requires some further infrastructure to support it (e.g., community platforms).

The current proposal is at the third level, and assumes that the others are in place; it also assumes that participants will see advantages to such participation.

What material should be public?

The trainers will already have a lesson plan when they deliver their training. This is equivalent to Methods in the Octopus terminology. Typically, they will have an outline of preliminary plans in draft form when they leave the T3 session. Sharing this plan with training partners and co-attendees on the day is the default ("1" above); sharing the plans more widely would also allow project staff and other trainers to keep learning together and share feedback. It would be a further step to share the plans publicly, and yet another to include a CC licence or similar. In fact, we could offer rigorous review and feedback within a closed community setting. However, given that we are talking about open research, this is a good opportunity to ask people to be "as open as possible." Within the ORP, open (and therefore public) should be the default, rather than the other way around.

Is "research" required?

The proposal to use Octopus isn't intended as shoehorning, or as an appropriation of research terminology in a non-research context ("evaluation of training"). Rather, this proposal intends to involve participants in a bona fide research project, namely design research to help build an effective open research training programme. It assumes that we don't know everything that there is to know about how to do that. It also assumes that one good source of data will come from learning from experience.

Asking T3 attendees to attend training, deliver training, and also participate in research on how to improve the training is asking a lot. But, in fact, we are already asking them to participate in such evaluation in some way by default. Therefore, if we are asking them to participate in research, shouldn't we ask them to participate in open research?

Could we do this in other ways?

Certainly we could. For example, we could create our own self-hosted version of Octopus and adjust the terminology to match more closely what participants might be expecting (e.g., "Methods" could become "Lesson plan"). That said, the detailed training resources themselves would presumably be stored elsewhere. Alternatively, we could share resources and evaluation in one site. For now, it makes sense to get started using currently available tools, and then assess our next steps.

Self declaration

Data have not yet been collected to test this hypothesis (i.e., this is a preregistration).

Funders

This Rationale / Hypothesis has the following sources of funding:

Conflict of interest

This Rationale / Hypothesis does not have any specified conflicts of interest.