Open Research Programme Survey 2025 Report
UKRN ROR ID: 049czkm07
UKRN Working Paper number and date: 08 / 2026-02-20
Author Contributions:
- Jackie Thompson: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Validation, Visualization, Writing - original draft, and Writing - review & editing.
- David Shanks: Conceptualization, Funding acquisition, Investigation, Methodology, Project administration, and Writing - review & editing.
- Patrick Lewis: Conceptualization, Project administration, and Writing - review & editing.
- Neil Jacobs°: Conceptualization, Formal analysis, Investigation, Methodology, Validation, Writing - original draft, and Writing - review & editing.
- Alice Howarth: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Validation, Visualization, Writing - original draft, and Writing - review & editing.
- Bill Greenhalf: Conceptualization, Funding acquisition, Methodology, and Writing - review & editing.
- Lavinia Gambelli: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Validation, Visualization, Writing - original draft, and Writing - review & editing.
- Lisa DeBruine: Data curation, Formal analysis, Visualization, and Writing - review & editing.
- Ruth Davies: Conceptualization, Formal analysis, Investigation, Visualization, Writing - original draft, and Writing - review & editing.
- Cecina Babich Morrow: Data curation, Software, and Visualization.
°Corresponding author, neil.jacobs@bristol.ac.uk
Acknowledgee contributions:
- Noemie Aubert Bonn: Writing - review & editing.
- Banaz Jalil: Writing - review & editing.
- Marcus Munafo: Conceptualization, Funding acquisition, Supervision, and Writing - review & editing.
- Etienne Roesch: Data curation, Software, Validation, Visualization, and Writing - review & editing.
We acknowledge the support of the Jean Golding Institute, University of Bristol.
The Author contributions and Acknowledgee contributions were created using Tenzing (Holcombe et al. 2020).
Conflict of interest statement: Patrick Lewis declares a Royal Society Industry Fellowship in partnership with LifeArc; Lavinia Gambelli, Ruth Davies, Alice Howarth, David Shanks, Neil Jacobs, Bill Greenhalf, Banaz Jalil, Lisa DeBruine, Marcus Munafo, Cecina Babich Morrow, and Etienne Roesch declare no competing interests.
Funding acknowledgement: Research England Development Fund award: Growing and Embedding Open Research in Institutional Practice and Culture
1 Executive summary
The UK Reproducibility Network’s Open Research Programme (ORP, 2021–2027) aims to accelerate the adoption of high-quality Open Research (OR) practices across UK institutions. This report presents findings from the 2025 ORP survey, the second in a planned series of three, designed to assess the prevalence of OR practices and attitudes toward them among researchers in partner institutions. The survey builds on lessons from the 2022–2023 Open and Transparent Research Practices (OTRP) survey and aligns with international efforts to harmonise monitoring of OR practices.
1.1 Methods
The survey instrument was based on the Brief Open Research Survey (BORS) to ensure conceptual consistency and brevity. It covered awareness and use of, and attitudes toward, 13 OR practices, including open access, open data, preprints, preregistration, and replication studies. Respondents also indicated perceived facilitators for adopting OR practices. Sampling aimed for representativeness across disciplines and career stages using institutional HESA (Higher Education Statistics Agency) data, with deployment managed locally by 19 partner institutions. In total, 1,408 responses were collected across 23 disciplines and three career levels.
1.2 Key Findings
- Awareness and Use: Awareness of OR practices is high overall, but uptake varies. Open access is near-universal (99% awareness, 86% use), while preregistration and citizen science awareness/use are lower (55%/25% and 66%/13%, respectively). Notably, awareness of FAIR data, a much-quoted aspect of OR, is the lowest of any OR practice at 51%. Disparities between awareness and use are greatest for replication studies, citizen science, and open code/software, suggesting persistent implementation barriers.
- Disciplinary and Career Differences: Quantitative and mixed-methods researchers report higher engagement than qualitative researchers, and uptake is generally greater in scientific disciplines than in arts and humanities. Senior researchers show higher awareness and use, though junior and mid-career researchers express strong motivation to engage more.
- Attitudes: Most respondents view OR as useful (80%), but only 42% feel their institution provides adequate training, and only 18% believe OR practices are considered in the hiring and promotion of staff.
- Facilitators: Practical enablers—guidance, infrastructure, time allocation—are prioritised over cultural drivers such as recognition in hiring and promotion. Interest in incentives and clearer guidance is strong, while demand for training is moderate and varies across disciplines and career levels.
1.3 Discussion and Implications
The findings confirm that awareness alone is insufficient to embed OR practices. Structural and cultural barriers persist, including limited institutional support and weak integration of OR into assessment frameworks. Disciplinary norms remain a major axis of variation, underscoring the need for domain-sensitive guidance and exemplars. The survey also highlights opportunities for targeted interventions, particularly for early-career researchers and qualitative disciplines. Methodologically, the adoption of targeted, stratified sampling represents a step toward more representative datasets, though analysis of representativeness remains inconclusive.
1.4 Conclusion
UK efforts to advance OR are broadly aligned with global trends: foundational practices are widely adopted, but deeper systemic changes—especially around incentives and disciplinary adaptation—are still needed. These findings will inform ORP interventions and contribute to international efforts to harmonise monitoring and evaluation of OR practices.
2 Report
2.1 Background and aims
The Open Research Programme (ORP, 2021-2027) aims to accelerate the uptake of high-quality Open Research (OR) practices. Its evaluation, therefore, needs to demonstrate the extent to which it has achieved this aim. Evidence is being collected in three ways:
A survey of the research communities in partner institutions (the focus of this working paper);
The use by ORP partners of specific OR indicators, explored during 2024-2025 and to be reported in a forthcoming UKRN Working Paper;
Data collected by the ORP intervention projects related to their specific objectives.
The ORP’s first such survey, the Open and Transparent Research Practices (OTRP) survey, was conducted by 15 partner institutions in 2022-2023, and provided important insight for planning the ORP intervention projects, especially for prioritising training and informing the Open and Responsible Researcher Reward and Recognition (OR4) project on reform of research assessment. It also provided valuable lessons for future surveys, particularly regarding the importance of a representative sample. The data from the OTRP survey have been published (Hughes-Noehrer et al. 2024).
The ORP evaluation comprises two more surveys: one in 2025 – the subject of this working paper – and a final survey scheduled for 2027. Planning for these two further surveys included learning lessons from the OTRP; coordinating and aligning with other surveys (in particular the Brief Open Research Survey [BORS; Norris 2022]); and drawing on expertise across the ORP.
The aim of the 2025 and 2027 surveys is to assess the prevalence of OR practices, and to provide an evidence base for investigating any observed changes in behaviour or prevalence of OR practices. Such changes will be compared with the various ways in which ORP partners have engaged with the ORP, and inferences will be drawn using a Bayesian approach as part of the 2027 survey.
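The report does not specify the Bayesian model to be used in 2027. Purely as a hedged illustration of the kind of comparison envisaged, the sketch below estimates the posterior probability that the prevalence of a practice has increased between the two surveys, under a simple Beta-Binomial model with independent uniform priors; all counts and the model itself are assumptions, not the ORP's actual analysis plan.

```python
import random

def posterior_prob_increase(k1, n1, k2, n2, a=1, b=1, draws=20000, seed=1):
    """Monte Carlo estimate of P(prevalence_2027 > prevalence_2025) under
    independent Beta(a, b) priors and Binomial likelihoods.
    k1/n1: users/respondents in 2025; k2/n2: the same in 2027 (hypothetical)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(draws):
        p1 = rng.betavariate(a + k1, b + n1 - k1)  # posterior draw, 2025
        p2 = rng.betavariate(a + k2, b + n2 - k2)  # posterior draw, 2027
        hits += p2 > p1
    return hits / draws

# Hypothetical counts: 62% of 1,408 respondents using preprints in 2025
# versus 68% of a notional 1,400 respondents in 2027.
print(posterior_prob_increase(873, 1408, 952, 1400))
```

With a six-point rise at this sample size, the posterior probability of an increase is close to 1; with similar proportions in both years it hovers near 0.5, which is the property that makes this style of inference useful for evaluating change.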
The realisation of this aim was intended to have the following benefits:
For partner institutions, by providing insight and benchmarks on progress toward OR across disciplines and career stages;
For the ORP as a whole, by providing data that, when combined with evidence of the patterns of engagement with different ORP interventions, can help evaluate which interventions have been effective.
The ORP surveys were conducted in an international context where monitoring OR practices was high on the agenda. Apart from an international roll-out of the BORS instrument mentioned above, the Center for Open Science (COS) continued to deploy its Open Scholarship Survey, and many recent, relevant, ad hoc surveys have been reported in the literature (e.g., Mozersky et al. 2021; Gopalakrishna et al. 2022; Scoggins and Robertson 2024; Ferguson et al. 2023; Houtkoop et al. 2018; Baždarić et al. 2021; European Commission Directorate-General for Research and Innovation et al. 2022).
In addition, several initiatives aim to develop more automated approaches to monitoring, including the ORP indicators work noted above, dashboards developed by the Berlin Institute of Health, the ScreenIT group of tools, and various initiatives of the European Union, such as those under the European Open Science Cloud. Attempting to bring some coherence to this landscape is the Open Science Monitoring Initiative, related to the 2021 UNESCO Recommendation on Open Science.
UKRN’s contribution to this move toward coherence is the development of a consensus statement (forthcoming) that outlines how those conducting such surveys, including UKRN, will coordinate their work and adopt the rigorous research practices promoted by the Reproducibility Networks. This recognises that data on OR, whether from surveys or automated tools, are increasingly used in meta-research studies.
2.2 Methods
2.2.1 Instrument design
The design of the survey instrument had to meet several requirements, namely:
elicit information on the prevalence of OR practices in ORP partner institutions across disciplines, to evaluate the success of the programme;
where possible, be consistent with other survey instruments, to strengthen moves toward consistency in the vocabulary and concepts used;
where possible, be consistent with the OTRP survey, although the OTRP’s open sampling strategy would make longitudinal comparisons difficult;
be as concise as possible, to minimise the attrition rate during completion that was observed in the OTRP survey.
Extended discussions, comparisons and mappings between existing instruments led to a decision to base the 2025 survey instrument closely on the BORS instrument. There was much consistency between the BORS and the OR practices covered by the OTRP survey, but the former was considerably simpler. Furthermore, discussions with the BORS team led to an agreement that they would not target ORP partner institutions in their 2025 open survey to avoid duplication; instead, they might use the open data from the 2025 survey to supplement and compare with their survey data, noting the different sampling strategies.
All survey materials, including the survey questionnaire, ethics application and approval, participant information sheet, survey recruitment email, briefing to UKRN Institutional Leads and Open Research Coordinators and Administrators (including the sampling strategy), data analysis preregistration, and sample representativeness analysis, are available on the UKRN Sharing Platform. The survey aggregate report, survey data and visualisation code are forthcoming on the same platform.
2.2.2 Question design
We undertook a comprehensive analysis of the OR practices covered by the OTRP 2022-2023, the BORS and the COS Open Scholarship Survey (OSS). We aligned the coverage from these surveys with the UKRN’s ORP training priorities (see training schema), the UKRN’s ORP indicators pilot priorities and the UNESCO Recommendation on Open Science priority areas. We also compared other key surveys and datasets in the area of OR. We therefore identified thirteen key OR practices as well as the umbrella term of ‘Open Research’ (note: throughout this report ‘OR’ refers to broad open research practices, and ‘open research’ indicates the specific practice listed in the survey question set). To account for disciplinary differences in terminology and definitions of these practices, we included a single-sentence description for each practice, aligned with similar descriptions used by BORS.
Respondents were asked to indicate their awareness of those practices (yes/no) as well as their use of those practices (I use it often / I have used it / I have NOT used it / not applicable to my research). To assess their attitudes, respondents were asked to indicate their level of agreement (Strongly agree / Somewhat agree / Neutral / Somewhat disagree / Strongly disagree) with nine statements reflecting a range of attitudes towards OR practices. To assess potential facilitators of engagement with OR practices, respondents were asked what would help them adopt more OR practices and were allowed to tick up to five options from a list of 12: ten possible facilitators, plus options indicating that they had no intention of adopting more or that they were already doing all they could.
Finally, respondents were asked to provide their discipline based on categories adapted from HESA (Higher Education Statistics Agency), the research methods they use, their main research approach (qualitative, quantitative, both, neither) and their career level. Descriptions of these categories are available in the survey instrument on the UKRN Sharing Platform.
2.2.3 Sample strategy
The novel sampling strategy had to meet several requirements, namely to:
elicit information from a sample that was, as far as possible, representative of the research community in the ORP partner institutions;
be practical for institutions to implement in 2025 and again in 2027, with limited resources;
allow for comparison, where possible, with other datasets.
We took a practical approach to representativeness, focusing on the discipline and seniority of respondents and targeting active researchers (staff and post-graduate students) who would be engaging in OR practices directly, rather than professional services staff who might support and enable them. We focused on sampling across disciplines and seniority levels to support our survey aims. If the sampling strategy proves successful, we anticipate that future investigations might consider stratification across gender and other protected characteristics to answer specific research questions on demographic engagement with OR.
It is important to note that the survey was not administered centrally. Each ORP partner institution deployed it locally. There were several reasons for this, including assurances that personal data about those invited to participate would not leave their institutions.
Discussions within the ORP assessed three potential overall strategies against the three criteria noted above: partners generate the sample using internal data sources; partners work with the central ORP team to create the sample using internal and third-party data sources; or an open survey is conducted and responses weighted.
Comprehensive discussions, pilots, and advice from the University of Bristol Data Protection Officer led to a decision to adopt an approach in which ORP partners were given three options, with a strong preference for option A over option B and for option B over option C.
The options were as follows:
Option A: The ORP partner institutions use their HESA staff and student returns to generate the sample using detailed guidance from the ORP team. Once the sample was created, staff at the ORP partner institution would email respondents directly to invite them to complete the survey, and reminders would be sent to them.
Option B: The ORP team and partner institution would use the publicly available open scholarship database OpenAlex to identify names and/or email addresses of active researchers who published research within a recent specified time frame and whose last known institutional affiliation was with the institution of interest. A small sample of these researchers would be chosen, and the survey would be emailed to them.
Option C: Staff within the institution would circulate the recruitment request and survey link to researchers and postgraduate research students within their institution using local communications channels such as mailing lists, Teams channels, etc.
Among these, Option A was communicated and discussed with ORP partners over several months and was strongly preferred. This gave partner institutions time to liaise with their HESA data teams and navigate their local ethics context. Most of the partners adopted Option A; none adopted Option B or Option C. One institution (Royal Veterinary College) circulated the survey to all research staff and students due to the comparatively small size of the institution’s research population. Three decided not to deploy the survey. In total, 19 institutions deployed the survey and are included in this report: Cardiff University, Keele University, King’s College London, Newcastle University, Oxford Brookes University, Royal Veterinary College, University College London, University of Bristol, University of Edinburgh, University of Glasgow, University of Leeds, University of Liverpool, University of Oxford, University of Reading, University of Sheffield, University of Southampton, University of Surrey, University of Sussex, University of Wolverhampton.
The detailed sampling strategy and guidance to partner institutions are available on the UKRN Sharing Platform.
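Although no partner ultimately adopted Option B, the OpenAlex step it describes can be sketched. The snippet below builds a query against the public OpenAlex works endpoint, filtering by institutional ROR ID and publication date; the filter names follow OpenAlex's documented conventions but should be checked against the current API documentation, and the ROR ID shown is UKRN's own (from this report's front matter), used purely as a placeholder.

```python
from urllib.parse import urlencode

OPENALEX_WORKS = "https://api.openalex.org/works"  # public OpenAlex REST endpoint

def recent_works_query(ror_id, from_date, per_page=200):
    """Build an OpenAlex query URL for works affiliated with the institution
    identified by `ror_id` and published on or after `from_date`.
    Illustrative only: this mirrors Option B, which no partner adopted."""
    params = {
        "filter": f"institutions.ror:{ror_id},from_publication_date:{from_date}",
        "per-page": per_page,
    }
    return f"{OPENALEX_WORKS}?{urlencode(params)}"

# Placeholder example using the UKRN ROR ID from the report's front matter.
# Fetching (e.g. with urllib.request) and sampling authors would follow.
print(recent_works_query("049czkm07", "2023-01-01"))
```

In practice a team following Option B would page through the results, extract author names and any available contact details, and draw a small stratified sample before emailing invitations.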
2.2.4 Deployment
Preliminary briefings, including background rationale, survey instrument questions, sampling strategy guidance and anticipated timeline were shared with ORP partner institutions during Autumn 2024, followed by a final set of documentation in January 2025, including participant recruitment email, participant information sheet, ethics approval and detailed sampling guidance (see UKRN Sharing Platform).
The survey instrument was hosted on the Qualtrics platform at the University of Bristol, and a link was shared with ORP partners. Three partner institutions added institution-specific questions to the survey; these were strictly limited to avoid excessively lengthening the instrument. The adapted survey instruments for each of those three institutions were hosted separately on Qualtrics.
Partner institutions were given from January to July 2025 to open their survey and solicit responses. During this period, we recommended that the survey remain open for at least three weeks. The survey was anonymous and reminders were recommended.
The survey data were analysed using dedicated R scripts developed by the University of Glasgow, to a specification agreed with the central ORP survey team. This generated the aggregate findings described in this report, as well as private institution-specific reports shared with each ORP partner institution, which allowed them to benchmark local OR practices against those in the whole ORP collaboration.
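The analysis itself was carried out in dedicated R scripts (see the UKRN Sharing Platform). Purely as an illustration of the kind of aggregation involved, the hypothetical Python sketch below computes per-practice awareness and use rates from row-level responses; the field names and response strings are assumptions modelled on the survey options described above, not the actual data schema.

```python
from collections import defaultdict

# Response strings mirror the survey options described in Question design;
# the record fields ("practice", "aware", "use") are hypothetical.
USE_POSITIVE = {"I use it often", "I have used it"}

def summarise(responses):
    """Per-practice awareness and use rates (%), pooled across respondents."""
    totals = defaultdict(lambda: {"n": 0, "aware": 0, "used": 0})
    for r in responses:
        t = totals[r["practice"]]
        t["n"] += 1
        t["aware"] += r["aware"] == "yes"
        t["used"] += r["use"] in USE_POSITIVE
    return {p: {"awareness_pct": round(100 * t["aware"] / t["n"], 1),
                "use_pct": round(100 * t["used"] / t["n"], 1)}
            for p, t in totals.items()}

demo = [
    {"practice": "open access", "aware": "yes", "use": "I use it often"},
    {"practice": "open access", "aware": "yes", "use": "I have NOT used it"},
    {"practice": "preregistration", "aware": "no", "use": "not applicable to my research"},
    {"practice": "preregistration", "aware": "yes", "use": "I have used it"},
]
print(summarise(demo))
```

Running the same aggregation once over the pooled dataset and once per institution is what yields, respectively, the aggregate findings reported here and the private institution-specific benchmark reports.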
2.3 Findings
2.3.1 Demographics of Survey Respondents
The survey, conducted across 19 UKRN partner institutions (Figure 1), gathered responses from 1,408 researchers spanning 23 disciplines. Research approaches were diverse: 525 respondents primarily used quantitative methods, 269 qualitative, 528 combined both, and 35 used neither (Figure 2). Health and Biosciences was the largest disciplinary group (576 respondents), including the most represented field – medical-allied subjects (191). Physical Sciences had 328 respondents, followed by Social Sciences (301) and Arts and Humanities (154) (Figure 6). Career levels were well distributed, with junior researchers forming the largest group (526), followed by senior (492) and mid-career researchers (343) (Figure 7).
2.3.2 Awareness and Use of Open Research Practices
This section summarises survey findings on researchers’ awareness and use of key OR practices. It focuses on broad patterns across research approaches, disciplines, and career levels.
Overall, the survey results indicate broad awareness and adoption of OR practices among respondents; however, engagement varies considerably across specific practices, research approaches, and disciplines. At the aggregate level, there was an average difference of 31 percentage points between knowing about a practice and applying it (Figure 9 and Figure 23). The largest disparities between awareness and use were observed for replication studies, citizen science, and open code/software, indicating that researchers might face substantial barriers to implementing these practices despite being relatively aware of them. In contrast, the smallest disparities were for recognising contributions, open access, and ‘open research’, suggesting these practices are more widely understood and readily integrated.
Quantitative and mixed-methods researchers, as well as those in scientific disciplines, generally report higher awareness and use, while qualitative researchers and those in Arts and Humanities disciplines report lower engagement (Figure 10, Figure 17, Figure 24, and Figure 29). Awareness and use also tend to increase with seniority, with some exceptions such as open data, where awareness is similar across career levels (Figure 22 and Figure 31). Overall, this suggests experience and disciplinary norms play a key role in uptake.
2.3.2.1 Open research
Awareness of ‘open research’ as an umbrella term was high (around 85%), with quantitative and mixed-methods researchers reporting the greatest awareness and use (Figure 9 and Figure 16). Awareness and use were consistently strong across detailed methods (generally over 80% awareness and 60–80% use) (Figure 11 and Figure 19).
Disciplinary differences were more pronounced. Archaeology (100%), psychology and neuroscience (97%), veterinary sciences (91%) and computing and information science (91%) showed near-universal awareness of the ‘open research’ umbrella term, while philosophy and religious studies (53%), law (69%), and architecture, building and planning (71%) were lower (Figure 13). Use tended to mirror awareness, with higher engagement in the more aware disciplines (Figure 20). Awareness and use also increased modestly with seniority (Figure 15 and Figure 22).
2.3.2.2 Open access
Open access was the most widely known and used practice, recognised by nearly all respondents (99%) and used by 86% (Figure 9 and Figure 16). Awareness and use were uniformly high across research approaches, research methods, and career levels, though slightly lower among junior researchers (Figure 10, Figure 11, Figure 15, Figure 17, Figure 19, and Figure 22).
Disciplinary variation was minimal, with most fields reporting use above 73%. The exceptions were a few humanities and social science areas, such as literature, language and area studies and economics, where use was somewhat lower (69% and 68%, respectively) (Figure 20).
2.3.2.3 Preprints
Awareness of preprints was also high (91%), with 62% reporting use or frequent use (Figure 9 and Figure 16). Quantitative and mixed-methods researchers had higher awareness and use than qualitative researchers (Figure 10 and Figure 17).
Across disciplines, awareness ranged from 100% in mathematical sciences and 97% in veterinary sciences down to 60% in agriculture and food science and 68% in education and teaching (Figure 13). However, use varied independently of awareness; veterinary sciences, for instance, combined near-universal awareness with only 45% reporting frequent use (Figure 20). Preprint awareness increased slightly with career stage, from 85% among junior researchers to 94% among mid-career and 97% among senior researchers (Figure 15). The disparity in use across career levels was greater, however: only around 50% of junior researchers had used or often used preprints, compared with almost 75% of mid-career and senior researchers (Figure 22).
2.3.2.4 Open data and FAIR data
Open data was widely known (94%) and moderately used (62%), while FAIR data had lower awareness (51%) and use (33%) (Figure 9 and Figure 16). Both awareness and use were highest among quantitative researchers, and use increased with seniority (Figure 10, Figure 17, and Figure 22).
Open data awareness was high across all disciplines (ranging from 83%-100%), while FAIR data awareness varied widely, from 26% in philosophy and religious studies to 80% in agriculture and food science (Figure 13). Use varied considerably, but was generally more substantial in quantitatively oriented fields such as physical sciences (73% open data, 44% FAIR data) and computing and information sciences (76% open data, 57% FAIR data) than in philosophy and religious studies (22% open data, 0% FAIR data) (Figure 20).
2.3.2.5 Open protocols, open materials and open code/software
Open materials, open protocols and open code/software were all relatively well known, with overall awareness of 83%, 77%, and 88% respectively (Figure 9). In all three cases, awareness and use were higher among quantitative researchers than qualitative researchers, to varying degrees (Figure 10 and Figure 17). Notably, open materials were better known (76%) and more used (34%) by qualitative researchers than open protocols (61% and 20%, respectively) or open code/software (61% and 8%, respectively).
2.3.2.6 Open peer review
Open peer review was well known (82%) but used by fewer than half of respondents (42%) (Figure 9 and Figure 16). Awareness was slightly higher among quantitative and mixed-methods researchers (88% and 83%, respectively) than among qualitative researchers (71%), with a similar pattern for use (Figure 10 and Figure 17). Disciplinary patterns mirrored those seen elsewhere, with greater awareness in Health and Biosciences and lower awareness in Arts and Humanities (Figure 13). Across career levels, awareness was reasonably high, but use was again appreciably lower, with only 52% of senior researchers having used open peer review (Figure 15 and Figure 22).
2.3.2.7 Preregistration and replication studies
Preregistration was among the least known and used practices, with 55% aware and around 25% having used it (Figure 9 and Figure 16). Replication studies were better known (84% awareness) but similarly seldom used (28%) (Figure 9 and Figure 16). Both practices had higher awareness (63% preregistration, 90% replication studies) and use (27% preregistration, 36% replication studies) among quantitative researchers (Figure 10 and Figure 17). Awareness and use of preregistration were highest in Health and Biosciences disciplines, particularly psychology and neuroscience (93% awareness and 63% use). Although awareness of replication studies was relatively high across disciplines (45%-100%), use was highest in Health and Biosciences and Physical Sciences (Figure 13 and Figure 20). Awareness and use of both practices did not differ notably by career level (Figure 22).
2.3.2.8 Co-production and citizen science
Co-production had relatively high awareness (80%) and moderate use (54%) (Figure 9 and Figure 16). Awareness was strongest among qualitative researchers (89%) and those in the Arts and Humanities, and Social Sciences (Figure 10 and Figure 13). Citizen science was less well known (66%) and rarely used (13%) (Figure 9 and Figure 16), with use concentrated mainly in archaeology (39%) and architecture, building and planning (38%) (Figure 20). Use of both practices was highest at senior levels (Figure 22).
2.3.2.9 Recognising contributions
Awareness of recognising contributions (e.g., CRediT taxonomy) was high (83%), and use substantial (70%) (Figure 9 and Figure 16). Quantitative and mixed-methods researchers were most engaged, in terms of awareness and use (Figure 10 and Figure 17). Use varied across disciplines and tended to rise with career level (Figure 20 and Figure 22).
2.3.3 Attitudes and Facilitators
2.3.3.1 Attitudes
Overall, respondents expressed positive attitudes towards OR. Most agreed that OR is useful (80%), that they know which practices are relevant to their work (Comprehension, 73%), and that OR is expected within their research communities (Norms, 68%); many also felt supported by managers/supervisors (Support, 60%). A notable proportion expressed a desire to engage more with OR practices than they currently do (Participation, 62%). Fewer (49%) knew where to find further information and guidance (Resources), and only 42% felt that their institutions offered adequate training (Figure 23).
Qualitative researchers tended to be less confident about which OR practices apply to their work, how to implement them, and whether they are expected within their disciplines (Figure 24).
Across all groups, few believed OR practices were considered in hiring or promotion decisions (18%), especially in certain disciplines, such as history and classics, and business and management, where disagreement exceeded 90% (Figure 25).
Finally, differences among career levels were modest overall (Figure 26). That said, junior researchers were the least confident and the most eager to engage more with OR, while senior researchers were more confident but less focused on increasing participation.
2.3.3.2 Facilitators
Interest in potential facilitators was broad and balanced, with practical support (infrastructure, time, guidance, prioritisation, and ethical clarity) rated slightly higher than social or institutional drivers. Few respondents (7%) felt they were already doing all they could (Figure 27).
Guidance was the most consistently endorsed facilitator, followed by incentives and improved time allocation. Training demand was moderate overall, but higher among qualitative and junior researchers (Figure 29 and Figure 31). Senior and mid-career respondents prioritised time, infrastructure, and recognition over training (Figure 31).
2.3.3.3 Links Between Attitudes and Facilitators
Comparing attitudes and facilitators reveals several themes. Training is one example: although fewer than half of respondents felt institutional training meets researchers’ needs, perhaps partly because only 49% knew where to find resources, relatively few (32%) requested more training, suggesting a need for better-targeted, more visible, and more relevant provision (Figure 23 and Figure 27).
Reward and recognition of OR in hiring and promotion remain limited. Most respondents reported that OR is rarely considered in hiring or promotion. While only 30% saw formal recognition in recruitment and promotion as a key facilitator, more respondents (39%) identified incentives such as awards or funding as effective motivators (Figure 23 and Figure 27).
Finally, while confidence around the use of OR practices and disciplinary norms was generally high (59% and 68%, respectively), difficulty in prioritising which practices matter most in a respondent’s research field emerged as a more substantial barrier (selected as a facilitator by 40%), in conjunction with time allocation (42%) (Figure 23 and Figure 27), the latter particularly for senior researchers (49%) (Figure 31). These findings suggest that clearer guidance on prioritisation and better support for time management are key to increasing OR engagement.
2.4 Discussion
The findings from the 2025 ORP survey contribute to a growing body of international evidence on the awareness and adoption of, and attitudes toward, OR practices. They provide insights into the UK context and into institutional-level variation across the ORP network. Overall, the results show high levels of awareness of most OR practices (an encouraging foundation) alongside persistent structural and cultural barriers to their full embedding.
While many findings align with existing literature and expectations (e.g., Houtkoop et al. 2018; Ferguson et al. 2023; Gopalakrishna et al. 2022; Scoggins and Robertson 2024), several results from the 2025 ORP survey offer novel insights.
Although awareness of open data practices was consistently high across disciplines and research methods, we were surprised to find low awareness of FAIR data among participants. Set against the wider OR movement, in which FAIR data is a key focus, this might highlight a gap in communication or training for researchers.
Received wisdom and some research hold that the most significant barrier to engaging with OR practices is one of time. The wider literature identifies administrative burden as a consistent barrier in the research ecosystem (Gownaris et al. 2022). We were therefore surprised to find that time allocation was not identified as the standout facilitator in our sample. In fact, while many researchers did identify time allocation as a facilitator (42%), a similar proportion of researchers identified a wide range of other facilitators as key. The pattern of responses suggests that, on the whole, facilitators improving know-how or practical concerns (e.g., infrastructure, time allocation, guidance, prioritisation, ethical clarity) were somewhat more popular than facilitators relating to cultural factors (e.g., disciplinary norms, recognition in hiring and promotion, or support from colleagues). This distribution supports the COS pyramid of culture change, which proposes that foundational elements like infrastructure and ease of use should precede normative and policy-level changes, suggesting that capacity building in these areas is still lacking (Cuevas, Errington, and Mellor 2022). However, the relatively balanced endorsement across levels suggests that researchers may simultaneously perceive multiple barriers, and that change strategies should be responsive to this complexity. This is exemplified in the UNESCO Open Science Outlook 1: status and trends around the world (UNESCO 2023), which builds on the COS pyramid of culture change to indicate the often intertwined nature of these areas.
Interestingly, some results appeared to conflict across different sections of the survey. For instance, while a minority of participants felt that their institution provided adequate training, training was not a standout facilitator. This likely reflects nuances in question wording or interpretation, reinforcing the idea that how we measure matters and suggesting that mixed-methods approaches may be needed to understand researcher behaviour and motivation fully. We could interpret these data to infer that, while some researchers believe training is inadequate, they do not think a lack of training is the barrier to OR engagement, which is further supported by broad awareness of OR practices across the whole sample.
Some needs identified are broadly shared across research methods and career levels, such as the desire for more precise guidance. In contrast, others are more specific to particular research approaches or epistemic traditions, such as infrastructure and training.
Our analysis of these data identifies some surprising findings that warrant further investigation. Namely, we noted an unexpectedly high proportion of researchers who claimed to be using specific methods, such as the CRediT taxonomy, to recognise contributions. It is, of course, possible that the use of such methods to recognise contributions is increasing at an unexpected pace, but we note that it is also entirely possible that survey respondents have misinterpreted the definition of “recognising contributions”.
2.4.1 Comparisons with wider literature
The high levels of awareness and use of core practices such as open access, open data, and preprints mirror international trends. Surveys conducted by the COS (2021–2023) and within the European Commission’s Open Science Monitor (2022) similarly found that open access is now near-universal, with open data and preprints following close behind but still constrained by disciplinary norms and infrastructure. The relatively lower awareness and use of practices such as preregistration, open peer review, and replication studies are consistent with this broader evidence, suggesting that the diffusion of OR remains uneven across methods and disciplines. Our data suggest that while there are some disciplinary differences in awareness of OR, the key disciplinary differences are seen in the variable relationship between awareness and use across disciplines. This offers insight into future opportunities for supporting the development of engagement with OR in underrepresented disciplines.
A notable feature of the 2025 ORP data is the contrast between quantitative and qualitative researchers. Similar divides have been reported internationally (e.g., Mozersky et al. 2021; Baždarić et al. 2021), where OR frameworks are often perceived as being more readily applicable to quantitative paradigms. The present findings confirm that this remains a major axis of differentiation in engagement with OR practices. This underscores the importance of current efforts, including those by the ORP and UKRN partners, to articulate how transparency, sharing, and reproducibility can be expressed in diverse epistemic and methodological traditions.
The finding that senior researchers report higher awareness and use than their junior colleagues is also supported by prior studies (e.g., Ferguson et al. 2023; Norris et al. 2022), though it may not hold in all fields (Silverstein 2024). However, our data provide a nuanced picture: junior researchers express strong motivation to engage more with OR, despite lower confidence and less institutional support. This suggests that early-career researchers may represent a key leverage point for culture change, provided they receive adequate training, mentorship, and clear signals of institutional recognition.
2.4.2 Institutional and systemic factors
As in other studies, institutional support emerges as a decisive factor. The gap between generally positive attitudes toward OR and the relatively low agreement that institutions provide adequate training and resources mirrors findings from European and North American contexts (e.g., European Commission Directorate-General for Research and Innovation et al. 2022) and from other contexts (Chakravorti, Koneru, and Rajtmajer 2025). The limited integration of OR criteria into assessment and promotion processes further reinforces structural disincentives that constrain behavioural change. We note strikingly strong disagreement with the statement on use of OR in assessment and promotion across our sample, and reflect that this might indicate some dissent on this topic that our survey questions did not adequately prompt for. This resonates with the conclusions of the UNESCO Recommendation on Open Science (2021), which identifies reform of research assessment and recognition systems as essential to embedding open science principles. This call is also endorsed by initiatives such as the Declaration on Research Assessment (DORA) and the Coalition for Advancing Research Assessment (CoARA), both of which advocate for more inclusive, transparent, and responsible evaluation practices.
The findings also reinforce the interdependence between institutional and disciplinary norms. In line with work by Scoggins and Robertson (2024), our results suggest that while most researchers understand the relevance of OR to their field, many, especially those in qualitative or applied disciplines, seek greater clarity on prioritisation. We might argue that a greater understanding of which practices are most relevant, and what good practice looks like in a particular research context would be valuable for informing prioritisation. This highlights the value of domain-sensitive guidance and community-led examples, such as those being developed under the ORP and by other international reproducibility networks.
2.4.3 Implications for monitoring and evaluation
The 2025 ORP survey contributes to international efforts to harmonise monitoring of OR practices, including by aligning with the BORS (Norris et al. 2022). The survey is informed by UKRN involvement with a wide variety of monitoring initiatives, including the Open Science Monitoring Initiative (OSMI) and the monitoring efforts of the UNESCO Recommendation on Open Science. Preliminary work (publication forthcoming) established the wider consensus on survey monitoring and aimed to embed best practice in our methodology. In this sense, the 2025 ORP survey is not only a tool for local evaluation but also part of a broader effort to establish shared, comparable metrics for openness and transparency. The adoption of sampling from institutional HESA data marks a distinct methodological approach, offering a practical, privacy-compliant route to generating more representative datasets than open recruitment models typically allow. Our approach, if sustained in 2027, could provide one of the most robust longitudinal datasets on OR practices currently available.
2.4.4 Theoretical and practical implications
Viewed through the lens of behavioural and cultural change, the findings suggest that awareness alone is no longer the main barrier to OR. Instead, the persistence of uneven uptake points to the complex interplay of incentives, infrastructures, disciplinary norms, and perceptions of feasibility, findings echoed by meta-research on open science adoption (e.g., Allen and Mehler 2019; Nosek et al. 2022). Addressing these structural issues may require a dual strategy: embedding OR expectations and recognition within institutional frameworks, while providing discipline-sensitive training, exemplars, and guidance to make OR practices feel relevant and achievable.
2.4.5 Limitations
This survey aimed to gather insights from a wide range of researchers across a variety of research methods, with disciplinary diversity and inclusivity in mind. This offers insight into disciplinary variation in engagement with OR practices, but it also raises limitations, particularly around the taxonomy of OR. We are aware that different disciplines have different relationships to some OR practices, and during survey question design we carefully considered ways to mitigate variation in understanding and interpretation. Nevertheless, discipline-specific interpretation of OR practices, or discipline-specific terminology for those practices, might have influenced the responses given. We also note that these data do not lend themselves to formal statistical analysis.
The analysis of sample representativeness (Section 3.4) is, unfortunately, inconclusive. This is likely due to two factors. First, despite best efforts and targeted sampling, response rates remained moderate, so self-selection bias likely remained an important factor. Second, the mapping between respondent categories (research field, career level) and the population (HESA) data was very approximate. While sampling was conducted using the categories available in the population data, the instrument used different sets of categories for consistency with previous surveys (Norris et al. 2022; Hughes-Noehrer et al. 2024). Because the survey was anonymous, we had to rely on respondent self-report for these categories and map them onto the HESA categories, and this mapping was approximate and uncertain. We therefore have no strong evidence that the sampling method produced a more representative sample than open recruitment would have. If that is a true reflection of responses, it would suggest that researchers’ intrinsic motivation to respond remains the main factor in whether they do. This would imply that extrinsic factors such as targeted and personalised sampling can be expected to have only limited effects unless adopted more strongly than in this survey, for example by collecting identifiable responses and thereby enabling personalised non-response follow-up.
One possible next step would be to weight the responses according to the population distribution and provide a re-analysis based on those weighted responses. UKRN may undertake that re-analysis if we have capacity.
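The weighting proposed above could follow a standard post-stratification approach: weight each respondent by the ratio of their stratum’s population share to its sample share, then compute weighted proportions. A minimal sketch, using purely illustrative career-level categories and counts (not the survey’s actual data or the HESA mapping):

```python
from collections import Counter

# Hypothetical respondents: (career_level, uses_open_data) -- illustrative only
respondents = [
    ("junior", 1), ("junior", 0), ("mid", 1),
    ("senior", 1), ("senior", 0), ("senior", 1),
]

# Hypothetical population shares (e.g., mapped from HESA categories)
population_share = {"junior": 0.50, "mid": 0.30, "senior": 0.20}

# Sample shares observed in the (illustrative) responses
counts = Counter(level for level, _ in respondents)
n = len(respondents)
sample_share = {level: c / n for level, c in counts.items()}

# Post-stratification weight: population share / sample share, per stratum
weights = {level: population_share[level] / sample_share[level]
           for level in counts}

# Weighted estimate of the proportion reporting open data use
weighted_use = (
    sum(uses * weights[level] for level, uses in respondents)
    / sum(weights[level] for level, _ in respondents)
)
print(round(weighted_use, 3))
```

Note that because the survey-to-HESA category mapping is itself approximate, any such re-analysis would inherit that uncertainty, and results should be reported alongside the unweighted estimates.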
2.4.6 Reflexivity statement
The 2025 ORP survey was designed, developed and delivered by a team based across UK higher education institutions and involved in the UKRN’s ORP. This means all team members engage in ongoing work to ‘accelerate the adoption of high-quality OR practices’ (UKRN, no date) and were therefore likely to have positive views on OR and its practices. Throughout the design process, team members reflected and consulted on possible limitations of survey questions and options, aiming to be as inclusive as possible across all disciplines, methods, and researchers, whilst acknowledging the constraint that the output data should remain comparable to national data and previous surveys.
2.5 Conclusions
The 2025 ORP survey situates UK efforts to advance OR within a maturing global movement. Awareness and uptake of foundational practices are now widespread, yet deeper cultural and systemic changes, particularly around research assessment, incentives, and the adaptation of OR principles to different epistemic traditions, remain works in progress. The findings, therefore, both validate the trajectory of the ORP and highlight areas where further coordination, evidence sharing, and institutional commitment are required to ensure that OR becomes the default across all disciplines and career levels.
2.6 References
3 Supplementary information
Generative AI was employed to assist in developing initial concepts for the Executive Summary (Microsoft Copilot GPT-4) and Discussion section (ChatGPT-5); all final interpretations and conclusions are the authors’ own.
3.1 Ethics
The study had ethics approval from the University of Bristol, reference 23351. Some of the participating institutions secured their own supplementary ethics approvals.
3.2 Protocol
The protocol for the ORP Evaluation Design project, of which the survey reported here is a part, is available at https://oercommons.org/courses/orp-evaluation-protocol-2?__hub_id=179
3.3 Materials and data
Study materials, including the survey instrument, consent forms and information sheets are available at https://oercommons.org/courseware/lesson/137449?__hub_id=179
3.4 Annex
The sample representativeness analysis is available here: https://osf.io/k8rtz/files/xegm3
4 Data Visualisation
4.1 Information about Respondents
Our 1408 respondents come from 23 disciplines.
4.1.1 Institutions
4.1.2 Research Approach
4.1.3 Detailed Research Method
4.1.4 Disciplines
4.1.5 Career Level
4.1.6 Career Level by Discipline
4.2 Awareness of Open Research Practices
4.2.1 Awareness Overall
4.2.2 Awareness by Research Approach
4.2.3 Awareness by Detailed Research Method
4.2.4 Awareness by Discipline
4.2.5 Awareness by Discipline by Practice
4.2.6 Awareness by Career Level
4.3 Use of Open Research Practices
4.3.1 Use Overall
See Table 1 for a tabular representation.
4.3.2 Use by Research Approach
See Table 2 for a tabular representation.
4.3.3 Use by Detailed Research Method
See Table 3 for a tabular representation.
4.3.4 Use by Discipline
See Table 4 for a tabular representation.
4.3.5 Use by Discipline by Practice
See Table 4 for a tabular representation.
4.3.6 Use by Career Level
See Table 5 for a tabular representation.
4.4 Attitudes
Respondents were asked to rate their agreement (Strongly agree / Somewhat agree / Neutral / Somewhat disagree / Strongly disagree) with nine statements representing various attitudes about open research practices. The statements are listed below; a one-word label has been added at the beginning of each, to use on the Y-axes of the figures in this section.
- Note that for the question on Assessment, respondents were given a ‘not applicable’ choice as well. These responses (n = 459) were excluded from the figures below.
- Usefulness: I think Open Research is useful for researchers like me
- Participation: I would like to participate in Open Research practices more than I already do
- Confidence: I am confident I know how to use Open Research practices in my research
- Comprehension: I understand which Open Research practices would be relevant to use in my research
- Support: I have support from my line manager/supervisor to use Open Research practices in my research
- Training: My institution provides the training I need for best practice in Open Research
- Norms: Open Research practices are expected in my research community (e.g., among my research group, department, or peers)
- Resources: I know where to go to learn more about Open Research practices
- Assessment: I have used engagement with Open Research as a criterion when assessing someone else for hiring or promotion, or had engagement with Open Research used as a criterion in my own assessment at my institution
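The exclusion rule above (dropping ‘not applicable’ responses before computing agreement) can be sketched as follows, with a made-up set of responses to a single item rather than the survey’s actual data:

```python
from collections import Counter

# Illustrative responses to one attitude item (not the survey data)
responses = [
    "Strongly agree", "Somewhat agree", "Somewhat agree", "Neutral",
    "Somewhat disagree", "Not applicable", "Strongly disagree",
    "Not applicable",
]

# Exclude 'Not applicable' before tabulating, as done for the Assessment item
valid = [r for r in responses if r != "Not applicable"]
counts = Counter(valid)

# Percentage agreeing (strongly or somewhat) among valid responses
agree = counts["Strongly agree"] + counts["Somewhat agree"]
pct_agree = 100 * agree / len(valid)
print(f"{pct_agree:.0f}% agree (n = {len(valid)})")
```

The same tabulation applies to each of the nine items; only the Assessment item requires the extra filtering step.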
4.4.1 Attitudes Overall
See Table 6 for a tabular representation.
4.4.2 Attitudes by Research Approach
See Table 7 for a tabular representation.
4.4.3 Attitudes by Discipline
See Table 8 for a tabular representation.
4.4.4 Attitudes by Career Level
See Table 9 for a tabular representation.
4.5 Facilitators
Respondents were asked ‘What would help you to use more open research practices?’ and could select up to 5 of the response options. The 12 response options are listed below; a short label has been added at the beginning of each, to use on the Y-axes of the figures in this section.
- Guidance: More, or better organised, information and guidance
- Training: More training or mentorship
- Ethical clarity: Understanding ethical and legal issues (e.g. issues around data sharing)
- Prioritisation: Knowing what practices to prioritise / what really matters in my field
- Infrastructure: Tools and infrastructure (e.g. sufficient storage for open data)
- Time allocation: More time / workload dedicated to open research
- Incentives: Incentives (e.g., awards / funding) from my funder or institution
- Recognition: Recognition of Open Research in promotion and recruitment criteria
- Peer support: Support from colleagues (e.g., supervisors, students, technicians, administrators, librarians)
- Disciplinary norms: More norms and positive beliefs in my discipline encouraging open research
- No intention: Nothing, I do not plan to take up open research practices
- Already engaged: Nothing, I am already doing everything I think is applicable
4.5.1 Facilitators Overall
See Table 10 for a tabular representation.
4.5.2 Facilitators by Research Approach
See Table 11 for a tabular representation.
4.5.3 Facilitators by Discipline
See Table 12 for a tabular representation.
4.5.4 Facilitators by Career Level
See Table 13 for a tabular representation.