Research Article | Open Access
Emma Norris, Yiwei He, Rachel Loh, Robert West, Susan Michie, "Assessing Markers of Reproducibility and Transparency in Smoking Behaviour Change Intervention Evaluations", Journal of Smoking Cessation, vol. 2021, Article ID 6694386, 12 pages, 2021. https://doi.org/10.1155/2021/6694386
Assessing Markers of Reproducibility and Transparency in Smoking Behaviour Change Intervention Evaluations
Introduction. Activities promoting research reproducibility and transparency are crucial for generating trustworthy evidence. Evaluation of smoking interventions is one area where vested interests may motivate reduced reproducibility and transparency. Aims. To assess markers of transparency and reproducibility in smoking behaviour change intervention evaluation reports. Methods. One hundred evaluation reports of smoking behaviour change intervention randomised controlled trials published in 2018-2019 were identified. Reproducibility markers of pre-registration; protocol sharing; data, material, and analysis script sharing; replication of a previous study; and open access publication were coded in identified reports. Transparency markers of funding and conflict of interest declarations were also coded. Coding was performed by two researchers, with inter-rater reliability calculated using Krippendorff’s alpha. Results. Seventy-one percent of reports were open access, and 73% were pre-registered. However, only 13% provided accessible materials, 7% accessible data, and 1% accessible analysis scripts. No reports were replication studies. Ninety-four percent of reports provided a funding source statement, and 88% provided a conflict of interest statement. Conclusions. Open data, materials, analysis scripts, and replications are rare in smoking behaviour change interventions, whereas funding source and conflict of interest declarations are common. Future smoking research should be more reproducible to enable knowledge accumulation. This study was pre-registered: https://osf.io/yqj5p.
Researchers are becoming increasingly aware of the importance of reproducibility and transparency in scientific research and reporting [1, 2]. A well-documented “replication crisis” in psychology and other disciplines has shown that engrained academic incentives encouraging novel research have led to biased and irreproducible findings [3–6]. Researchers, journals, and funding organisations across psychology and health sciences are contributing to reforming scientific practice to improve the credibility and accessibility of research [1, 7].
“Open Science,” where some or all parts of the research process are made publicly and freely available, is essential for increasing research transparency, credibility, reproducibility, and accessibility. Reproducibility-facilitating research behaviours are varied and occur throughout the research life cycle. During study design, pre-registrations and protocols specify the hypotheses, methods, and analysis plan to be used in proposed subsequent research, in repositories such as the Open Science Framework and AsPredicted. Such specification is designed to reduce researcher degrees of freedom and undisclosed flexibility, ensuring features such as primary and secondary hypotheses and analysis plans remain fixed and preventing “p-hacking”. Within health research, pre-registration and protocol sharing also facilitate future replication and real-world adoption of medical and behavioural interventions. During data analysis, scripts can be made more reproducible by marking their code with step-by-step comments, improving clarity and replication. During dissemination, materials (such as intervention protocols and questionnaires), data, and analysis scripts can be made available by uploading to repositories such as the Open Science Framework or GitHub, facilitating the replication of effective research and interventions. Making data and trial reports available regardless of their findings enables a more accurate picture of the full state of research, minimising the “file drawer” problem, whereby positive findings are more likely to be published than negative findings. Sharing data and analysis code also allows for checking of research findings and conclusions, as well as easier synthesis of related findings via meta-analyses. Transparency-facilitating research behaviours include reporting sources of research funding and conflicts of interest [16, 17]. These are important in that they help readers to make informed judgements about potential risks of bias.
Metascience studies have assessed markers of reproducibility and transparency in the related domains of psychology and the life sciences. A recent study of 250 psychology studies of varying designs published between 2014 and 2017 found transparency and reproducibility behaviours to be infrequent. Although public availability of studies via open access was common (65%), sharing of research resources was low for materials (14%), raw data (2%), and analysis scripts (1%). Pre-registration (3%) and study protocols (0%) were also infrequent. Transparency of reporting was inconsistent for funding statements (62%) and conflict of interest disclosure statements (39%). Metascience studies have assessed reproducibility and transparency across other disciplines, including 250 studies in the social sciences, 149 studies in biomedicine, and 480 studies across two journals in biostatistics, all with no restrictions on study design. Other research has focused on the prevalence of specific reproducibility behaviours, such as open access publication, found in about 45% of scientific articles assessed in 2015.
However, the extent of reproducibility and transparency behaviours in public health research, including smoking cessation, is currently unclear. A recent investigation of randomised controlled trials addressing addiction found data sharing to be nonexistent: none of the 394 included trials made their data publicly available, with 31.7% of the included trials addressing tobacco addiction. It must be noted that various persistent barriers to data sharing exist, including technical, motivational, economic, political, legal, and ethical considerations (van Panhuis et al., 2014), which may limit the uptake of this specific Open Science behaviour. Markers of wider reproducibility behaviours are yet to be assessed in addiction research.
Transparent reporting in terms of funding and conflicts of interest is especially crucial for smoking cessation, where tobacco and pharmaceutical companies fund some research directly or indirectly. Such vested interests may distort the reporting and interpretation of results, and this may especially be the case in areas of controversy such as e-cigarette research [13, 17, 26, 27]. The aim of the current study is to assess markers of (i) reproducibility and (ii) transparency within smoking intervention evaluation reports.
2.1. Study Design
This was a retrospective observational study with a cross-sectional design. Sampling units were individual behaviour change intervention reports. This study applied a methodology used to assess reproducibility and transparency in the wider psychological sciences and social sciences to the context of smoking randomised controlled trial intervention reports. This study was pre-registered: https://osf.io/yqj5p. All deviations from this protocol are explicitly acknowledged in the appendix.
2.2. Sample of Reports
The Cochrane Tobacco Group Specialised Register of controlled trials was searched in November 2019, identifying 1630 reports from 2018 to 2019. Inclusion criteria were randomised controlled trials published in 2018 and 2019. Exclusion criteria were trial protocols, abstract-only entries, and economic or process evaluations. Of the 157 reports remaining after applying these criteria, 100 were selected using a random number generator, due to time and resource constraints. PDFs were obtained from journal websites. These reports were also already included in the ongoing Human Behaviour-Change Project ([28, 29], https://osf.io/efp4x/), which works to synthesise published evidence in behaviour change, beginning with smoking intervention evaluations. A list of all 100 reports included in this study is available: https://osf.io/4pfxm/.
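The subsampling step can be sketched as follows. This is an illustration only: the function name and seed are hypothetical, as the paper does not state how the random number generator was initialised.

```python
import random

def sample_reports(eligible_ids, k=100, seed=2019):
    """Draw a reproducible random subsample of k reports from the
    eligible pool. The seed value here is illustrative only."""
    rng = random.Random(seed)  # local generator, avoids touching global state
    return sorted(rng.sample(eligible_ids, k))

# e.g. 157 eligible reports remaining after inclusion/exclusion criteria
subsample = sample_reports(list(range(1, 158)))
```

Using a seeded local generator means the same subsample can be re-drawn later, which supports the reproducibility aims the paper itself assesses.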
Article characteristics extracted in this study were as follows: (i) the 2018 journal impact factor for each report, using the Thomson Reuters Journal Citation Reports facility, and (ii) the country of the corresponding author (Table 1). Additional article characteristics already extracted as part of the Human Behaviour-Change Project are also reported: (iii) smoking outcome behaviour (smoking abstinence, onset, reduction, quit attempt, or second-hand smoking) and (iv) behaviour change techniques (BCTs) in the most complex intervention group, coded using the Behaviour Change Techniques Taxonomy v1. In short, data from the Human Behaviour-Change Project were extracted using EPPI-Reviewer software by two independent reviewers before their coding was reconciled and agreed. The full process of manual data extraction within the Human Behaviour-Change Project is described elsewhere. All extracted data on included papers are available: https://osf.io/zafyg/.
If a response marked with an asterisk is selected, the coder is asked to provide more detail in a free-text response box. Note: identified measured variables have been adapted from a previous study assessing transparency and reproducibility in the psychological sciences.
Markers of research reproducibility were assessed by recording the presence of the following in included reports: (i) pre-registration: whether pre-registration was reported as carried out, where the pre-registration was hosted (e.g., Open Science Framework and AsPredicted), whether it could be accessed, and what aspects of the study were pre-registered; (ii) protocol sharing: whether a protocol was reported as carried out and what aspects of the study were included in the protocol; (iii) data sharing: whether data was available, where it was available (e.g., online repository such as Open Science Framework, upon request from authors, as a journal supplementary file), whether the data was downloadable and accessible, whether data files were clearly documented, and whether data files were sufficient to allow replication of reported findings; (iv) material sharing: whether study materials were available, where they were available (e.g., online repository such as Open Science Framework, upon request from authors, as a journal supplementary file), and whether the materials were downloadable and accessible; (v) analysis script sharing: whether analysis scripts were available, where they were available (e.g., online repository such as Open Science Framework, upon request from authors, as a journal supplementary file), and whether the analysis scripts were downloadable and accessible; (vi) replication of a previous study: whether the study claimed to be a replication attempt of a previous study; and (vii) open access publication: whether the study was published as open access.
Markers of research transparency were assessed by recording the presence of the following in included reports: (i) funding sources: whether funding sources were declared and if research was funded by public organisations (such as research councils or charities), pharmaceutical, tobacco, or other companies; (ii) conflicts of interest: whether conflicts of interest were declared and whether conflicts were with public organisations (such as research councils or charities), pharmaceutical, tobacco, or other companies. All measured variables are shown in Table 1.
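As a rough illustration, one row of such a coding form could be represented as a record like the following; the field names are hypothetical and are not the authors' actual variable names, which are given in Table 1.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ReportCoding:
    """One coded report. Field names are illustrative only, mirroring the
    reproducibility and transparency markers described in the Methods."""
    report_id: str
    open_access: bool
    preregistered: bool
    prereg_registry: Optional[str] = None   # e.g. "ClinicalTrials.gov"
    prereg_accessible: bool = False
    protocol_available: bool = False
    materials_available: bool = False
    data_available: bool = False
    data_location: Optional[str] = None     # e.g. "journal supplementary file"
    script_available: bool = False
    replication_study: bool = False
    funding_declared: bool = False
    funding_sources: List[str] = field(default_factory=list)  # e.g. ["public"]
    coi_declared: bool = False
    coi_sources: List[str] = field(default_factory=list)

# Example record for a hypothetical pre-registered, open access report
example = ReportCoding(report_id="r001", open_access=True,
                       preregistered=True, prereg_registry="ClinicalTrials.gov",
                       prereg_accessible=True)
```

Structuring each report as one record of this kind makes the later tallies (e.g. 73/100 pre-registered) simple counts over the collection.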
Data collection took place between February and March 2020. Data for all measures were extracted onto a Google Form (https://osf.io/xvwjz/). All reports were independently coded by two researchers. Any discrepancies were resolved through discussion, with input from a third researcher if required.
Research reproducibility was assessed using the markers of pre-registration; sharing of protocols, data, materials, and analysis scripts; replication; and open access publishing (Table 1). Research transparency was assessed using the markers of funding source and conflicts of interest declarations. Inter-rater reliability of the two researchers’ independent coding was calculated using Krippendorff’s alpha in Python 3.6 (https://github.com/HumanBehaviourChangeProject/Automation-InterRater-Reliability).
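For two coders assigning nominal (categorical) codes with no missing values, Krippendorff’s alpha reduces to one minus the ratio of observed to expected disagreement. The following is a minimal sketch of that computation, not the project’s actual script (which is linked above):

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(coder_a, coder_b):
    """Krippendorff's alpha for exactly two coders, nominal data,
    no missing values. coder_a and coder_b are equal-length sequences."""
    # Coincidence matrix: each unit contributes both ordered pairs of its
    # two values, each weighted by 1/(m-1) = 1 since m = 2 coders.
    o = Counter()
    for a, b in zip(coder_a, coder_b):
        o[(a, b)] += 1
        o[(b, a)] += 1
    n_c = Counter()                       # marginal totals per category
    for (c, _k), count in o.items():
        n_c[c] += count
    n = sum(n_c.values())                 # = 2 * number of units
    # Observed disagreement: proportion of mismatched value pairs.
    d_o = sum(count for (c, k), count in o.items() if c != k) / n
    # Expected disagreement under chance pairing of values.
    d_e = sum(n_c[c] * n_c[k] for c, k in permutations(n_c, 2)) / (n * (n - 1))
    # If only one category ever appears, disagreement is undefined;
    # returning 1.0 here is a simplifying convention for this sketch.
    return 1.0 if d_e == 0 else 1 - d_o / d_e
```

For example, two coders who agree on every report yield an alpha of 1, while one disagreement out of four binary codings yields roughly 0.53.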
Inter-rater reliability was assessed as excellent across all coding. Full data are provided on OSF: https://osf.io/sw63b/.
3.1. Sample Characteristics
Seventy-one out of 100 smoking behaviour change intervention reports were published in 2018 and 29 in 2019. Four of the 100 reports had no 2018 journal impact factor, with the remaining 96 having impact factors ranging from 0.888 to 70.67. Fifty-four out of 100 reports took place in the United States of America (https://osf.io/j2zp3/). Data from the Human Behaviour-Change Project identified that out of the 100 reports, 94 had a primary outcome behaviour of smoking abstinence, two each of smoking onset and smoking reduction, and one each of quit attempts and second-hand smoking. Forty-six of the total 93 behaviour change techniques (BCTs) within the Behaviour Change Techniques Taxonomy (BCTTv1) were identified in the included reports, with an average of 4.41 BCTs per report. The most commonly identified BCTs were social support (unspecified) (BCT 3.1), pharmacological support (BCT 11.1), problem solving (BCT 1.2), and goal setting (behaviour) (BCT 1.1). A figure of all outcome behaviour and BCT codings can be found at https://osf.io/6w3f4/.
3.2. Markers of Reproducibility in Smoking Behaviour Change Intervention Evaluation Reports
Final reconciled coding of reproducibility and transparency for all smoking behaviour change intervention reports can be found at https://osf.io/jcgx6/.
3.2.1. Article Availability (Open Access)
Seventy-one out of 100 smoking behaviour change intervention reports were available via open access, with 29 only accessible through a paywall (Figure 1(a)).
Figure 1: (a) article availability, (b) pre-registration, (c) protocol availability, (d) material availability, (e) data availability, (f) analysis script availability, (g) replication study, (h) funding statement, (i) conflicts of interest.
3.2.2. Pre-registration
Seventy-three out of 100 smoking behaviour change intervention reports stated that they were pre-registered, with 72 of these being accessible. Fifty-four studies were pre-registered at ClinicalTrials.gov, with the remainder pre-registered at the International Standard Randomized Clinical Trial Number registry (ISRCTN), the Australian and New Zealand Clinical Trials Registry (ANZCTR), the Chinese Clinical Trial Registry (ChCTR), the Netherlands Trial Register (NTR), the Iranian Clinical Trials Registry (IRCT), the Clinical Research Information Service in Korea (CRIS), or the UMIN Clinical Trials Registry in Japan (UMIN-CTR).
All 72 accessible pre-registrations reported methods, but only two also included hypotheses and analysis plans. Twenty-six of the 100 reports did not include any statement of pre-registration, and one report stated the study was not pre-registered (Figure 1(b)).
3.2.3. Protocol Availability
Seventy-one out of 100 smoking behaviour change intervention reports did not include a statement about protocol availability. Of the 29 reports with accessible protocols, 23 had a protocol that included hypotheses, methods, and analysis plans; three included only methods; two included hypotheses and methods; and one included methods and analysis plans (Figure 1(c)).
3.2.4. Material Availability
Twenty-two out of 100 reports included a statement saying the intervention materials used were available. Sixteen of these reports provided materials via journal supplementary files, and six reports stated that their materials were only available upon request from the authors (Figure 1(d)).
3.2.5. Data Availability
Sixteen out of 100 reports included a data availability statement. Nine reports stated data was available upon request from the authors, and one stated the data was not available. The remaining six articles included their data in the supplementary files hosted by the journals, but one article’s data file could not be opened. Four of the remaining articles had clearly documented data files, but only two of them contained all necessary raw data. In total, only seven reports provided links to data that was actually accessible (Figure 1(e)).
3.2.6. Analysis Script Availability
Three out of 100 reports included an analysis script availability statement. However, only one provided an accessible script as a supplementary file, with the remaining two stating that their analysis scripts were available upon request from the authors (Figure 1(f)).
3.2.7. Replication Study
None of the 100 smoking behaviour change intervention reports were described as replication studies (Figure 1(g)).
3.3. Markers of Transparency in Smoking Behaviour Change Intervention Evaluation Reports
Final reconciled coding of reproducibility and transparency markers for all smoking behaviour change intervention reports can be found at https://osf.io/jcgx6/.
3.3.1. Funding Sources
Ninety-four of the 100 smoking behaviour change intervention reports included a statement about funding sources. Most of the reports disclosed public funding only, such as via government-funded research grants, charities, or universities. Eight reports disclosed both public funding and funding from private companies. Five reports disclosed funding from private companies only, including pharmaceutical companies, tobacco companies, and other companies. One report stated that it received no funding (Figure 1(h)).
3.3.2. Conflicts of Interest
Eighty-eight of the 100 articles provided a conflict of interest statement. Most of these reported that there were no conflicts of interest. Thirty-seven reports declared at least one conflict of interest, including with a pharmaceutical company, a private company, a public organisation, or a tobacco company (Figure 1(i)).
This assessment of 100 smoking behaviour change intervention evaluation reports identified varying levels of research reproducibility markers. Most reports were open access and pre-registered; however, research materials, data, and analysis scripts were not frequently provided and no replication studies were identified. Markers of transparency assessed here by funding source and conflicts of interest declarations were common.
4.1. Assessment of Reproducibility Markers in Smoking Behaviour Change Intervention Evaluation Reports
Pre-registration, as a marker of research reproducibility, was found to be more frequent in smoking RCTs (73%) than in wider psychological research of varying study designs (3%). Open access reports were at similarly moderate levels (71%) to psychology (65%), but more common than the 45% observed in the social sciences, 25% in biomedicine, and 45% across the scientific literature published in 2015. This high rate of open access publishing in smoking interventions may reflect increasing requirements by health funding bodies for funded researchers to publish in open access outlets [34, 35] and increasing usage of preprint outlets such as PsyArXiv for the psychological sciences and medRxiv for the medical sciences.
The proportion of open materials was lower than in biomedicine (13% vs. 33%) but similar to the 11% observed in the social sciences. Open analysis scripts were found to be as infrequently provided in smoking interventions as in wider psychological research (both 1%), the social sciences, and biostatistics.
Open data in smoking interventions was found to be very low (7%), although greater than the 0% estimate in a larger sample of 394 smoking RCTs and the 2% observed in wider psychological research. Raw data are essential for meta-analyses to make sense of the diverse smoking cessation evidence. Common barriers to including studies in meta-analyses include a lack of available data, often even after requests to authors [36, 37]. Provision of raw data as supplementary files to published intervention reports, or via trusted third-party repositories such as the Open Science Framework, is important to facilitate evidence synthesis, especially in a field as important for global health as smoking cessation.
No replication attempts were identified in this sample of smoking intervention reports, compared to 5% in wider psychology studies and 1% in the social sciences. This lack of replication may be due to a lack of available resources from smoking intervention studies to facilitate replication, as identified in this study, or may reflect a lack of research prioritisation and funding for replication, with novel rather than confirmatory research prioritised at global and institutional levels [1, 6].
4.2. Assessment of Transparency Markers in Smoking Behaviour Change Intervention Evaluation Reports
Declaration of funding sources and conflicts of interest, as markers of research transparency, was found here to be commonly provided in smoking intervention evaluation reports. Funding sources were declared in more smoking reports (94%) than in wider psychology (62%), social sciences (31%), and biomedical science reports (69%). Similarly, a statement on conflicts of interest was provided more commonly in smoking interventions (88%) than in wider psychology (39%), social sciences (39%), and biomedical science reports (65%). Seventeen percent of studies reported conflicts with private companies and 3% with tobacco companies. The comparatively high level of transparency markers observed here in smoking interventions is likely to reflect improved reporting following previous controversies in the field [25, 38, 39]. Funding and disclosure statements are now commonly mandated by journals related to smoking cessation [18, 26, 40].
4.3. Strengths and Limitations
A strength of this study is its use of double coding by two independent researchers for all reproducibility and transparency markers, enabling inter-rater reliability assessment. A limitation is that this study is based on a random sample of 100 evaluation reports of smoking behaviour change interventions, so assessments of reproducibility and transparency may not be generalisable to smoking interventions more broadly. Second, markers of reproducibility and transparency were dependent on what was described within evaluation reports. Direct requests to authors or additional searching of third-party registries such as the Open Science Framework may have identified further information indicating reproducibility. The absence of explicit statements on protocol, material, data, and analysis script availability does not necessarily signal that resources will not be shared by authors, but it arguably adds an extra step for researchers seeking out this information. Third, this approach of assessing Open Science behaviours in published reports may omit more nuanced approaches to Open Science taken by journals or authors, which may make the assessed figures lower than actual practice.
4.4. Future Steps to Increase Reproducibility and Transparency of Smoking Interventions
Urgent initiatives are needed to address the low levels of reproducibility markers observed here in smoking intervention research, especially in the areas of open materials, data, analysis scripts, and replication attempts. As with any complex behaviour change, this transformation requires system change across the bodies involved in smoking cessation research: researchers, research institutions, funding organisations, journals, and beyond [1, 7]. Interventions are needed to increase the capability, opportunity, and motivation of these bodies to facilitate behaviour change towards reproducible research in smoking interventions [28, 41]. For example, capability can be addressed by providing researcher training, equipping researchers with the skills needed to make their research open and reproducible, such as how to use the Open Science Framework, how to use preprint servers, and how to make their analyses reproducible. Opportunity to engage in reproducible research in smoking interventions can be created within institutions by facilitating discussions around open and reproducible working and developing a culture that values progressive and open research behaviours.
Motivation to research reproducibly can be addressed by providing researcher incentives. Open Science badges recognising open data, open materials, and pre-registration have been adopted by journals as a simple, low-cost scheme to increase researcher motivation to engage in these reproducibility behaviours. Open Science badges have been identified as the only evidence-based incentive programme associated with increased data sharing. However, adoption of Open Science badges in smoking cessation journals is currently low, indicating one important initiative currently missing from this field. Future research could compare later reports against this study’s baseline assessment of reproducibility and transparency markers in smoking cessation intervention evaluation reports to assess changes in reporting and researcher behaviour.
Reproducibility markers in smoking behaviour change intervention evaluation reports were varied. Pre-registration of research plans and open access publication were common, whereas the provision of open data, materials, and analysis scripts was rare and replication attempts were nonexistent. Transparency markers were common, with funding sources and conflicts of interest usually declared. Urgent initiatives are needed to improve reproducibility through open materials, data, analysis scripts, and replication attempts. Future research can compare against this baseline assessment of reproducibility and transparency in the field of smoking interventions to assess changes.
Updates to Preregistered Protocol
During the course of this study and peer review, we made minor adjustments to the preregistered protocol as follows:
(1) We revised the remit of “smoking cessation” to instead refer to “smoking behaviour change” more broadly. This allowed inclusion of cessation, reduction, and second-hand smoke intervention reports included within the Human Behaviour-Change Project knowledge system.
(2) Within the article characteristics measured variables, we added “smoking cessation behaviour” to identify whether each report addressed smoking cessation, reduction, or second-hand smoke specifically.
(3) Within the article characteristics measured variables, we added “behaviour change techniques” to specify the intervention content identified within each report. Behaviour change techniques were already coded within the parallel Human Behaviour-Change Project, which works to synthesise published evidence in behaviour change, beginning with smoking intervention evaluations.
All data are provided on OSF: https://osf.io/5rwsq/.
Conflicts of Interest
RW has undertaken research and consultancy for companies that develop and manufacture smoking cessation medications (Pfizer, J&J, and GSK). He is an unpaid advisor to the UK’s National Centre for Smoking Cessation and Training and a director of the not-for-profit Community Interest Company, Unlocking Behaviour Change Ltd. No other competing interests to disclose.
Acknowledgements
The authors would like to thank Ailbhe N. Finnerty for calculating inter-rater reliability. EN was employed during this study on the Human Behaviour-Change Project, funded by a Wellcome Trust collaborative award (grant number 201524/Z/16/Z).
- M. R. Munafò, B. A. Nosek, D. V. Bishop et al., “A manifesto for reproducible science,” Nature Human Behaviour, vol. 1, no. 1, pp. 1–9, 2017.
- B. A. Nosek, G. Alter, G. C. Banks et al., “Scientific standards. Promoting an open research culture,” Science, vol. 348, no. 6242, pp. 1422–1425, 2015.
- J. P. Ioannidis, “Why most published research findings are false,” PLoS Medicine, vol. 2, no. 8, article e124, 2005.
- L. K. John, G. Loewenstein, and D. Prelec, “Measuring the prevalence of questionable research practices with incentives for truth telling,” Psychological Science, vol. 23, no. 5, pp. 524–532, 2012.
- B. A. Nosek, J. R. Spies, and M. Motyl, “Scientific utopia II: restructuring incentives and practices to promote truth over publishability,” Perspective on Psychological Science, vol. 7, no. 6, pp. 615–631, 2012.
- Open Science Collaboration, “Estimating the reproducibility of psychological science,” Science, vol. 349, no. 6251, article aac4716, 2015.
- E. Norris and D. B. O’Connor, “Science as behaviour: using a behaviour change approach to increase uptake of open science,” Psychology & Health, vol. 34, no. 12, pp. 1397–1406, 2019.
- U. K. Kathawalla, P. Silverstein, and M. Syed, “Easing Into Open Science: A Tutorial for Graduate Students,” 2020, https://psyarxiv.com/vzjdp/.
- M. L. Head, L. Holman, R. Lanfear, A. T. Kahn, and M. D. Jennions, “The extent and consequences of p-hacking in science,” PLoS Biology, vol. 13, no. 3, article e1002106, 2015.
- A. G. Huebschmann, I. M. Leavitt, and R. E. Glasgow, “Making health research matter: a call to increase attention to external validity,” Annual Review of Public Health, vol. 40, no. 1, pp. 45–63, 2019.
- M. van Vliet, “Seven quick tips for analysis scripts in neuroimaging,” PLoS Computational Biology, vol. 16, no. 3, article e1007358, 2020.
- O. Klein, T. E. Hardwicke, F. Aust et al., “A practical guide for transparency in psychological science,” Collabra: Psychology, vol. 4, no. 1, p. 20, 2018.
- R. Heirene, “A call for replications of addiction research: which studies should we replicate & what constitutes a “successful” replication?” Addiction Research & Theory, vol. 1, pp. 1–9, 2020.
- J. Rotton, P. W. Foos, L. Van Meek, and M. Levitt, “Publication practices and the file drawer problem: a survey of published authors,” Journal of Social Behavior and Personality, vol. 10, no. 1, pp. 1–13, 1995.
- J. S. Ross, “Clinical research data sharing: what an open science world means for researchers involved in evidence synthesis,” Systematic Reviews, vol. 5, no. 1, p. 159, 2016.
- P. B. Fontanarosa, A. Flanagin, and C. D. DeAngelis, “Reporting conflicts of interest, financial aspects of research, and role of sponsors in funded studies,” JAMA, vol. 294, no. 1, pp. 110-111, 2005.
- R. Smith, “Beyond conflict of interest: transparency is the key,” British Medical Journal, vol. 317, no. 7154, pp. 291-292, 1998.
- I.-A. Cristea and J. P. Ioannidis, “Improving disclosure of financial conflicts of interest for research on psychosocial interventions,” JAMA Psychiatry, vol. 75, no. 6, pp. 541-542, 2018.
- T. E. Hardwicke, R. T. Thibault, J. Kosie, J. D. Wallach, M. C. Kidwell, and J. Ioannidis, “Estimating the prevalence of transparency and reproducibility-related research practices in psychology (2014–2017),” MetaArXiv, 2020.
- T. E. Hardwicke, J. D. Wallach, M. C. Kidwell, T. Bendixen, S. Crüwell, and J. P. Ioannidis, “An empirical assessment of transparency and reproducibility-related research practices in the social sciences (2014–2017),” Royal Society Open Science, vol. 7, no. 2, article 190806, 2019.
- J. D. Wallach, K. W. Boyack, and J. P. Ioannidis, “Reproducible research practices, transparency, and open access data in the biomedical literature, 2015–2017,” PLoS Biology, vol. 16, no. 11, article e2006930, 2018.
- A. Rowhani-Farid and A. G. Barnett, “Badges for sharing data and code at biostatistics: an observational study,” F1000Research, vol. 7, p. 90, 2018.
- H. Piwowar, J. Priem, V. Larivière et al., “The state of OA: a large-scale analysis of the prevalence and impact of open access articles,” PeerJ, vol. 6, article e4375, 2018.
- M. Vassar, S. Jellison, H. Wendelbo, and C. Wayant, “Data sharing practices in randomized trials of addiction interventions,” Addictive Behaviors, vol. 102, p. 106193, 2020.
- D. Garne, M. Watson, S. Chapman, and F. Byrne, “Environmental tobacco smoke research published in the journal Indoor and Built Environment and associations with the tobacco industry,” The Lancet, vol. 365, no. 9461, pp. 804–809, 2005.
- M. R. Munafò and R. West, “E-cigarette research needs to adopt open science practices to improve quality,” Addiction, vol. 115, no. 1, pp. 3-4, 2020.
- R. West, “Open science and pre-registration of studies and analysis plans,” Addiction, vol. 115, no. 1, p. 5, 2020.
- S. Michie, J. Thomas, M. Johnston et al., “The Human Behaviour-Change Project: harnessing the power of artificial intelligence and machine learning for evidence synthesis and interpretation,” Implementation Science, vol. 12, no. 1, p. 121, 2017.
- S. Michie, J. Thomas, P. Mac Aonghusa et al., “The Human Behaviour-Change Project: An artificial intelligence system to answer questions about changing behaviour,” Wellcome Open Research, vol. 5, no. 122, p. 122, 2020.
- S. Michie, M. Richardson, M. Johnston et al., “The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: building an international consensus for the reporting of behavior change interventions,” Annals of Behavioral Medicine, vol. 46, no. 1, pp. 81–95, 2013.
- J. Thomas, J. Brunton, and S. Graziosi, “EPPI-Reviewer 4.0: software for research synthesis,” EPPI Centre, London, England, 2010.
- F. Bonin, M. Gleize, A. Finnerty et al., “HBCP corpus: a new resource for the analysis of behavioural change intervention reports,” in Proceedings of The 12th Language Resources and Evaluation Conference, pp. 1967–1975, Marseille, France, 2020.
- A. F. Hayes and K. Krippendorff, “Answering the call for a standard reliability measure for coding data,” Communication Methods and Measures, vol. 1, no. 1, pp. 77–89, 2007.
- A. Severin, M. Egger, M. P. Eve, and D. Hürlimann, “Discipline-specific open access publishing practices and barriers to change: an evidence-based review,” F1000Research, vol. 7, article 1925, 2020.
- J. P. Tennant, F. Waldner, D. C. Jacques, P. Masuzzo, L. B. Collister, and C. H. Hartgerink, “The academic, economic and societal impacts of open access: an evidence-based review,” F1000Research, vol. 5, p. 632, 2016.
- T. Greco, A. Zangrillo, G. Biondi-Zoccai, and G. Landoni, “Meta-analysis: pitfalls and hints,” Heart, Lung and Vessels, vol. 5, no. 4, pp. 219–225, 2013.
- J. P. Ioannidis, N. A. Patsopoulos, and H. R. Rothstein, “Reasons or excuses for avoiding meta-analysis in forest plots,” British Medical Journal, vol. 336, no. 7658, pp. 1413–1415, 2008.
- L. A. Bero, “Tobacco industry manipulation of research,” Public Health Reports, vol. 120, no. 2, pp. 200–208, 2005.
- R. E. Malone and L. Bero, “Chasing the dollar: why scientists should decline tobacco industry funding,” Journal of Epidemiology & Community Health, vol. 57, no. 8, pp. 546–548, 2003.
- D. Nutu, C. Gentili, F. Naudet, and I. A. Cristea, “Open science practices in clinical psychology journals: an audit study,” Journal of Abnormal Psychology, vol. 128, no. 6, pp. 510–516, 2019.
- S. Michie, M. M. van Stralen, and R. West, “The behaviour change wheel: a new method for characterising and designing behaviour change interventions,” Implementation Science, vol. 6, no. 1, p. 42, 2011.
- A. Orben, “A journal club to fix science,” Nature, vol. 573, no. 7775, p. 465, 2019.
- M. C. Kidwell, L. B. Lazarević, E. Baranski et al., “Badges to acknowledge open practices: a simple, low-cost, effective method for increasing transparency,” PLoS Biology, vol. 14, no. 5, article e1002456, 2016.
- A. Rowhani-Farid, M. Allen, and A. G. Barnett, “What incentives increase data sharing in health and medical research? A systematic review,” Research Integrity and Peer Review, vol. 2, no. 1, p. 4, 2017.
Copyright © 2021 Emma Norris et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.