Professional Education


  • PhD, University of Rennes, France, Clinical Epidemiology (2022)
  • MSc, University Claude Bernard, Lyon, France, Clinical Evaluation (2018)
  • PharmD, University of Regensburg, Germany, Pharmacy (2017)

All Publications


  • Lifting of Embargoes to Data Sharing in Clinical Trials Published in Top Medical Journals. JAMA Siebert, M., Ioannidis, J. P. 2023

    DOI: 10.1001/jama.2023.25394 | PubMedID: 38153703

  • Towards transparency: adoption of WHO best practices in clinical trial registration and reporting among top medical research funders in the USA. BMJ Evidence-Based Medicine Gamertsfelder, E., Delgado Figueroa, N., Keestra, S., Silva, A. R., Borana, R., Siebert, M., Bruckner, T. 2023

    Abstract

    OBJECTIVE: To assess to what extent the clinical trial policies of the largest public and philanthropic funders of clinical research in the United States meet WHO best practices in trial registration and reporting. METHODS: Public and philanthropic funders of clinical trials in the USA with >US$50 million annual spend were selected. The funders were assessed using an 11-item scoring tool based on WHO Joint Statement benchmarks. These 11 items fell into 4 categories, namely: trial registration, academic publication, monitoring and sanctions. An additional item captured whether and how funders referred to Consolidated Standards of Reporting Trials (CONSORT) within their trial policies. Each funder was independently assessed by two or three researchers. Funders were contacted to flag possible errors and omissions. Ambiguous or difficult-to-score items were settled by an independent adjudicator. RESULTS: Fourteen funders were assessed. Our cross-sectional study found that, on average, funders have only implemented 4.1/11 (37%) of WHO best practices in clinical trial transparency. The most frequently adopted requirement was open access publishing (14/14 funders). The least frequently adopted were (1) requiring trial ID to appear in all publications (2/14 funders, 14%) and (2) making compliance reports public (2/14 funders, 14%). Public funders, on average, adopted more policy elements (5.2/11 items, 47%) than philanthropic funders (2.8/11 items, 25%). Only one funder's policy documents mentioned the CONSORT statement. CONCLUSIONS: There is significant variation in the number of best practice policy items adopted by medical research funders in the USA. Many funders fell significantly short of WHO Joint Statement benchmarks. Each funder could benefit from policy revision and strengthening.

    DOI: 10.1136/bmjebm-2023-112395 | PubMedID: 37932014

  • Industry Involvement and Transparency in the Most Cited Clinical Trials, 2019-2022. JAMA Network Open Siena, L. M., Papamanolis, L., Siebert, M. J., Bellomo, R. K., Ioannidis, J. P. 2023; 6 (11): e2343425

    Abstract

    Importance: Industry involvement is prominent in influential clinical trials, and commitments to transparency of trials are highly variable. Objective: To evaluate the modes of industry involvement and the transparency features of the most cited recent clinical trials across medicine. Design, Setting, and Participants: This cross-sectional study was a meta-research assessment including randomized and nonrandomized clinical trials published in 2019 or later. The 600 trials of any type of disease or setting that attracted the highest number of citations in Scopus as of December 2022 were selected for analysis. Data were analyzed from March to September 2023. Main Outcomes and Measures: Outcomes of interest were industry involvement (sponsor, author, and analyst) and transparency (protocols, statistical analysis plans, and data and code availability). Results: Among 600 trials with a median (IQR) sample size of 415 (124-1046) participants assessed, 409 (68.2%) had industry funding and 303 (50.5%) were exclusively industry-funded. A total of 354 trials (59.0%) had industry authors, with 280 trials (46.6%) involving industry analysts and 125 trials (20.8%) analyzed exclusively by industry analysts. Among industry-funded trials, 364 (89.0%) reached conclusions favoring the sponsor. Most trials (478 trials [79.7%]) provided a data availability statement, and most indicated intention to share the data, but only 16 trials (2.7%) had data already readily available to others. More than three-quarters of trials had full protocols (482 trials [82.0%]) or statistical analysis plans (446 trials [74.3%]) available, but only 27 trials (4.5%) explicitly mentioned sharing analysis code (8 readily available; 19 on request). Randomized trials were more likely than nonrandomized studies to involve only industry analysts (107 trials [22.9%] vs 18 trials [13.6%]; P = .02) and to have full protocols (405 studies [86.5%] vs 87 studies [65.9%]; P < .001) and statistical analysis plans (373 studies [79.7%] vs 73 studies [55.3%]; P < .001) available. Almost all nonrandomized industry-funded studies (90 of 92 studies [97.8%]) favored the sponsor. Among industry-funded trials, exclusive industry funding (odds ratio, 2.9; 95% CI, 1.5-5.4) and industry-affiliated authors (odds ratio, 2.9; 95% CI, 1.5-5.6) were associated with favorable conclusions for the sponsor. Conclusions and Relevance: This cross-sectional study illustrates how industry involvement in the most influential clinical trials was prominent not only for funding, but also for authorship and the provision of analysts, and was associated with conclusions favoring the sponsor. While most influential trials reported that they planned to share data and make both protocols and statistical analysis plans available, raw data and code were rarely readily available.

    DOI: 10.1001/jamanetworkopen.2023.43425 | PubMedID: 37962883

  • Assessing the magnitude of changes from protocol to publication-a survey on Cochrane and non-Cochrane Systematic Reviews. PeerJ Siebert, M., Caquelin, L., Madera, M., Acosta-Dighero, R., Naudet, F., Roqué, M. 2023; 11: e16016

    Abstract

    To explore differences between published reviews and their respective protocols in a sample of 97 non-Cochrane Systematic Reviews (non-CSRs) and 97 Cochrane Systematic Reviews (CSRs) in terms of PICOS (Patients/Population, Intervention, Comparison/Control, Outcome, Study type) elements and the extent to which they were reported. We searched PubMed and Cochrane databases to identify non-CSRs and CSRs that were published in 2018. We then searched for their corresponding Cochrane or PROSPERO protocols. The published reviews were compared to their protocols. The primary outcome was changes from protocol to review in terms of PICOS elements. We identified a total of 227 changes from protocol to review in PICOS elements: 1.11 (standard deviation (SD) 1.22) changes per review for CSRs and 1.23 (SD 1.12) for non-CSRs. More than half of each sub-sample (54.6% of CSRs and 67.0% of non-CSRs) (absolute risk reduction (ARR) 12.4% [-1.3%; 26.0%]) had changes in PICOS elements. For both sub-samples, approximately a third of all changes corresponded to changes related to primary outcomes. Marked differences were found between the sub-samples for the reporting of changes: 95.8% of the changes in PICOS items were not reported in the non-CSRs compared to 42.6% in the CSRs (ARR 53.2% [43.2%; 63.2%]). CSRs showed better results than non-CSRs in terms of the reporting of changes. Reporting of changes from protocol needs to be promoted and requires general improvement. The limitations of this study lie in its observational design. Registration: https://osf.io/6j8gd/.

    DOI: 10.7717/peerj.16016 | PubMedID: 37810785 | PubMedCentralID: PMC10552742

  • Peer review before trial conduct could increase research value and reduce waste. Journal of Clinical Epidemiology Siebert, M., Naudet, F., Ioannidis, J. P. 2023

    Abstract

    Traditional peer review of clinical trials happens too late, after the trials are already done. However, lack of methodological rigor and the presence of many biases can be detected and remedied in advance. Here, we examine several options for review and improvement of trials before their conduct: protocol review by peers, sponsors, regulatory authorities, and institutional ethical committees; registration in registry sites; deposition of the protocol and/or the statistical analysis plan in a public repository; peer review and publication of the protocol and/or the statistical analysis plan in a journal; and Registered Reports. Some practices are considered standard (e.g., registration in a trial registry), while others are still uncommon but are becoming more frequent (e.g., publication of full trial protocols and statistical analysis plans). Ongoing challenges hinder a large-scale implementation of some promising practices such as Registered Reports. Innovative ideas are necessary to advance peer-review efficiency and rigor in clinical trials but also to lower the cumulative burden for peer reviewers. We make several suggestions to enhance pre-conduct peer review. Making all steps of the research process public and open may reverse siloed environments. Pre-conduct peer review may be improved by routinely making publicly available all protocols that have gone through review by institutional review boards and regulatory agencies.

    DOI: 10.1016/j.jclinepi.2023.05.024 | PubMedID: 37286150

  • Ten simple rules for implementing open and reproducible research practices after attending a training course. PLOS Computational Biology Heise, V., Holman, C., Lo, H., Lyras, E. M., Adkins, M. C., Aquino, M. R., Bougioukas, K. I., Bray, K. O., Gajos, M., Guo, X., Hartling, C., Huerta-Gutierrez, R., Jindrová, M., Kenney, J. P., Kępińska, A. P., Kneller, L., Lopez-Rodriguez, E., Mühlensiepen, F., Richards, A., Richards, G., Siebert, M., Smith, J. A., Smith, N., Stransky, N., Tarvainen, S., Valdes, D. S., Warrington, K. L., Wilpert, N. M., Witkowska, D., Zaneva, M., Zanker, J., Weissgerber, T. L. 2023; 19 (1): e1010750

    Abstract

    Open, reproducible, and replicable research practices are a fundamental part of science. Training is often organized on a grassroots level, offered by early career researchers, for early career researchers. Buffet-style courses that cover many topics can inspire participants to try new things; however, they can also be overwhelming. Participants who want to implement new practices may not know where to start once they return to their research team. We describe ten simple rules to guide participants of relevant training courses in implementing robust research practices in their own projects, once they return to their research group. This includes (1) prioritizing and planning which practices to implement, which involves obtaining support and convincing others involved in the research project of the added value of implementing new practices; (2) managing problems that arise during implementation; and (3) making reproducible research and open science practices an integral part of a future research career. We also outline strategies that course organizers can use to prepare participants for implementation and support them during this process.

    DOI: 10.1371/journal.pcbi.1010750 | PubMedID: 36602968

  • Data-sharing and re-analysis for main studies assessed by the European Medicines Agency-a cross-sectional study on European Public Assessment Reports. BMC Medicine Siebert, M., Gaba, J., Renault, A., Laviolle, B., Locher, C., Moher, D., Naudet, F. 2022; 20 (1): 177

    Abstract

    Transparency and reproducibility are expected to be normative practices in clinical trials used for decision-making on marketing authorisations for new medicines. This registered report introduces a cross-sectional study aiming to assess inferential reproducibility for main trials assessed by the European Medicines Agency. Two researchers independently identified all studies on new medicines, biosimilars and orphan medicines given approval by the European Commission between January 2017 and December 2019, categorised as 'main studies' in the European Public Assessment Reports (EPARs). Sixty-two of these studies were randomly sampled. One researcher retrieved the individual patient data (IPD) for these studies and prepared a dossier for each study, containing the IPD, the protocol and information on the conduct of the study. A second researcher who had no access to study reports used the dossier to run an independent re-analysis of each trial. All results of these re-analyses were reported in terms of each study's conclusions, p-values, effect sizes and changes from the initial protocol. A team of two researchers not involved in the re-analysis compared results of the re-analyses with published results of the trial. Two hundred ninety-two main studies in 173 EPARs were identified. Among the 62 studies randomly sampled, we received IPD for 10 trials. The median number of days between data request and data receipt was 253 [interquartile range 182-469]. For these ten trials, we identified 23 distinct primary outcomes for which the conclusions were reproduced in all re-analyses. Therefore, 10/62 trials (16% [95% confidence interval 8% to 28%]) were reproduced, as the 52 studies without available data were considered non-reproducible. There was no change from the original study protocol regarding the primary outcome in any of these ten studies. Spin was observed in the report of one study. Despite their results supporting decisions that affect millions of people's health across the European Union, most main studies used in EPARs lack transparency and their results are not reproducible for external researchers. Re-analyses of the few trials with available data showed very good inferential reproducibility. Registration: https://osf.io/mcw3t/.

    DOI: 10.1186/s12916-022-02377-2 | Web of Science ID: 000797969400001 | PubMedID: 35590360 | PubMedCentralID: PMC9119701

  • Medical journal requirements for clinical trial data sharing: Ripe for improvement. PLOS Medicine Naudet, F., Siebert, M., Pellen, C., Gaba, J., Axfors, C., Cristea, I., Danchev, V., Mansmann, U., Ohmann, C., Wallach, J. D., Moher, D., Ioannidis, J. P. 2021; 18 (10): e1003844

    Abstract

    Florian Naudet and co-authors discuss strengthening requirements for sharing clinical trial data.

    DOI: 10.1371/journal.pmed.1003844 | PubMedID: 34695113

  • An open science pathway for drug marketing authorization-Registered drug approval. PLOS Medicine Naudet, F., Siebert, M., Boussageon, R., Cristea, I. A., Turner, E. H. 2021; 18 (8): e1003726

    Abstract

    Florian Naudet and co-authors propose a pathway involving registered criteria for evaluation and approval of new drugs.

    DOI: 10.1371/journal.pmed.1003726 | Web of Science ID: 000683138300001 | PubMedID: 34370737 | PubMedCentralID: PMC8351924

  • Status, use and impact of sharing individual participant data from clinical trials: a scoping review. BMJ Open Ohmann, C., Moher, D., Siebert, M., Motschall, E., Naudet, F. 2021; 11 (8): e049228

    Abstract

    To explore the impact of data-sharing initiatives on the intent to share data, on actual data sharing, on the use of shared data and on research output and impact of shared data. All studies investigating data-sharing practices for individual participant data (IPD) from clinical trials were eligible. We searched the Medline database, the Cochrane Library, the Science Citation Index Expanded and the Social Sciences Citation Index via Web of Science, and preprints and proceedings of the International Congress on Peer Review and Scientific Publication. In addition, we inspected major clinical trial data-sharing platforms, contacted major journals/publishers, editorial groups and some funders. Two reviewers independently extracted information on methods and results from resources identified using a standardised questionnaire. A map of the extracted data was constructed and accompanied by a narrative summary for each outcome domain. Ninety-three studies identified in the literature search (published between 2001 and 2020, median: 2018) and 5 from additional information sources were included in the scoping review. Most studies were descriptive and focused on early phases of the data-sharing process. While the willingness to share IPD from clinical trials is extremely high, actual data-sharing rates are suboptimal. A survey of journal data suggests poor to moderate enforcement of the policies by publishers. Metrics provided by platforms suggest that a large majority of data remains unrequested. When requested, the purpose of the reuse is more often secondary analyses and meta-analyses, rarely re-analyses. Finally, studies focused on the real impact of data-sharing were rare and used surrogates such as citation metrics. There is currently a gap in the evidence base for the impact of IPD sharing, which entails uncertainties in the implementation of current data-sharing policies. High-level evidence is needed to assess whether the value of medical research increases with data-sharing practices.

    DOI: 10.1136/bmjopen-2021-049228 | Web of Science ID: 000692968300009 | PubMedID: 34408052 | PubMedCentralID: PMC8375721

  • The research output on interventions for the behavioural risk factors alcohol & drug use and dietary risk is not related to their respective burden of ill health in countries at differing World Bank income levels. Journal of Global Health Frassetto, C., Madera, M., Siebert, M., Megranahan, K., Roberts, D., Plugge, E. 2020; 10 (2): 020401

    Abstract

    Alcohol and drug use (A&D) and dietary risks are two increasingly important risk factors. This study examines whether there is a relationship between the burden of these risk factors in countries of specific income bands as defined by the World Bank, and the number of primary studies included in Cochrane Systematic Reviews (CSRs) conducted in those countries. Data were extracted from primary studies included in CSRs assessing the two risk factors as outcomes. For each risk factor, data were obtained on its overall burden in disability-adjusted life years (DALYs) by World Bank income level and examined for a link between DALYs, the number of primary studies and the number of participants. A total of 1601 studies from 95 CSRs were included. Only 18.3% of the global burden for A&D is in high-income countries (HICs), but they produced 90.5% of primary studies and included 99.5% of participants. Only 14.2% of the dietary risk burden is in HICs, but they produced 80.5% of primary studies and included 98.1% of participants. This study demonstrates the unequal output of research, which is heavily weighted towards HICs. More initiatives with informed contextual understanding are required to address this inequality and promote health research in low- and middle-income countries.

    DOI: 10.7189/jogh.10.020401 | Web of Science ID: 000612476300114 | PubMedID: 33110568 | PubMedCentralID: PMC7520876

  • Funders' data-sharing policies in therapeutic research: A survey of commercial and non-commercial funders. PLOS ONE Gaba, J., Siebert, M., Dupuy, A., Moher, D., Naudet, F. 2020; 15 (8): e0237464

    Abstract

    Funders are key players in supporting randomized controlled trial (RCT) data-sharing. This research aimed to describe commercial and non-commercial funders' data-sharing policies and to assess the compliance of funded RCTs with the existing data-sharing policies. Funders of clinical research that had funded at least one RCT in the years 2016 to 2018 were surveyed. All 78 eligible non-commercial funders retrieved from the Sherpa/Juliet Initiative website and a random sample of 100 commercial funders selected from pharmaceutical association member lists (LEEM, IFPMA, EFPIA) and the top 100 pharmaceutical companies in terms of drug sales were included. Thirty (out of 78; 38%) non-commercial funders had a data-sharing policy, with eighteen (out of 30, 60%) making data-sharing mandatory and twelve (40%) encouraging data-sharing. Forty-one (out of 100; 41%) commercial funders had a data-sharing policy. Among funders with a data-sharing policy, in a survey of two random samples of 100 RCTs registered on ClinicalTrials.gov, data-sharing statements were present for seventy-seven (77%, 95% CI [67%-84%]) and eighty-one (81% [72%-88%]) of RCTs funded by non-commercial and commercial funders, respectively. Intention to share data was expressed in 12% [7%-20%] and 59% [49%-69%] of RCTs funded by non-commercial and commercial funders, respectively. This survey identified suboptimal performance of funders in setting up data-sharing policies. For those with a data-sharing policy, the implementation of the policy in study registration was limited for commercial funders and of concern for non-commercial funders. The limitations of the present study include its cross-sectional nature, since data-sharing policies are continuously changing. We call for a standardization of policies with a strong evaluation component to make sure that, when in place, these policies are effective.

    DOI: 10.1371/journal.pone.0237464 | Web of Science ID: 000564315600059 | PubMedID: 32817724 | PubMedCentralID: PMC7446799

  • Data-sharing recommendations in biomedical journals and randomised controlled trials: an audit of journals following the ICMJE recommendations. BMJ Open Siebert, M., Gaba, J., Caquelin, L., Gouraud, H., Dupuy, A., Moher, D., Naudet, F. 2020; 10 (5): e038887

    Abstract

    To explore the implementation of the International Committee of Medical Journal Editors (ICMJE) data-sharing policy, which came into force on 1 July 2018, by ICMJE-member journals and by ICMJE-affiliated journals declaring they follow the ICMJE recommendations. A cross-sectional survey of data-sharing policies in 2018 on journal websites and in data-sharing statements in randomised controlled trials (RCTs). ICMJE website; PubMed/Medline. ICMJE-member journals and 489 ICMJE-affiliated journals that published an RCT in 2018, had an accessible online website and were not considered predatory journals according to Beall's list. One hundred RCTs for member journals and 100 RCTs for affiliated journals with a data-sharing policy, submitted after 1 July 2018. The primary outcome for the policies was the existence of a data-sharing policy (explicit data-sharing policy, no data-sharing policy, policy merely referring to ICMJE recommendations) as reported on the journal website, especially in the instructions for authors. For RCTs, our primary outcome was the intention to share individual participant data set out in the data-sharing statement. Eight (out of 14; 57%) member journals had an explicit data-sharing policy on their website (three were more stringent than the ICMJE requirements, one was less demanding and four were compliant), five (35%) additional journals stated that they followed the ICMJE requirements, and one (8%) had no policy online. In RCTs published in these journals, there were data-sharing statements in 98 out of 100, with expressed intention to share individual patient data reaching 77 out of 100 (77%; 95% CI 67% to 85%). One hundred and forty-five (out of 489) ICMJE-affiliated journals (30%; 26% to 34%) had an explicit data-sharing policy on their website (11 were more stringent than the ICMJE requirements, 85 were less demanding and 49 were compliant) and 276 (56%; 52% to 61%) merely referred to the ICMJE requirements. In RCTs published in affiliated journals with an explicit data-sharing policy, data-sharing statements were rare (25%), and expressed intentions to share data were found in 22% (15% to 32%). The implementation of ICMJE data-sharing requirements in online journal policies was suboptimal for ICMJE-member journals and poor for ICMJE-affiliated journals. The implementation of the policy was good in member journals and of concern for affiliated journals. We suggest the conduct of continuous audits of medical journal data-sharing policies in the future. The protocol was registered before the start of the research on the Open Science Framework (https://osf.io/n6whd/).

    DOI: 10.1136/bmjopen-2020-038887 | Web of Science ID: 000738373200104 | PubMedID: 32474433 | PubMedCentralID: PMC7264700