King's College London

Research portal

Quality of stepped-wedge trial reporting can be reliably assessed using an updated CONSORT: crowd-sourcing systematic review

Research output: Contribution to journal › Review article

The SW-CRT Review Group, Karla Hemming, Kelly Carroll, Jennifer Thompson, Andrew Forbes, Monica Taljaard, Susan J. Dutton, Vichithranie Madurasinghe, Katy Morgan, Beth Stuart, Katherine Fielding, Victoria Cornelius, Elizabeth L. Turner, Richard Hooper, Bruno Giraudeau, Paul T. Seed, Alecia Nickless, Michael Grayling, Melanie Prague, Sally Kerry, Lauren Bell, Eila Watson, Rafael Gafoor, Nadine Marlin, Emel Yorganci, Lesley Smith, Murielle Mbekwe, Steven Teerenstra, Claire Chan, Mirjam Moerbeek, Pamela Jacobsen, Simon Bond, Ben Jones, John Preisser, Mona Kanaan, Catherine Hewitt, Christina Easter, Tracy Pellatt-Higgins, Laura Pankhurst, Schadrac C. Agbla, Sandra Eldridge, Robin G. Lerner, Clémence Leyrat, Mark Pilling, Julia R. Forman, Indrani Bhattacharya, Nicholas Magill, Jane Candlish, Cliona McDowell, James Martin, Caroline Kristunas

Original language: English
Pages (from-to): 77-88
Number of pages: 12
Journal: Journal of Clinical Epidemiology
Volume: 107
Early online date: 28 Nov 2018
DOIs
Publication status: Published - 1 Mar 2019


Abstract

Objectives: The Consolidated Standards of Reporting Trials extension for the stepped-wedge cluster randomized trial (SW-CRT) is a recently published reporting guideline for SW-CRTs. We assess the quality of reporting in a recent sample of SW-CRTs. Study Design and Setting: Quality of reporting was assessed against the 26 items in the new guideline using a novel crowd-sourcing methodology, conducted independently and in duplicate, with random assignment, by 50 reviewers. We assessed the reliability of these quality assessments, proposing this as a novel way to gauge the robustness of items in reporting guidelines. Results: Several items were well reported. Others were very poorly reported, including several items with requirements unique to the SW-CRT, such as the rationale for use of the design, description of the design, identification and recruitment of participants within clusters, and concealment of cluster allocation (each not reported in more than 50% of the reports). Agreement across items was moderate (median percentage agreement 76% [IQR 64 to 86]). Agreement was low for several items, including the description of the trial design and why the trial ended or stopped. Conclusions: When reporting SW-CRTs, authors should pay particular attention to clearly reporting the exact format of the design with justification, as well as how clusters and individuals were identified for inclusion in the study and whether this was done before or after randomization of the clusters; these details are crucial for risk-of-bias assessments. Some items, including why the trial ended, might either not be relevant to SW-CRTs or might be unclearly described in the statement.
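The reliability measure described in the abstract, percentage agreement between duplicate independent reviewers on each checklist item, summarized as a median across items, can be sketched as follows. This is a hypothetical illustration with invented toy data, not the review group's actual code; the item names and ratings are assumptions for demonstration only.

```python
# Hypothetical sketch of per-item percentage agreement between two
# independent reviewers, then the median across checklist items,
# as summarized in the abstract. Toy data; not the authors' code.
from statistics import median

def percent_agreement(pairs):
    """Percentage of trials on which both reviewers gave the same judgement."""
    agree = sum(1 for a, b in pairs if a == b)
    return 100 * agree / len(pairs)

# Each item maps to (reviewer_a, reviewer_b) judgements per trial report
# ("yes"/"no" = item adequately reported). Values are invented.
ratings = {
    "trial design": [("yes", "yes"), ("no", "yes"), ("no", "no"), ("yes", "no")],
    "rationale for design": [("yes", "yes"), ("yes", "yes"), ("no", "no"), ("no", "yes")],
    "allocation concealment": [("no", "no"), ("no", "no"), ("yes", "yes"), ("no", "yes")],
}

per_item = {item: percent_agreement(pairs) for item, pairs in ratings.items()}
overall = median(per_item.values())
print(per_item)   # e.g. {"trial design": 50.0, ...}
print(overall)    # median agreement across the items
```

In the review itself, this kind of per-item summary is what flags items (such as the trial-design description) where reviewers disagreed often, suggesting the guideline wording may be ambiguous.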

