
Of the RCTs I've summarized since last summer, 21 found disappointing effects (e.g., no discernible positive impact on the primary outcomes). 7 of those 21 (one-third) nevertheless portrayed the results as strongly positive in the study abstract. Here are the specifics:

  • 4 RCTs found no statistically significant effect on the primary prespecified outcomes, but don't report this fact in the abstract & instead portray the results as positive based on secondary or exploratory findings that established scientific standards (e.g., FDA, IES) treat as suggestive rather than reliable.

  • Here are my summaries of the 4 RCTs (including links to the study abstracts): Vision for Baltimore, Working on Womanhood, Core Knowledge charter schools, Procedural Justice police training.

  • 2 RCTs found no statistically significant effects after adjusting for the fact that they measured numerous primary outcomes (an adjustment that guards against false-positive findings); yet their abstracts don't mention this fact & instead make strong claims of positive effects.

  • Here are my summaries of these 2 studies (including links to the study abstracts): Padua case management for low-income families, Repairing abandoned housing to prevent crime.  
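To see why measuring many outcomes requires adjustment, here is a minimal sketch of the Bonferroni correction, the simplest multiple-comparisons adjustment (the studies above may have used a different method; the p-values below are hypothetical):

```python
# Hypothetical illustration: a study tests 10 outcomes. One raw p-value
# falls below 0.05, but the Bonferroni adjustment (p_adj = min(1, p * m),
# where m is the number of tests) removes that apparent significance.
def bonferroni(p_values):
    """Return Bonferroni-adjusted p-values (each p multiplied by the
    number of tests, capped at 1)."""
    m = len(p_values)
    return [min(1.0, p * m) for p in p_values]

raw = [0.03, 0.20, 0.45, 0.61, 0.08, 0.77, 0.52, 0.34, 0.90, 0.12]
adjusted = bonferroni(raw)

print(any(p < 0.05 for p in raw))       # True: 0.03 looks "significant"
print(any(p < 0.05 for p in adjusted))  # False: 0.03 becomes 0.30
```

With 10 outcomes and a 0.05 threshold, roughly a 40% chance exists of at least one false positive under the null; the adjustment keeps that family-wise error rate at 5%.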

  • Finally, 1 RCT found a very small, short-term effect that quickly faded over time (it reached statistical significance only because of a huge sample), yet the abstract presents the results as unambiguously positive: NYC Summer Youth Employment Program (SYEP).
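The point about huge samples can be made concrete with a quick two-arm z-test sketch (the effect size and sample sizes below are hypothetical, not taken from the SYEP study):

```python
import math

def two_sided_p(diff, sd, n_per_arm):
    """Two-sided z-test p-value for a difference in means between two
    equal-sized arms, assuming a known common standard deviation."""
    se = sd * math.sqrt(2.0 / n_per_arm)   # standard error of the difference
    z = abs(diff) / se
    return math.erfc(z / math.sqrt(2.0))   # two-sided normal tail probability

# A tiny effect (0.02 standard deviations) is nowhere near significant
# with 500 participants per arm, yet crosses p < .05 with 50,000 per arm.
print(two_sided_p(diff=0.02, sd=1.0, n_per_arm=500))     # not significant
print(two_sided_p(diff=0.02, sd=1.0, n_per_arm=50_000))  # "significant"
```

The effect is identical in both cases; only the sample size changes. That is why a statistically significant result from a very large trial can still be substantively negligible.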

  • The good news is that the other 14 RCTs with disappointing results reported their findings accurately in the study abstract. But the bottom line is that inaccurate reporting of disappointing results is common, even in top journals (at least in this limited sample of studies).

  • PS: I focus on the study abstracts because readers, who may be too busy to review a full study, often rely on the abstract to provide a balanced, impartial overview of the headline results. That's why accuracy of reporting in the abstract is critically important.
