
Science magazine published an RCT conducted at 22 US colleges, reporting that a brief "social belonging" writing exercise increased 1st-year college completion rates. Quick take: these claims rest on post-hoc (vs prespecified) analyses, which aren't reliable under established scientific standards.

Program & Study Design:

  • The intervention is a 10-30 min web-based writing exercise for incoming 1st-year students, designed to address their worries about belonging. The study randomized 27K students at 22 colleges to treatment (the social belonging exercise) vs control (an unrelated writing exercise).

  • The study abstract (below) says the intervention increased rates of 1st-year full-time college completion, especially among students in groups that had historically progressed at lower rates, & that it was effective only at colleges providing opportunities to belong.

  • But the study doesn't actually report effects for the full sample (as implied by the 1st highlighted clause), & the two subgroup effects - for groups with lower historical progression rates & for colleges providing high opportunities to belong - were post-hoc (i.e., not prespecified).

  • The study pre-specification had hypothesized positive intervention effects for other subgroups: societally disadvantaged students (e.g., Blacks), groups self-reporting "higher threat" (e.g., worry that people judge you by race), & colleges providing low opportunities to belong (see relevant excerpt below).

  • The study does not report findings for the 1st 2 of these prespecified subgroups, & it reports no significant effect in the 3rd (colleges providing low opportunities to belong). It also reports effects on only 1 of its 2 prespecified outcome measures (1st-year completion but not GPA).

  • Such selective & post-hoc reporting (whether intentional or not) can easily create a false appearance of program effectiveness - yet it remains all too common in social policy research, even in top journals like Science.
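Why post-hoc subgroup results are so unreliable can be shown with a small simulation. The sketch below is purely illustrative and uses made-up numbers (10 candidate subgroups, 500 students per arm, a 75% completion rate, zero true treatment effect) - none of these figures come from the study. It estimates how often a truly ineffective program produces at least one "significant" subgroup when analysts are free to test many subgroups after the fact.

```python
import math
import random

def two_prop_p(p1, n1, p2, n2):
    """Two-sided p-value for a difference in proportions (normal approximation)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    if se == 0:
        return 1.0
    z = abs(p1 - p2) / se
    # two-sided tail probability via the error function
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

def null_trial(rng, n_subgroups=10, n_per_arm=500, base_rate=0.75):
    """One simulated trial in which treatment truly has ZERO effect everywhere.
    Returns True if at least one subgroup comparison comes out p < 0.05."""
    for _ in range(n_subgroups):
        treat = sum(rng.random() < base_rate for _ in range(n_per_arm))
        ctrl = sum(rng.random() < base_rate for _ in range(n_per_arm))
        p = two_prop_p(treat / n_per_arm, n_per_arm, ctrl / n_per_arm, n_per_arm)
        if p < 0.05:
            return True
    return False

rng = random.Random(0)
n_trials = 2000
hits = sum(null_trial(rng) for _ in range(n_trials))
print(f"Null trials with >=1 'significant' subgroup: {hits / n_trials:.2f}")
```

With 10 independent subgroup tests at the 0.05 level, the expected rate of at least one false positive is roughly 1 - 0.95^10 ≈ 40%, and the simulation lands near that figure. This is why prespecifying subgroups & outcomes - and reporting all of them - matters.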
