Savannah A. Reid, Eric L. Piza, Brandon C. Welsh, and John P. Moylan
Annals of the American Academy of Political and Social Science (2024)
Key Takeaways
- This study analyzed the policy relevance of 151 video surveillance studies using the EMMIE framework
- The analysis measured whether relaxing methodological standards (i.e., including studies without a comparable control group) enhances the policy relevance of research
- Contrary to some critiques, including lower-rigor studies did not improve policy relevance
- Studies using experimental or high-quality quasi-experimental designs consistently provided more valuable policy insights—particularly on moderators, implementation, and economic costs—than weaker designs
- Both rigorous and less rigorous studies scored poorly on articulating or measuring causal mechanisms, highlighting a broader need to strengthen theory use in evaluation research
Research Summary
This article addresses a critical question in the field of evidence-based crime prevention: Can nonexperimental studies—those lacking experimental or quasi-experimental designs—enhance the policy relevance of crime prevention research? The article is motivated by recent criticisms of evidence-based crime prevention, specifically the claim that prioritizing methodological rigor produces research that offers little insight into the aspects of crime prevention that matter most to policymakers.
The study examines this issue using a large database of 151 evaluation studies of public-area video surveillance (commonly referred to as CCTV). The authors drew on the EMMIE framework, which assesses five dimensions of evidence: Effects, Mechanisms, Moderators, Implementation, and Economics. They applied it to both high-rigor (studies using a comparable control condition) and low-rigor (studies lacking a comparable control condition) CCTV evaluations, focusing on the latter four dimensions, which are particularly relevant to policymaking. Each study received a “Q Score” from 0 to 2 on each dimension based on how thoroughly it addressed that dimension. For example, a score of 2 in economics indicates a full cost-benefit analysis, while a 0 indicates no mention of costs.
The analysis revealed a clear pattern: studies with stronger methodological designs consistently provided more policy-relevant information across three of the four dimensions. Specifically, these studies more frequently and more thoroughly reported on contextual moderators, implementation processes, and economic costs. There was no significant difference between high- and low-rigor studies in the “mechanisms” category, suggesting that causal theory and explanation are underdeveloped across the board.
Importantly, the results counter the idea that relaxing methodological rigor would yield a richer or more useful evidence base for practitioners. On the contrary, rigorous studies not only supported stronger causal claims but also better addressed the practical concerns of policymakers, such as how interventions function in context, what resources are required, and what the financial return on investment might be.
The results suggest that weakening methodological standards—such as by incorporating uncontrolled before-and-after studies—risks diminishing the reliability and utility of the evidence. While the authors acknowledge the limitations of a narrow focus on outcomes alone, they emphasize that expanding policy relevance should not come at the expense of internal validity. Instead, the research community should aim for a “second generation” of evidence-based studies that maintain rigor while also incorporating richer contextual and practical information.
Several important recommendations emerge from this study. First, systematic reviews should maintain strict methodological standards to ensure that only high-quality evidence informs crime policy. Second, researchers should improve how they report and analyze moderators, implementation details, and economic outcomes. Third, researchers should develop new platforms and tools—such as policy briefings or practitioner-focused publications—to make rigorous research more accessible and usable in practice.
