Data and Syntax for Investigating Science Education Effect Sizes: Implications for Power Analyses and Programmatic Decisions
Principal Investigator(s): Susan M. Kowalski, BSCS Science Learning; Joseph A. Taylor, University of Colorado, Colorado Springs
Version: V1
Version Title: Version 1.0
Project Citation:
Kowalski, Susan M., and Taylor, Joseph A. Data and Syntax for Investigating Science Education Effect Sizes: Implications for Power Analyses and Programmatic Decisions. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor], 2019-03-23. https://doi.org/10.3886/E109061V1
Project Description
Summary:
A priori power analyses allow researchers to estimate the number of participants needed to detect the effects of an intervention. However, power analyses are only as valid as the parameter estimates used. One such parameter, the expected effect size, can vary greatly depending on several study characteristics, including the nature of the intervention, the developer of the outcome measure, and the age of the participants. Researchers should understand this variation when designing studies. Our meta-analysis examines the relationship between science education intervention effect sizes and a host of study characteristics, allowing primary researchers to access better estimates of effect sizes for a priori power analyses. The results of this meta-analysis also support programmatic decisions by setting realistic expectations about the typical magnitude of impacts for science education interventions.
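To show how an expected effect size enters an a priori power analysis, the minimal sketch below solves for the per-group sample size needed to detect a hypothesized standardized mean difference with a two-sample t-test, using Python's statsmodels. The effect size (0.30), alpha, and power values are illustrative assumptions, not estimates from this study.

```python
# A priori power analysis: solve for the per-group sample size needed to
# detect an assumed effect size with a two-sample t-test.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.30,  # hypothesized Cohen's d (assumption for illustration)
    alpha=0.05,        # two-sided Type I error rate
    power=0.80,        # desired statistical power
    ratio=1.0,         # equal group sizes
)
print(f"Participants required per group: {n_per_group:.1f}")
```

With a smaller expected effect size, the required sample size grows quickly, which is why realistic effect-size estimates matter for study planning.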
The primary research question of this meta-analysis is: What is the relationship between the magnitude of intervention effects and key study characteristics? The study characteristics of interest included the study design (randomized studies compared to matched quasi-experimental studies), whether the outcome measure was developed by the study authors, who received the intervention (e.g., students only, teachers only, or both students and teachers), the science discipline targeted by the intervention, the treatment provider's role (e.g., researcher or teacher), and the grade level of the students.
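To illustrate one way a relationship between effect sizes and a study characteristic can be examined, the sketch below runs a simple inverse-variance weighted regression of effect sizes on a binary design indicator. This is a simplified fixed-effect sketch with made-up values and hypothetical column names; it is not the model or data from the deposited syntax.

```python
# Simplified inverse-variance weighted meta-regression sketch:
# regress effect sizes on a binary moderator (randomized vs. matched QED).
import pandas as pd
import statsmodels.api as sm

# Hypothetical study-level data: one row per effect size.
studies = pd.DataFrame({
    "effect_size": [0.25, 0.40, 0.10, 0.55, 0.30],  # illustrative values only
    "variance":    [0.02, 0.05, 0.01, 0.08, 0.03],  # sampling variance of each effect
    "randomized":  [1, 0, 1, 0, 1],                 # 1 = randomized, 0 = matched QED
})

X = sm.add_constant(studies[["randomized"]])        # moderator plus intercept
weights = 1.0 / studies["variance"]                 # inverse-variance weights
model = sm.WLS(studies["effect_size"], X, weights=weights).fit()
print(model.params)  # intercept and estimated difference by design
```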
Scope of Project
Subject Terms:
effect size;
meta-analysis;
program evaluation;
science education;
students
Universe:
Manuscripts associated with studies of science education interventions conducted internationally, published in English.
Methodology
Data Source:
Our data sources were manuscripts from 96 primary studies of science education interventions with primary and secondary students (published and unpublished). A complete description of eligibility criteria for included manuscripts is found in the published paper associated with these data.
Weights:
We made several adjustments to the data, including Winsorization of sample sizes, Winsorization of effect sizes, grand-mean centering, and weighting effect sizes by the inverse of their sampling variance.
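As a rough illustration of the kinds of adjustments listed above, the sketch below Winsorizes effect sizes, grand-mean centers a continuous moderator, and computes inverse-variance weights. The column names, cutoffs, and values are assumptions for illustration, not the study's actual variables or limits.

```python
# Illustrative data-adjustment steps: Winsorization, grand-mean centering,
# and inverse-variance weighting.
import pandas as pd
from scipy.stats.mstats import winsorize

# Hypothetical effect-size data; values and column names are illustrative.
df = pd.DataFrame({
    "effect_size": [0.10, 0.30, 2.50, 0.40, -1.80],
    "variance":    [0.02, 0.04, 0.10, 0.03, 0.05],
    "grade":       [4, 7, 9, 11, 5],
})

# Winsorize effect sizes at the 5th/95th percentiles (assumed cutoffs).
df["effect_size_w"] = winsorize(df["effect_size"].to_numpy(), limits=[0.05, 0.05])

# Grand-mean center a continuous moderator.
df["grade_c"] = df["grade"] - df["grade"].mean()

# Weight each effect size by the inverse of its sampling variance.
df["weight"] = 1.0 / df["variance"]
```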
Unit(s) of Observation:
effect sizes
This material is distributed exactly as it arrived from the data depositor. ICPSR has not checked or processed this material. Users should consult the investigator(s) if further information is desired.