I have a set of studies that I’m trying to meta-analyze with OpenMeta[Analyst]. I have raw means and SDs but would like to analyze standardized mean differences (Cohen’s d) for ease of interpretation (especially because the DV is in squared units).

Here’s the data I entered into OpenMeta[Analyst]… the last 3 columns are the standardized mean differences it calculated, with lower and upper bounds for the CI:
When I try analyzing the same data with a different tool (Meta-Essentials), however, I get:
| Study name | Cohen's d | CI Lower limit | CI Upper limit |
| --- | --- | --- | --- |
| Online | 0.34 | 0.14 | 0.53 |
| College of DuPage | 0.49 | 0.23 | 0.76 |
| Dominican University | 0.30 | 0.00 | 0.60 |
| Concordia University | 0.26 | -0.06 | 0.57 |
Notice that this tool gives considerably higher effect size estimates, especially for the first two studies (0.278 vs 0.34 for the Online study; 0.38 vs 0.49 for the College of DuPage study).
Why is this? I’ve checked using a third online calculator, using both the raw means and the t-test results, and my results always agree with the higher effect size estimates from Meta-Essentials. What’s strange, though, is that in other datasets I’ve analyzed with OpenMeta[Analyst] I get no discrepancy… not sure if there is something special happening here in OpenMeta[Analyst] in terms of pooling variance across all studies to estimate effect size? Any hints?
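For what it's worth, one common source of this kind of discrepancy is that some tools report the small-sample bias-corrected estimate (Hedges' g) while labeling it "standardized mean difference," whereas others report plain Cohen's d; the correction shrinks the estimate, and noticeably so when n is small. This is only a guess as to what OpenMeta[Analyst] is doing, but the two conventions can be compared directly. A minimal Python sketch (the group means, SDs, and sample sizes in the usage example are made up for illustration):

```python
import math

def cohens_d(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d from summary statistics, using the pooled SD of the two groups."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

def hedges_g(d, n1, n2):
    """Apply Hedges' small-sample correction J = 1 - 3/(4*df - 1) to Cohen's d."""
    df = n1 + n2 - 2
    j = 1 - 3 / (4 * df - 1)
    return j * d

# Hypothetical two-group summary data (not from the studies above):
d = cohens_d(10.0, 2.0, 20, 9.0, 2.0, 20)
g = hedges_g(d, 20, 20)
print(d)  # plain Cohen's d
print(g)  # bias-corrected Hedges' g (always slightly smaller)
```

If OpenMeta[Analyst]'s numbers match the `hedges_g` output when you plug in your actual study data, that would explain why its estimates run lower, and why the gap shrinks in your other (presumably larger-n) datasets, since the correction factor approaches 1 as df grows.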
Thanks,
Bob