Re: Presenting ANOVA Results
- From: Doug Morse <morse@xxxxxxxxx>
- Date: Thu, 17 Jan 2008 07:50:49 +0000 (UTC)
Looks good. No need to report MS(Between) -- it can be derived if one knows
MS(Within/Error) and the F score. Indeed, this highlights why the ANOVA
results are reported the way they are: not everything has to be reported,
because many things can be derived (e.g., effect sizes) as long as the right
"base" information is available (e.g., means and variances or SDs).
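To make the derivation concrete, here is a minimal sketch using the F, MSE, and degrees of freedom quoted later in this thread. The arithmetic is the point, not the data: since F = MS(Between) / MS(Within), MS(Between) falls right out, and an effect size like eta-squared can be derived from F and the dfs alone.

```python
# Reported statistics from the thread: F(2, 2997) = 1666, MSE = 0.0017
F, mse = 1666.0, 0.0017
df_between, df_within = 2, 2997

# F = MS(Between) / MS(Within), so MS(Between) = F * MSE
ms_between = F * mse                  # ~2.83 (the quoted 2.778 differs only by rounding of MSE)

# Eta-squared from F and dfs: SSB / (SSB + SSW) = df_b*F / (df_b*F + df_w)
eta_squared = (df_between * F) / (df_between * F + df_within)   # ~0.526
```

The small gap between ~2.83 and the quoted MSB of 2.778 is just rounding: an MSE of 0.001667 reported as 0.0017.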
Regarding the p value, it technically isn't 0, but some very, very small value
-- beyond the number of significant digits Excel is using. So, p < .001 is
appropriate, or you could even say p < .0001 or p < .00001 to highlight just
how small of a p value was observed.
Bruce makes a good point re: using the word "reliable" -- technically
speaking, the reliability of an estimate has a different, specific meaning. I
agree with Ray's point as well re: SDs being more immediately interpretable
and meaningful. However, I'd still recommend reporting the MSE, for the very
reason discussed above: a reader can derive the entire ANOVA table only if the
"right" base information is provided. So, if desired, report both (i.e., SDs
with your means, MSE with the ANOVA results).
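As a sketch of why the summary statistics suffice: given each group's mean, SD, and n, a reader can rebuild the whole one-way ANOVA table. The group values below are hypothetical placeholders (only the first mean/SD appear in this thread), chosen so the group sizes match the quoted df of (2, 2997).

```python
# Hypothetical (mean, SD, n) per group -- illustrative values, not the poster's data.
groups = [(0.252, 0.021, 1000), (0.210, 0.055, 1000), (0.180, 0.040, 1000)]

k = len(groups)
N = sum(n for _, _, n in groups)
grand_mean = sum(m * n for m, _, n in groups) / N

# Between-groups sum of squares from the means; within-groups from the SDs
ss_between = sum(n * (m - grand_mean) ** 2 for m, _, n in groups)
ss_within = sum((n - 1) * sd ** 2 for _, sd, n in groups)

df_between, df_within = k - 1, N - k
ms_between = ss_between / df_between
mse = ss_within / df_within
F = ms_between / mse
```

Note that with equal group sizes, the MSE is just the average of the group variances (SD squared), which is one way to sanity-check reported SDs against a reported MSE.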
On Wed, 16 Jan 2008 18:07:27 EST, Graham Ashe <knight_armour@xxxxxxxxx> wrote:
Okay, so the corrected presentation of my results looks like this.
"The differences between the mean scores were statistically significant; F(2, 2997) = 1666, p < .001, MSE = 0.0017, using a single-factor analysis of variance."
Did I leave anything out? Doesn't the MSB (2.778) also need to be reported? Also, according to Excel, my P value is "0". Should I be reporting it as <0.001?
Thanks for the tip on reporting SD with my means. I calculated the SD independently for each group and presented it something like this.
Mean = 0.252 (SD 0.021)