Re: Presenting ANOVA Results



Hi Graham,

Looks good. No need to report MS(Between) -- it can be derived if one knows
MS(Within/Error) and the F statistic. Indeed, this highlights why ANOVA
results are reported the way they are: not everything has to be reported,
because many things can be derived (e.g., effect sizes) as long as the right
"base" information is available (e.g., means and variances or SDs).

Regarding the p value, it technically isn't 0, but rather some very, very
small value -- beyond the numerical precision Excel works with. So, p < .001
is appropriate, or you could even say p < .0001 or p < .00001 to highlight
just how small a p value was observed.
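
If you're curious just how small, here's a sketch using SciPy (assuming you
have Python handy). For a numerator df of 2, the F survival function has a
closed form, which lets us get the log of the p value even though the value
itself underflows:

    import math
    from scipy import stats

    p = stats.f.sf(1666, 2, 2997)  # 0.0 -- the true value underflows a
    print(p)                       # double, which is why Excel shows "0"

    # For dfn = 2, sf(F) = (1 + 2*F/dfd) ** (-dfd / 2), so:
    log10_p = -(2997 / 2) * math.log10(1 + 2 * 1666 / 2997)
    print(log10_p)                 # roughly -487, i.e., p is around 1e-487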

Bruce makes a good point re: using the word "reliable" -- technically
speaking, the reliability of an estimate has a different, specific meaning.
I also agree with Ray's point that reporting SDs is more immediately
interpretable and meaningful. However, I wouldn't recommend NOT reporting
MSE, for the very reason discussed above: a reader can derive the entire
ANOVA table if the "right" base information is provided. Rather, if desired,
I'd say report both (i.e., SDs with your means, MSE with the ANOVA results).
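
To make that concrete, here's a sketch of how a reader could rebuild the
one-way table from ns, means, and SDs alone. The group values are
hypothetical (only the first group's mean and SD are yours); three groups
of 1,000 just gives the same df as your F(2, 2997):

    # Rebuild a one-way ANOVA table from per-group ns, means, and SDs.
    ns    = [1000, 1000, 1000]
    means = [0.252, 0.300, 0.320]   # hypothetical apart from the first
    sds   = [0.021, 0.040, 0.050]   # hypothetical apart from the first

    k = len(ns)
    N = sum(ns)
    grand_mean = sum(n * m for n, m in zip(ns, means)) / N

    ss_within  = sum((n - 1) * sd ** 2 for n, sd in zip(ns, sds))
    ss_between = sum(n * (m - grand_mean) ** 2 for n, m in zip(ns, means))

    ms_within  = ss_within / (N - k)    # this is the MSE
    ms_between = ss_between / (k - 1)   # this is the MSB
    F = ms_between / ms_within

    print(F, ms_between, ms_within)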

Doug


On Wed, 16 Jan 2008 18:07:27 EST, Graham Ashe <knight_armour@xxxxxxxxx> wrote:
Doug,

Okay, so the corrected presentation of my results looks like this.

"The differences between the mean scores were statistically significant; F (2, 2997) = 1666, p < 0.001, MSE = 0.0017 using an analysis of variance, single factor test."

Did I leave anything out? Doesn't the MSB (2.778) also need to be reported? Also, according to Excel, my p value is "0". Should I be reporting it as < 0.001?


Ray,

Thanks for the tip on reporting SD with my means. I calculated the SD independently for each group and presented it something like this.

Mean = 0.252 (SD 0.021)

