Stats F-Test - Statistics: Lecture Notes
Course: Applied Statistics I
Institution: Texas Tech University

Stats: F-Test

Definitions

F-distribution - The ratio of two independent chi-square variables, each divided by its respective degrees of freedom.
● If the population variances are equal, this simplifies to the ratio of the sample variances.
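A LaTeX restatement of the definition may help; the symbols d_1, d_2 (degrees of freedom) and s_1^2, s_2^2 (sample variances) are introduced here for illustration. If \chi^2_1 and \chi^2_2 are independent chi-square variables with d_1 and d_2 degrees of freedom, then

\[
F = \frac{\chi^2_1 / d_1}{\chi^2_2 / d_2},
\]

and for two normal populations with equal variances (\sigma_1^2 = \sigma_2^2) this reduces to the ratio of the sample variances:

\[
F = \frac{s_1^2 / \sigma_1^2}{s_2^2 / \sigma_2^2} = \frac{s_1^2}{s_2^2}.
\]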

Analysis of Variance (ANOVA) - A technique used to test a hypothesis concerning the means of three or more populations.

One-Way Analysis of Variance - Analysis of Variance when there is only one independent variable.
● The null hypothesis is that all population means are equal; the alternative hypothesis is that at least one mean is different.
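Stated symbolically, with \mu_1, \dots, \mu_k denoting the k population means (notation introduced here), the hypotheses are

\[
H_0: \mu_1 = \mu_2 = \cdots = \mu_k, \qquad H_1: \text{at least one } \mu_i \text{ differs}.
\]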

Between Group Variation - The variation due to the interaction between the samples, denoted SS(B) for Sum of Squares Between groups.
● If the sample means are close to each other (and therefore to the Grand Mean), this will be small.
● There are k samples involved, with one data value for each sample (the sample mean), so there are k - 1 degrees of freedom.
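One common way to write the between-group sum of squares, with n_i and \bar{x}_i denoting the i-th sample size and sample mean and \bar{x}_{GM} the Grand Mean (notation introduced here):

\[
SS(B) = \sum_{i=1}^{k} n_i \left( \bar{x}_i - \bar{x}_{GM} \right)^2, \qquad df = k - 1.
\]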

Between Group Variance - The variance due to the interaction between the samples, denoted MS(B) for Mean Square Between groups.
● This is the between group variation divided by its degrees of freedom.
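In symbols, using the quantities above:

\[
MS(B) = \frac{SS(B)}{k - 1}.
\]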

Within Group Variation - The variation due to differences within individual samples, denoted SS(W) for Sum of Squares Within groups.
● Each sample is considered independently; no interaction between samples is involved.
● The degrees of freedom is equal to the sum of the individual degrees of freedom for each sample.
● Since each sample has degrees of freedom equal to one less than its sample size, and there are k samples, the total degrees of freedom is k less than the total sample size: df = N - k.
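Written out, with x_{ij} denoting the j-th observation in the i-th sample, s_i^2 the i-th sample variance, and N the total sample size (notation introduced here):

\[
SS(W) = \sum_{i=1}^{k} \sum_{j=1}^{n_i} \left( x_{ij} - \bar{x}_i \right)^2 = \sum_{i=1}^{k} (n_i - 1) s_i^2, \qquad df = N - k.
\]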

Within Group Variance - The variance due to the differences within individual samples, denoted MS(W) for Mean Square Within groups.
● This is the within group variation divided by its degrees of freedom.
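In symbols, MS(W) = SS(W) / (N - k). The one-way ANOVA test statistic is then the ratio F = MS(B) / MS(W), with k - 1 and N - k degrees of freedom. The Python sketch below ties the pieces together; the sample data and variable names are made up for illustration, and scipy's built-in routine is used only as a cross-check.

```python
import numpy as np
from scipy import stats

# Three hypothetical samples (one per group); values are made up for illustration.
groups = [
    np.array([4.0, 5.0, 6.0, 5.5]),
    np.array([6.5, 7.0, 7.5, 6.0]),
    np.array([5.0, 4.5, 6.5, 5.5]),
]

k = len(groups)                      # number of samples
N = sum(len(g) for g in groups)      # total sample size
grand_mean = np.concatenate(groups).mean()

# Between-group variation: SS(B) = sum of n_i * (xbar_i - grand mean)^2, df = k - 1
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)

# Within-group variation: SS(W) = sum over samples of sum of (x - xbar_i)^2, df = N - k
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)

ms_between = ss_between / (k - 1)    # MS(B)
ms_within = ss_within / (N - k)      # MS(W)
f_stat = ms_between / ms_within      # F = MS(B) / MS(W)
p_value = stats.f.sf(f_stat, k - 1, N - k)

print(f"SS(B)={ss_between:.3f}  SS(W)={ss_within:.3f}  F={f_stat:.3f}  p={p_value:.4f}")

# Cross-check against scipy's one-way ANOVA.
print(stats.f_oneway(*groups))
```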

Scheffé Test - A test used to find where the differences between means lie when the Analysis of Variance indicates the means are not all equal.
● The Scheffé test is generally used when the sample sizes are different.
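One common textbook form of the Scheffé comparison (notation introduced here) compares the means of samples i and j with

\[
F_s = \frac{\left( \bar{x}_i - \bar{x}_j \right)^2}{MS(W) \left( \frac{1}{n_i} + \frac{1}{n_j} \right)},
\]

declaring a significant difference when F_s exceeds (k - 1) times the critical F value used in the ANOVA.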

Tukey Test - A test used to find where the differences between the means lie when the Analysis of Variance indicates the means are not all equal.
● The Tukey test is generally used when the sample sizes are all the same.
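One common textbook form of the Tukey comparison, for k samples of equal size n (notation introduced here), uses the studentized range statistic

\[
q = \frac{\bar{x}_i - \bar{x}_j}{\sqrt{MS(W) / n}},
\]

compared against a critical value from the studentized range (q) distribution.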

Two-Way Analysis of Variance - An extension to the one-way analysis of variance.
● There are two independent variables.
● There are three sets of hypotheses with the two-way ANOVA.
● The first null hypothesis is that there is no interaction between the two factors.
● The second null hypothesis is that the population means of the first factor are equal.
● The third null hypothesis is that the population means of the second factor are equal.
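In model terms (a sketch, with factor-A effects \alpha_i, factor-B effects \beta_j, and interaction terms (\alpha\beta)_{ij}; notation introduced here), the three null hypotheses can be written

\[
H_0^{AB}: (\alpha\beta)_{ij} = 0 \text{ for all } i, j, \qquad
H_0^{A}: \alpha_i = 0 \text{ for all } i, \qquad
H_0^{B}: \beta_j = 0 \text{ for all } j.
\]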

Factors - The two independent variables in a two-way ANOVA.

Treatment Groups - Groups formed by making all possible combinations of the two factors.
● For example, if the first factor has 3 levels and the second factor has 2 levels, then there will be 3 × 2 = 6 different treatment groups.
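A minimal Python sketch of a two-way ANOVA on the 3 × 2 layout described above, using statsmodels; the data, column names, and factor labels are made up for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)

# Hypothetical 3 x 2 design: factor A has 3 levels and factor B has 2 levels,
# giving 3 x 2 = 6 treatment groups with 4 observations each.
rows = []
for a in ["a1", "a2", "a3"]:
    for b in ["b1", "b2"]:
        for y in rng.normal(loc=10.0, scale=1.0, size=4):
            rows.append({"A": a, "B": b, "y": y})
data = pd.DataFrame(rows)

# 'C(A) * C(B)' expands to the two main effects plus the A:B interaction,
# so the ANOVA table tests all three null hypotheses listed above.
model = smf.ols("y ~ C(A) * C(B)", data=data).fit()
print(anova_lm(model, typ=2))
```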

Interaction Effect - The effect one factor has on the other factor.

Main Effect - The effects of the independent variables...

