Standard Error When Using Deviation Coding in Linear Regression

I am using deviation coding (sometimes called deviation-from-the-mean coding, ANOVA coding, or deviation effect coding) in my linear model in R. Rather than comparing each level to a reference group, this method compares the mean of the dependent variable for a given level to the grand mean of the dependent variable.

I have a categorical variable with 7 levels. The coding scheme looks like this:

                           [,1] [,2] [,3] [,4] [,5] [,6]
group1                      1    0    0    0    0    0
group2                      0    1    0    0    0    0
group3                      0    0    1    0    0    0
group4                      0    0    0    1    0    0
group5                      0    0    0    0    1    0
group6                      0    0    0    0    0    1
group7                     -1   -1   -1   -1   -1   -1
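
For reference, here is a minimal sketch of how I set this up in R. The data frame df, the factor group, and the outcome y are placeholder names; contr.sum(7) produces exactly the coding matrix above.

    # Hypothetical example data: a 7-level factor and a numeric outcome
    set.seed(1)
    df <- data.frame(
      group = factor(rep(paste0("group", 1:7), each = 20)),
      y     = rnorm(140)
    )

    # Deviation (sum-to-zero) coding; this yields the matrix shown above
    contrasts(df$group) <- contr.sum(7)

    # Fit the linear model; the coefficients for group1..group6 are the
    # deviations of those group means from the grand mean
    fit <- lm(y ~ group, data = df)
    summary(fit)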

The output gives the difference between each group's mean and the grand mean (the regression coefficient, beta) and its standard error, among other statistics.

To find the beta for the final group (group7), I take the negative sum of all the other betas.
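
In R, assuming the fitted model above is called fit, that step looks like this:

    # Coefficients for group1..group6 (drop the intercept)
    b <- coef(fit)[-1]

    # Beta for group7 is the negative sum of the other deviation effects
    b7 <- -sum(b)
    b7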

This all makes sense to me. But here is my question:

How do I calculate the standard error of the beta for the final group (group7)? I have searched the internet and textbooks and cannot find the answer anywhere. Any help would be greatly appreciated.

Thanks in advance.
