anova(lm.obj, ...)
summary(lm.obj, correlation = FALSE)
predict(lm.obj, newdata = model.frame(lm.obj), conf.level = 0.95,
        tol.level = conf.level)
coefficients(lm.obj)
deviance(lm.obj)
df.residual(lm.obj)
effects(lm.obj)
fitted.values(lm.obj)
residuals(lm.obj)
weights(lm.obj)
print.summary.lm(summary.lm.obj, digits = max(3, .Options$digits - 3),
                 symbolic.cor = p > 4, signif.stars = TRUE, ...)
lm.obj: an object of class lm, usually the result of a call to
lm(..).
These functions are methods for class lm or
summary.lm and anova.lm objects.
print.summary.lm tries to be smart about formatting the
coefficients, standard errors, etc. and additionally gives ``significance
stars'' if signif.stars = TRUE.
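For illustration, a small sketch (with simulated x and y, not part of the documented example) of how the digits and signif.stars arguments affect the printed output:

x <- rnorm(20); y <- 2 + 3 * x + rnorm(20)   ## simulated data, for illustration only
s <- summary(lm(y ~ x))
print(s)                                     ## default formatting, with significance stars
print(s, digits = 3, signif.stars = FALSE)   ## fewer digits, no ``significance stars''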
summary.lm computes and returns a list of summary
statistics of the fitted linear model given in lm.obj, using
the components (list elements) "call", "terms", and
"residuals" from its argument, plus the following:
coefficients: a p x 4 matrix with columns for the estimated
coefficient, its standard error, the t-statistic and the
corresponding (two-sided) p-value.

sigma: the square root of the estimated variance of the random
error, sigma^2 = 1/(n-p) Sum(R[i]^2), where R[i] is the i-th
residual, residuals[i].

df: degrees of freedom, a 3-vector (p, n-p, p*) ...

fstatistic: a 3-vector with the value of the F-statistic and its
numerator and denominator degrees of freedom.

r.squared: R^2, the ``fraction of variance explained by the
model'', R^2 = 1 - Sum(R[i]^2) / Sum((y[i] - y*)^2), where y* is
the mean of y[i] if there is an intercept and zero otherwise.

adj.r.squared: the above R^2 statistic ``adjusted'', penalizing
for higher p.

cov.unscaled: a p x p matrix of (unscaled) covariances of the
coef[j], j = 1, ..., p.

correlation: the correlation matrix corresponding to the above
cov.unscaled, if correlation = TRUE is specified.
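As an illustration (a small sketch using simulated x and y, not part of the documented example), the returned components can be extracted directly and compared with the formulas above:

x <- rnorm(20); y <- 1 + 2 * x + rnorm(20)    ## simulated data, for illustration only
s <- summary(fit <- lm(y ~ x), correlation = TRUE)
s$coefficients                                ## the p x 4 coefficient matrix
s$sigma^2                                     ## estimated error variance ...
sum(resid(fit)^2) / df.residual(fit)          ## ... = 1/(n-p) Sum(R[i]^2)
s$r.squared                                   ## R^2 ...
1 - sum(resid(fit)^2) / sum((y - mean(y))^2)  ## ... the formula when an intercept is present
s$sigma^2 * s$cov.unscaled                    ## scaled covariance matrix of the coefficients
s$correlation                                 ## returned since correlation = TRUE was given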
The function anova produces an analysis of variance
(anova) table. The generic accessor functions
coefficients, effects, fitted.values and
residuals can be used to extract various useful features of the
value returned by lm.
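For example (a minimal sketch with simulated data, not taken from the documented example):

x1 <- rnorm(30); x2 <- rnorm(30)              ## simulated data, for illustration only
y <- x1 + 0.5 * x2 + rnorm(30)
fit <- lm(y ~ x1 + x2)
anova(fit)                 ## sequential analysis of variance table
coefficients(fit)          ## estimated coefficients (same as coef(fit))
head(fitted.values(fit))   ## fitted values
head(residuals(fit))       ## residuals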
lm, anova for the ANOVA table,
coefficients, effects, fitted.values,
glm for generalized linear models,
lm.influence for regression diagnostics,
residuals, summary.
##-- Continuing the lm(.) example:
coef(lm.D90)  # the bare coefficients
sld90 <- summary(lm.D90 <- lm(weight ~ group - 1))  # omitting intercept
sld90
coef(sld90)   # much more

## The 2 basic regression diagnostic plots:
plot(resid(lm.D90), fitted(lm.D90))  # Tukey-Anscombe's
abline(h = 0, lty = 2, col = 'gray')
qqnorm(residuals(lm.D90))

## Predictions
x <- rnorm(15)
y <- x + rnorm(15)
predict(lm(y ~ x))
predict(lm(y ~ x), data.frame(x = seq(-3, 3, 0.1)), .99, .90)