Fit Analyses

Statistical Models

The relationship between a response variable and a set of explanatory variables can be studied through a regression model

y_i = f(x_i) + \varepsilon_i

where y_i is the ith observed response value, x_i is the ith vector of explanatory values, and the \varepsilon_i are uncorrelated random variables with zero mean and a common variance.
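
For concreteness, the following is a minimal sketch in Python with NumPy (not SAS/INSIGHT code) that simulates data from this model with a linear f and recovers it by ordinary least squares; the particular f, sample size, and error variance are illustrative assumptions, not part of the original text.

    import numpy as np

    # Hypothetical simulation: f(x) = 2 + 3x, with uncorrelated errors that
    # have zero mean and common variance, matching y_i = f(x_i) + eps_i above.
    rng = np.random.default_rng(0)
    n = 100
    x = rng.uniform(0.0, 10.0, size=n)
    eps = rng.normal(0.0, 1.0, size=n)
    y = 2.0 + 3.0 * x + eps

    # Ordinary least squares fit of the linear parametric model f(x) = b0 + b1*x.
    X = np.column_stack([np.ones(n), x])            # design matrix with intercept
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(beta_hat)                                 # roughly [2, 3]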

If the form of the regression function f is known except for certain parameters, the model is called a parametric regression model. Furthermore, if the regression function is linear in the unknown parameters, the model is called a linear model. For linear models in which the error term \varepsilon_i is assumed to be normally distributed, you can use classical linear models to explore the relationship between the response variable and the explanatory variables.

A nonparametric model generally assumes only that f belongs to some infinite-dimensional collection of functions; for example, f may be assumed to be differentiable with a square-integrable second derivative.

When there is only one explanatory variable X, you can use nonparametric smoothing methods such as smoothing splines, kernel estimators, and local polynomial smoothers. You can also request confidence ellipses and parametric fits (mean, linear regression, and polynomial curves) based on a linear model; these are added to a scatter plot of Y by a single X and are described in the "Fit Curves" section. When there are two explanatory variables in the model, you can create parametric and nonparametric (kernel and thin-plate smoothing spline) response surface plots. With more than two explanatory variables in the model, you can create a parametric profile response surface plot for two selected explanatory variables.
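
As a hedged illustration of one such nonparametric smoother, the sketch below implements a Nadaraya-Watson kernel estimator with a Gaussian kernel in Python/NumPy; the simulated data, bandwidth, and function names are assumptions chosen for illustration and are not taken from SAS/INSIGHT.

    import numpy as np

    def kernel_smoother(x, y, grid, bandwidth):
        # Nadaraya-Watson kernel estimator with a Gaussian kernel:
        # weight each observation by its scaled distance to the grid point,
        # then take the weighted average of the responses.
        w = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / bandwidth) ** 2)
        return (w @ y) / w.sum(axis=1)

    rng = np.random.default_rng(1)
    x = np.sort(rng.uniform(0.0, 2 * np.pi, size=200))
    y = np.sin(x) + rng.normal(0.0, 0.3, size=x.size)   # noisy nonlinear f
    grid = np.linspace(0.0, 2 * np.pi, 50)
    f_hat = kernel_smoother(x, y, grid, bandwidth=0.4)  # smooth estimate of f

A smaller bandwidth tracks the data more closely (less smoothing), while a larger bandwidth gives a smoother but more biased estimate of f.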

When the response y_i has a distribution from the exponential family (normal, inverse Gaussian, gamma, Poisson, binomial), and the mean \mu_i of the response variable is assumed to be related to a linear predictor through a monotone function g,

g(\mu_i) = x_i' \beta

where \beta is a vector of unknown parameters, you can explore the relationship by using generalized linear models.
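
To make the idea concrete, here is a minimal sketch (Python/NumPy, not the SAS/INSIGHT implementation) of a Poisson generalized linear model with the log link, g(\mu_i) = log(\mu_i) = x_i'\beta, fitted by iteratively reweighted least squares; the simulated data and coefficient values are assumptions for illustration.

    import numpy as np

    def poisson_irls(X, y, n_iter=25):
        # Fit g(mu_i) = log(mu_i) = x_i' beta by iteratively reweighted
        # least squares, a standard fitting algorithm for GLMs.
        beta = np.zeros(X.shape[1])
        for _ in range(n_iter):
            eta = X @ beta              # linear predictor
            mu = np.exp(eta)            # inverse of the log link
            z = eta + (y - mu) / mu     # working response
            W = mu                      # IRLS weights for the canonical log link
            XtW = X.T * W               # X' W  (W treated as a diagonal matrix)
            beta = np.linalg.solve(XtW @ X, XtW @ z)
        return beta

    # Hypothetical simulated data: mu = exp(1 + 0.5 x), Poisson responses.
    rng = np.random.default_rng(2)
    n = 500
    x = rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])
    y = rng.poisson(np.exp(1.0 + 0.5 * x))
    print(poisson_irls(X, y))           # roughly [1.0, 0.5]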

