
Applied linear regression models

Author: Neter, John; Wasserman, William; Kutner, Michael H.
Publisher: Irwin, 1983.
Language: English
Description: 547 p. : graphs ; 24 cm.
ISBN: 0256025479
Type of document: Book
Bibliography/Index: Includes index
Item type: Book
Current location: Europe Campus, Main Collection
Collection: Print
Call number: QA278.2 .N46 1983
Barcode: 000173413
Status: Available
Total holds: 0


Digitized

Applied Linear Regression Models: Contents

1. Some basic results in probability and statistics, 1
   1.1 Summation and product operators, 1
   1.2 Probability, 2
   1.3 Random variables, 3
   1.4 Normal probability distribution and related distributions, 6
   1.5 Statistical estimation, 9
   1.6 Inferences about population mean--Normal population, 10
   1.7 Comparisons of two population means--Normal populations, 13
   1.8 Inferences about population variance--Normal population, 16
   1.9 Comparisons of two population variances--Normal populations, 17

Part I. Basic regression analysis, 21

2. Linear regression with one independent variable, 23
   2.1 Relations between variables, 23
   2.2 Regression models and their uses, 26
   2.3 Regression model with distribution of error terms unspecified, 31
   2.4 Estimation of regression function, 35
   2.5 Estimation of error terms variance σ², 46
   2.6 Normal error regression model, 48
   2.7 Computer inputs and outputs, 51

3. Inferences in regression analysis, 60
   3.1 Inferences concerning β1, 60
   3.2 Inferences concerning β0, 68
   3.3 Some considerations on making inferences concerning β0 and β1, 70
   3.4 Interval estimation of E(Yh), 72
   3.5 Prediction of new observation, 76
   3.6 Considerations in applying regression analysis, 82
   3.7 Case when X is random, 83
   3.8 Analysis of variance approach to regression analysis, 84
   3.9 Descriptive measures of association between X and Y in regression model, 96
   3.10 Computer output, 99

4. Aptness of model and remedial measures, 109
   4.1 Residuals, 109
   4.2 Graphic analysis of residuals, 111
   4.3 Tests involving residuals, 122
   4.4 F test for lack of fit, 123
   4.5 Remedial measures, 132
   4.6 Transformations, 134

5. Simultaneous inferences and other topics in regression analysis-I, 147
   5.1 Joint estimation of β0 and β1, 147
   5.2 Confidence band for regression line, 154
   5.3 Simultaneous estimation of mean responses, 157
   5.4 Simultaneous prediction intervals for new observations, 159
   5.5 Regression through the origin, 160
   5.6 Effect of measurement errors, 164
   5.7 Weighted least squares, 167
   5.8 Inverse predictions, 172
   5.9 Choice of X levels, 175

Part II. General regression and correlation analysis, 183

6. Matrix approach to simple regression analysis, 185
   6.1 Matrices, 185
   6.2 Matrix addition and subtraction, 190
   6.3 Matrix multiplication, 192
   6.4 Special types of matrices, 196
   6.5 Linear dependence and rank of matrix, 199
   6.6 Inverse of a matrix, 200
   6.7 Some basic theorems for matrices, 204
   6.8 Random vectors and matrices, 205
   6.9 Simple linear regression model in matrix terms, 208
   6.10 Least squares estimation of regression parameters, 210
   6.11 Analysis of variance results, 212
   6.12 Inferences in regression analysis, 216
   6.13 Weighted least squares, 219
   6.14 Residuals, 220

7. Multiple regression-I, 226
   7.1 Multiple regression models, 226
   7.2 General linear regression model in matrix terms, 237
   7.3 Least squares estimators, 238
   7.4 Analysis of variance results, 239
   7.5 Inferences about regression parameters, 242
   7.6 Inferences about mean response, 244
   7.7 Predictions of new observations, 246
   7.8 An example-Multiple regression with two independent variables, 247
   7.9 Standardized regression coefficients, 261
   7.10 Weighted least squares, 263

8. Multiple regression-II, 271
   8.1 Multicollinearity and its effects, 271
   8.2 Decomposition of SSR into extra sums of squares, 282
   8.3 Coefficients of partial determination, 286
   8.4 Testing hypotheses concerning regression coefficients in multiple regression, 289
   8.5 Matrix formulation of general linear test, 293

9. Polynomial regression, 300
   9.1 Polynomial regression models, 300
   9.2 Example 1-One independent variable, 305
   9.3 Example 2-Two independent variables, 313
   9.4 Estimating the maximum or minimum of a quadratic regression function, 317
   9.5 Some further comments on polynomial regression, 319

10. Indicator variables, 328
   10.1 One independent qualitative variable, 328
   10.2 Model containing interaction effects, 335
   10.3 More complex models, 339
   10.4 Other uses of independent indicator variables, 343
   10.5 Some considerations in using independent indicator variables, 351
   10.6 Dependent indicator variable, 354
   10.7 Linear regression with dependent indicator variable, 357
   10.8 Logistic response function, 361

11. Multicollinearity, influential observations, and other topics in regression analysis-II, 377
   11.1 Reparameterization to improve computational accuracy, 377
   11.2 Problems of multicollinearity, 382
   11.3 Variance inflation factors and other methods of detecting multicollinearity, 390
   11.4 Ridge regression and other remedial measures for multicollinearity, 393
   11.5 Identification of outlying observations, 400
   11.6 Identification of influential observations and remedial measures, 407

12. Selection of independent variables, 417
   12.1 Nature of problem, 417
   12.2 Example, 419
   12.3 All possible regression models, 421
   12.4 Stepwise regression, 430
   12.5 Selection of variables with ridge regression, 436
   12.6 Implementation of selection procedures, 436

13. Autocorrelation in time series data, 444
   13.1 Problems of autocorrelation, 445
   13.2 First-order autoregressive error model, 448
   13.3 Durbin-Watson test for autocorrelation, 450
   13.4 Remedial measures for autocorrelation, 454

14. Nonlinear regression, 466
   14.1 Linear, intrinsically linear, and nonlinear regression models, 466
   14.2 Example, 469
   14.3 Least squares estimation in nonlinear regression, 470
   14.4 Inferences about nonlinear regression parameters, 480
   14.5 Learning curve example, 483

15. Normal correlation models, 491
   15.1 Distinction between regression and correlation models, 491
   15.2 Bivariate normal distribution, 492
   15.3 Conditional inferences, 496
   15.4 Inferences on ρ12, 501
   15.5 Multivariate normal distribution, 505

Appendix tables, 515
SENIC data set, 533
SMSA data set, 537
Index, 543
