
Detailed Information

Regression, a second course in statistics (loaned 6 times)

Material Type
Monograph
Personal Authors
Wonnacott, Thomas H., 1935- ; Wonnacott, Ronald J., joint author.
Title / Statement of Responsibility
Regression, a second course in statistics / Thomas H. Wonnacott, Ronald J. Wonnacott.
Publication
New York : Wiley, c1981.
Physical Description
xix, 556 p. : ill. ; 24 cm.
Series
Wiley series in probability and mathematical statistics.
ISBN
047195974X
General Note
Includes index.
Bibliography Note
Bibliography: p. 545-548.
Subject Heading
Regression analysis.
000 00864camuuu200265 i 4500
001 000000062829
005 19980730141440.0
008 800115s1981 nyua b 00110 eng
010 ▼a 80000271
020 ▼a 047195974X : ▼c $19.95 (est.)
040 ▼a 211009 ▼c 211009
049 1 ▼l 421032799 ▼f 과학 ▼l 421035608 ▼f 과학
050 0 ▼a QA278.2 ▼b .W66
082 0 4 ▼a 519.5/36
090 ▼a 519.536 ▼b W872r
100 1 0 ▼a Wonnacott, Thomas H., ▼d 1935-
245 1 0 ▼a Regression, a second course in statistics / ▼c Thomas H. Wonnacott, Ronald J. Wonnacott.
260 0 ▼a New York : ▼b Wiley, ▼c c1981.
300 ▼a xix, 556 p. : ▼b ill. ; ▼c 24 cm.
490 0 ▼a Wiley series in probability and mathematical statistics.
500 ▼a Includes index.
504 ▼a Bibliography: p. 545-548.
650 0 ▼a Regression analysis.
700 1 0 ▼a Wonnacott, Ronald J., ▼e joint author.

No.  Location                                         Call Number    Registration No.  Status     Due Date
1    Main Library / Education Preservation (Health)   519.536 W872r  141078509         Available  -
2    Science Library / Sci-Info (2nd floor stacks)    519.536 W872r  421032799         Available  -

Contents Information

Table of Contents

CONTENTS
1. Introduction = 1
  1-1. Randomized Experiments = 1
  1-2. Randomized Experiments in the Social Sciences = 6
  1-3. Regression = 8
  1-4. Brief Outline of the Book = 11
2. Simple Regression = 13
  2-1. An Example = 13
  2-2. Possible Criteria for Fitting a Line = 15
  2-3. The Least Squares Solution = 18
  2-4. The Mathematical Regression Model = 25
  2-5. The Mean and Variance of â and b̂ = 29
  2-6. The Gauss-Markov Theorem = 31
  2-7. The Distribution of b̂ = 33
  2-8. Confidence Intervals and Tests for β = 35
  2-9. Interpolation = 42
  2-10. Dangers of Extrapolation = 48
  2-11. Least Squares When X Is Random = 49
  2-12. Maximum Likelihood Estimation (MLE) = 50
    Appendices
    2-A. Linear Transformations = 59
    2-B. Desirable Properties of Estimators = 59
    2-C. Proof of the Gauss-Markov Theorem = 71
    2-D. Maximum Likelihood Estimate of σ² = 73
3. Multiple Regression = 75
  3-1. Introduction = 75
  3-2. The Mathematical Model = 77
  3-3. Least Squares (Maximum Likelihood) Estimation = 79
  3-4. Multicollinearity = 84
  3-5. Confidence Intervals and Statistical Tests = 88
  3-6. How Many Regressors Should Be Retained? = 92
  3-7. Prob-Value = 94
  3-8. Simple and Multiple Regression Compared = 99
4. Multiple Regression Extensions = 104
  4-1. Dummy (0-1) Variables = 104
  4-2. Analysis of Variance (ANOVA) = 115
  4-3. Simplest Nonlinear Regression = 120
  4-4. Nonlinearities Requiring a Transformation = 124
  4-5. Logits to Refine a 0-1 Response = 135
  4-6. Intractable Nonlinearity = 138
  4-7. The Physical and Social Sciences Contrasted = 140
    Appendices
    4-A. MLE for the Logit Model = 147
    4-B. Intractable Nonlinear Regressions Solved by Successive Linear Approximation = 148
5. Correlation = 152
  5-1. Simple Correlation = 152
  5-2. Correlation and Regression = 163
  5-3. Partial and Multiple Correlation = 179
  5-4. Path Analysis = 194
6. Time Series = 208
  A. Changing Variance in the Error = 208
    6-1. Heteroscedasticity = 208
  B. Simple Time Series Decomposition and Forecasting = 213
    6-2. The Components of a Time Series = 213
    6-3. Trend = 214
    6-4. Seasonal = 215
    6-5. Random Tracking (Autocorrelated) Error = 220
    6-6. Forecasting = 222
  C. Serially Correlated Error and Lagged Variables = 226
    6-7. Serial Correlation in the Error = 226
    6-8. Lagged X Variables = 238
    6-9. Serial Correlation in the Dependent Variable = 244
    6-10. Serial Correlation in Both the Error and the Dependent Variable = 245
  D. Box-Jenkins Methods = 248
    6-11. ARIMA Models = 248
    6-12. Estimation and Forecasting = 260
  E. Spectral Analysis = 265
    6-13. Cycles = 265
    6-14. Spectral Analysis = 268
    6-15. Cross-Spectral Analysis = 274
7. Simultaneous Equations, and Other Examples of Correlated Regressor and Error = 278
  7-1. A New Look at OLS = 278
  7-2. Inconsistency of OLS When e and X Correlated = 281
  7-3. IV Extended to Multiple Regression = 283
  7-4. Simultaneous Equations: The Consumption Function = 284
  7-5. Errors in Both Variables = 293
8. The Identification Problem = 301
  8-1. Unidentified Equations = 301
  8-2. Identification Using Prior Information = 304
  8-3. Identification Using Prior Information About Exogenous Variables = 307
  8-4. Requirement for Identification, In General = 309
  8-5. Overidentification = 314
  8-6. Summary: Identification in Context = 317
9. Selected Estimating Techniques = 319
  9-1. Two-Stage Least Squares (2SLS) = 319
  9-2. Other Procedures = 321
  9-3. Recursive Systems = 323
10. Bayesian Inference = 327
  10-1. Posterior Probabilities in General = 327
  10-2. Population Proportion π = 332
  10-3. Mean μ of a Normal Population = 339
  10-4. Bayesian Regression = 347
11. Analysis of Variance (ANOVA) = 352
  11-1. One-Factor ANOVA = 352
  11-2. Two-Factor ANOVA, without Interaction = 369
  11-3. Two-Factor ANOVA, with Interaction = 376
  11-4. Random Effects Models = 394
  11-5. Bayes Adjustments to ANOVA = 399
  11-6. Bayes Adjustments for Simple Regression = 407
    Appendices
    11-A. Proof of the Bayes ANOVA Estimate = 409
    11-B. Proof of the Bayes Regression Estimate = 411
REGRESSION USING MATRICES = 413
  Introduction = 415
12. Multiple Regression Using Matrices = 417
  12-1. Introduction to the General Linear Model = 417
  12-2. Least Squares Estimation(OLS) = 420
  12-3. Maximum Likelihood Estimation (MLE) = 421
  12-4. Distribution of β̂ = 424
  12-5. Confidence Regions and Hypothesis Testing = 426
  12-6. Multicollinearity = 438
  12-7. Interpolation and Prediction = 442
    Appendix
    12-A. Partial Derivatives of Linear and Quadratic Forms = 447
13. Distribution Theory: How the Normal, t, χ², and F Distributions Are Related = 449
  13-1. Introduction = 449
  13-2. χ², The Chi-Square Distribution = 449
  13-3. t Distribution = 456
  13-4. The F Distribution = 459
  13-5. Comparison and Review = 461
14. Vector Geometry = 464
  14-1. The Geometric Interpretation of Vectors = 464
  14-2. Least Squares Fit = 479
  14-3. Orthogonal Regressors = 482
  14-4. ANOVA for Simple Regression = 483
  14-5. The Statistical Model = 484
  14-6. Multicollinearity = 486
  14-7. Correlation and Cos θ = 487
  14-8. Correlation: Simple, Multiple, and Partial = 488
  14-9. Tests When There Are k Regressors = 491
  14-10. Forward Stepwise Regression = 495
15. Other Regression Topics = 501
  15-1. Specification Error = 501
  15-2. Principal Components = 507
Appendix Tables = 515
Answers to Odd-Numbered Problems = 535
Bibliography = 545
Index = 549

