Credit card billing discounts (easy payment)
  • Interpark Lotte Card 5%: 64,600 KRW (max discount 100,000 KRW; requires 400,000 KRW prior-month spending)
  • Bookpinion Lotte Card 30%: 47,600 KRW (max discount 30,000 KRW; on payments of 30,000 KRW or more)
  • NH Shopping & Interpark Card 20%: 54,400 KRW (max discount 40,000 KRW; on payments of 20,000 KRW or more)
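The card prices above follow from the 68,000 KRW list price, each card's discount rate, and its discount cap (minimum-spend eligibility aside). Below is a minimal sketch of that arithmetic in Python; the rates, caps, prices, and 3% point accrual are taken from this listing, while the function name and structure are purely illustrative and not part of any bookstore API.

```python
# Reproduces the discounted prices and point accrual shown in this listing.
# Minimum-spend / prior-month-spend eligibility conditions are not modeled.
LIST_PRICE = 68_000  # KRW

def discounted_price(price: int, rate: float, cap: int) -> int:
    """Percentage billing discount, limited by the card's discount cap."""
    discount = min(round(price * rate), cap)
    return price - discount

CARDS = [
    ("Interpark Lotte Card", 0.05, 100_000),         # expected 64,600 KRW
    ("Bookpinion Lotte Card", 0.30, 30_000),         # expected 47,600 KRW
    ("NH Shopping & Interpark Card", 0.20, 40_000),  # expected 54,400 KRW
]

for name, rate, cap in CARDS:
    print(f"{name}: {discounted_price(LIST_PRICE, rate, cap):,} KRW")

print(f"Points: {round(LIST_PRICE * 0.03):,}P")  # 3% accrual -> 2,040P
```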

Introduction to Linear Regression Analysis : [Hardcover]

Eligible for income tax deduction


List price: 68,000 KRW
Points: 2,040P (3% accrual)

Shipping
  • Expected to dispatch by Tue, 5/7 (ships from 512 Samseong-ro, Gangnam-gu, Seoul)
  • Free shipping

Table of Contents

Preface xiii

About the Companion Website xvi

1. Introduction 1

1.1 Regression and Model Building 1

1.2 Data Collection 5

1.3 Uses of Regression 9

1.4 Role of the Computer 10

2. Simple Linear Regression 12

2.1 Simple Linear Regression Model 12

2.2 Least-Squares Estimation of the Parameters 13

2.2.1 Estimation of β0 and β1 13

2.2.2 Properties of the Least-Squares Estimators and the Fitted Regression Model 18

2.2.3 Estimation of σ² 20

2.2.4 Alternate Form of the Model 22

2.3 Hypothesis Testing on the Slope and Intercept 22

2.3.1 Use of t Tests 22

2.3.2 Testing Significance of Regression 24

2.3.3 Analysis of Variance 25

2.4 Interval Estimation in Simple Linear Regression 29

2.4.1 Confidence Intervals on β0, β1, and σ² 29

2.4.2 Interval Estimation of the Mean Response 30

2.5 Prediction of New Observations 33

2.6 Coefficient of Determination 35

2.7 A Service Industry Application of Regression 37

2.8 Does Pitching Win Baseball Games? 39

2.9 Using SAS® and R for Simple Linear Regression 41

2.10 Some Considerations in the Use of Regression 44

2.11 Regression Through the Origin 46

2.12 Estimation by Maximum Likelihood 52

2.13 Case Where the Regressor x is Random 53

2.13.1 x and y Jointly Distributed 54

2.13.2 x and y Jointly Normally Distributed: Correlation Model 54

Problems 59

3. Multiple Linear Regression 69

3.1 Multiple Regression Models 69

3.2 Estimation of the Model Parameters 72

3.2.1 Least-Squares Estimation of the Regression Coefficients 72

3.2.2 Geometrical Interpretation of Least Squares 79

3.2.3 Properties of the Least-Squares Estimators 81

3.2.4 Estimation of σ² 82

3.2.5 Inadequacy of Scatter Diagrams in Multiple Regression 84

3.2.6 Maximum-Likelihood Estimation 85

3.3 Hypothesis Testing in Multiple Linear Regression 86

3.3.1 Test for Significance of Regression 86

3.3.2 Tests on Individual Regression Coefficients and Subsets of Coefficients 90

3.3.3 Special Case of Orthogonal Columns in X 95

3.3.4 Testing the General Linear Hypothesis 97

3.4 Confidence Intervals in Multiple Regression 99

3.4.1 Confidence Intervals on the Regression Coefficients 100

3.4.2 CI Estimation of the Mean Response 101

3.4.3 Simultaneous Confidence Intervals on Regression Coefficients 102

3.5 Prediction of New Observations 106

3.6 A Multiple Regression Model for the Patient Satisfaction Data 106

3.7 Does Pitching and Defense Win Baseball Games? 108

3.8 Using SAS and R for Basic Multiple Linear Regression 110

3.9 Hidden Extrapolation in Multiple Regression 111

3.10 Standardized Regression Coefficients 115

3.11 Multicollinearity 121

3.12 Why Do Regression Coefficients Have the Wrong Sign? 123

Problems 125

4. Model Adequacy Checking 134

4.1 Introduction 134

4.2 Residual Analysis 135

4.2.1 Definition of Residuals 135

4.2.2 Methods for Scaling Residuals 135

4.2.3 Residual Plots 141

4.2.4 Partial Regression and Partial Residual Plots 148

4.2.5 Using Minitab®, SAS, and R for Residual Analysis 151

4.2.6 Other Residual Plotting and Analysis Methods 154

4.3 PRESS Statistic 156

4.4 Detection and Treatment of Outliers 157

4.5 Lack of Fit of the Regression Model 161

4.5.1 A Formal Test for Lack of Fit 161

4.5.2 Estimation of Pure Error from Near Neighbors 165

Problems 170

5. Transformations and Weighting To Correct Model Inadequacies 177

5.1 Introduction 177

5.2 Variance-Stabilizing Transformations 178

5.3 Transformations to Linearize the Model 182

5.4 Analytical Methods for Selecting a Transformation 188

5.4.1 Transformations on y: The Box-Cox Method 188

5.4.2 Transformations on the Regressor Variables 190

5.5 Generalized and Weighted Least Squares 194

5.5.1 Generalized Least Squares 194

5.5.2 Weighted Least Squares 196

5.5.3 Some Practical Issues 197

5.6 Regression Models with Random Effects 200

5.6.1 Subsampling 200

5.6.2 The General Situation for a Regression Model with a Single Random Effect 204

5.6.3 The Importance of the Mixed Model in Regression 208

Problems 208

6. Diagnostics For Leverage and Influence 217

6.1 Importance of Detecting Influential Observations 217

6.2 Leverage 218

6.3 Measures of Influence: Cook's D 221

6.4 Measures of Influence: DFFITS and DFBETAS 223

6.5 A Measure of Model Performance 225

6.6 Detecting Groups of Influential Observations 226

6.7 Treatment of Influential Observations 226

Problems 227

7. Polynomial Regression Models 230

7.1 Introduction 230

7.2 Polynomial Models in One Variable 230

7.2.1 Basic Principles 230

7.2.2 Piecewise Polynomial Fitting (Splines) 236

7.2.3 Polynomial and Trigonometric Terms 242

7.3 Nonparametric Regression 243

7.3.1 Kernel Regression 244

7.3.2 Locally Weighted Regression (Loess) 244

7.3.3 Final Cautions 249

7.4 Polynomial Models in Two or More Variables 249

7.5 Orthogonal Polynomials 255

Problems 261

8. Indicator Variables 268

8.1 General Concept of Indicator Variables 268

8.2 Comments on the Use of Indicator Variables 281

8.2.1 Indicator Variables versus Regression on Allocated Codes 281

8.2.2 Indicator Variables as a Substitute for a Quantitative Regressor 282

8.3 Regression Approach to Analysis of Variance 283

Problems 288

9. Multicollinearity 293

9.1 Introduction 293

9.2 Sources of Multicollinearity 294

9.3 Effects of Multicollinearity 296

9.4 Multicollinearity Diagnostics 300

9.4.1 Examination of the Correlation Matrix 300

9.4.2 Variance Inflation Factors 304

About the Authors

Vining, G. Geoffrey [Author]
Peck, Elizabeth A. [Author]
Douglas C. Montgomery [Author]

No introductions are available for these authors.


