Simple pay / credit card billing discounts
Interpark Lotte Card 5% (313,410 won)
(max discount 100,000 won / requires 400,000 won previous-month usage)
Bookpinion Lotte Card 30% (299,900 won)
(max discount 30,000 won / on payments of 30,000 won or more)
NH Shopping & Interpark Card 20% (289,900 won)
(max discount 40,000 won / on payments of 20,000 won or more)
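The capped offers above appear to be just the list price minus min(rate × price, cap). A minimal sketch of that arithmetic, assuming straightforward rounding to the nearest won (the helper name is illustrative, not from the listing):

```python
def discounted_price(list_price: int, rate: float, cap: int) -> int:
    # Card billing discount: rate applied to the list price,
    # but the discount never exceeds the offer's cap.
    discount = min(round(list_price * rate), cap)
    return list_price - discount

# The 30% offer capped at 30,000 won and the 20% offer capped at
# 40,000 won, matching the 299,900 and 289,900 won prices shown above.
print(discounted_price(329_900, 0.30, 30_000))
print(discounted_price(329_900, 0.20, 40_000))
```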

Forecasting, Time Series, and Regression, 4/e (Hardcover)

Eligible for income-tax deduction

Cumulative figures since September 9, 2013.

Sales Index: 27

What is the Sales Index?
An independent sales index for Interpark Books, based on sales volume on the site and reflecting sales trends. Because it weights currently best-selling titles more heavily, it may differ somewhat from actual cumulative sales. It also incorporates various weights beyond sales volume, so it is useful for spotting recently trending books. The index is updated daily.
List price

329,900 won

  • 329,900 won

    9,900P (3% points reward)

Discount benefits
Points benefits
  • S-Points are credited only when you confirm the purchase yourself on My Page.
Additional benefits
Shipping information
  • Expected to ship by Friday, 4/26 (512 Samseong-ro, Gangnam-gu, Seoul)
  • Free shipping
Order quantity

Publisher's Review

Awarded Outstanding Academic Book by CHOICE magazine in its first edition, FORECASTING, TIME SERIES, AND REGRESSION: AN APPLIED APPROACH now appears in a fourth edition that illustrates the vital importance of forecasting and the various statistical techniques that can be used to produce forecasts. With an emphasis on applications, this book provides both the conceptual development and practical motivation students need to effectively implement forecasts of their own. Bruce Bowerman, Richard O'Connell, and Anne Koehler clearly demonstrate the necessity of using forecasts to make intelligent decisions in marketing, finance, personnel management, production scheduling, process control, and strategic management. In addition, new technology coverage makes the latest edition the most applied text available on the market.

Table of Contents

Introduction
An introduction to forecasting (p. 1)
Basic statistical concepts (p. 27)
Regression analysis
Simple linear regression (p. 79)
Multiple linear regression (p. 139)
Model building and residual analysis (p. 221)
Time series regression, decomposition methods, and exponential smoothing
Time series regression (p. 279)
Decomposition methods (p. 325)
Exponential smoothing (p. 345)
The Box-Jenkins methodology
Nonseasonal Box-Jenkins models and their tentative identification (p. 401)
Estimation, diagnostic checking, and forecasting for nonseasonal Box-Jenkins models (p. 449)
Box-Jenkins seasonal modeling (p. 489)
Advanced Box-Jenkins modeling (p. 539)
Table of Contents provided by Blackwell. All Rights Reserved.

Book Description

Part I: INTRODUCTION AND REVIEW OF BASIC STATISTICS. 1. An Introduction to Forecasting. Forecasting and Data. Forecasting Methods. Errors in Forecasting. Choosing a Forecasting Technique. An Overview of Quantitative Forecasting Techniques. 2. Basic Statistical Concepts. Populations. Probability. Random Samples and Sample Statistics. Continuous Probability Distributions. The Normal Probability Distribution. The t-Distribution, the F-Distribution, the Chi-Square Distribution. Confidence Intervals for a Population Mean. Hypothesis Testing for a Population Mean. Exercises.

Part II: REGRESSION ANALYSIS. 3. Simple Linear Regression. The Simple Linear Regression Model. The Least Squares Point Estimates. Point Estimates and Point Predictions. Model Assumptions and the Standard Error. Testing the Significance of the Slope and y Intercept. Confidence and Prediction Intervals. Simple Coefficients of Determination and Correlation. An F Test for the Model. Exercises. 4. Multiple Linear Regression. The Linear Regression Model. The Least Squares Estimates, and Point Estimation and Prediction. The Mean Square Error and the Standard Error. Model Utility: R2, Adjusted R2, and the Overall F Test. Testing the Significance of an Independent Variable. Confidence and Prediction Intervals. The Quadratic Regression Model. Interaction. Using Dummy Variables to Model Qualitative Independent Variables. The Partial F Test: Testing the Significance of a Portion of a Regression Model. Exercises. 5. Model Building and Residual Analysis. Model Building and the Effects of Multicollinearity. Residual Analysis in Simple Regression. Residual Analysis in Multiple Regression. Diagnostics for Detecting Outlying and Influential Observations. Exercises.

Part III: TIME SERIES REGRESSION, DECOMPOSITION METHODS, AND EXPONENTIAL SMOOTHING. 6. Time Series Regression. Modeling Trend by Using Polynomial Functions. Detecting Autocorrelation. Types of Seasonal Variation. Modeling Seasonal Variation by Using Dummy Variables and Trigonometric Functions. Growth Curves. Handling First-Order Autocorrelation. Exercises. 7. Decomposition Methods. Multiplicative Decomposition. Additive Decomposition. The X-12-ARIMA Seasonal Adjustment Method. Exercises. 8. Exponential Smoothing. Simple Exponential Smoothing. Tracking Signals. Holt's Trend Corrected Exponential Smoothing. Holt-Winters Methods. Damped Trends and Other Exponential Smoothing Methods. Models for Exponential Smoothing and Prediction Intervals. Exercises.

Part IV: THE BOX-JENKINS METHODOLOGY. 9. Nonseasonal Box-Jenkins Models and Their Tentative Identification. Stationary and Nonstationary Time Series. The Sample Autocorrelation and Partial Autocorrelation Functions: The SAC and SPAC. An Introduction to Nonseasonal Modeling and Forecasting. Tentative Identification of Nonseasonal Box-Jenkins Models. Exercises. 10. Estimation, Diagnostic Checking, and Forecasting for Nonseasonal Box-Jenkins Models. Estimation. Diagnostic Checking. Forecasting. A Case Study. Box-Jenkins Implementation of Exponential Smoothing. Exercises. 11. Box-Jenkins Seasonal Modeling. Transforming a Seasonal Time Series into a Stationary Time Series. Three Examples of Seasonal Modeling and Forecasting. Box-Jenkins Error Term Models in Time Series Regression. Exercises. 12. Advanced Box-Jenkins Modeling. The General Seasonal Model and Guidelines for Tentative Identification. Intervention Models. A Procedure for Building a Transfer Function Model. Exercises.

Appendix A: Statistical Tables. Appendix B: Matrix Algebra for Regression Calculations. Matrices and Vectors. The Transpose of a Matrix. Sums and Differences of Matrices. Matrix Multiplication. The Identity Matrix. Linear Dependence and Linear Independence. The Inverse of a Matrix. The Least Squares Point Estimates. The Unexplained Variation and Explained Variation. The Standard Error of the Estimate b. The Distance Value. Using Squared Terms. Using Interaction Terms. Using Dummy Variables. The Standar

About the Author

Bowerman, Bruce L. / O'Connell, Richard T. / Koehler, Anne [Authors]
Date of birth: -

No introduction is available for this author.

A book purchased by many members in the economics/business and humanities/social sciences categories

    Reviews

    0.0 (0 reviews)


    Seller Information

    • For open-market products listed on Interpark Books, the content and all responsibility lie with the seller; Interpark Books assumes no responsibility for the product or its description.

    Company name: Kyobo Book Centre Co., Ltd.
    Representative: Ahn Byung-hyun
    Business registration number: 102-81-11670
    Contact: 1544-1900
    Email: callcenter@kyobobook.co.kr
    Mail-order business report number: 01-0653
    Place of business: 1 Jong-ro, Jongno-gu, Seoul (Jongno 1-ga, Kyobo Building)

    Exchange/Refund

    How to return/exchange

    Request via 'My Page > Cancel/Return/Exchange/Refund', or through the 1:1 inquiry board or customer service (1577-2555)

    Return/exchange period

    Change-of-mind returns are accepted only within 6 business days after shipment is completed
    However, if the product is defective or differs from the contract, within 30 days of discovering the problem

    Return/exchange cost

    Returns/exchanges due to change of mind or purchase error: return shipping borne by the customer
    Exchanges/returns due to a defect in the product or service itself: return shipping borne by the seller

    Cases where return/exchange is not possible

    · The product has been lost or damaged for reasons attributable to the consumer
    (packaging damaged merely to check the product is excluded)

    · The product's value has noticeably decreased due to the consumer's use or opening of the packaging
    e.g., cosmetics, food, home electronics (including accessories)

    · The packaging of a copyable product has been damaged
    e.g., music/DVD/video, software, comic books, magazines, photo collections

    · The product's value has decreased so much over time that resale is difficult

    · The restrictions on withdrawal under the Act on Consumer Protection in Electronic Commerce, Etc. apply

    Out of stock

    Items may be out of stock or delayed depending on supplier (publisher) inventory

    Consumer damage compensation
    Compensation for refund delays

    · Exchanges, after-sales service, refunds, quality assurance, and damage compensation for defective products are handled in accordance with the Consumer Dispute Resolution Standards (Fair Trade Commission notification)

    · Conditions and procedures for refunds, and for compensation due to refund delays, are handled in accordance with the Act on Consumer Protection in Electronic Commerce, Etc.

    KG Inicis purchase-protection service enrollment verification

    To ensure safe transactions for its members, Interpark Commerce Co., Ltd. applies the purchase-protection service provided by KG Inicis Co., Ltd. to all transactions made through Interpark Commerce, regardless of purchase amount or payment method.

    Shipping Information

    • Kyobo Book Centre items ship by courier and arrive within 1-2 days of shipment completion.

    • When items with different availability are ordered together, the order ships based on the item with the longest lead time.

    • Military bases, prisons, and other special institutions can only be served by post-office courier.

    • Shipping fees follow the seller's shipping policy.

    • Books: free shipping on orders of 15,000 won or more; 2,500 won for orders under 15,000 won. Where an item carries its own shipping fee, that item's shipping-fee policy applies.