
Probability and Statistics for Engineers and Scientists, Walpole et al.

Sales index: 22 (Interpark's in-house indicator of sales trends, weighted toward currently best-selling titles; cumulative since September 9, 2013 and updated daily)
  • Publisher: Pearson Education
  • Publication date: February 25, 2014
  • Pages: 0
  • ISBN: 9781292023922
List price

45,000 won (1,350P, 3% point accrual)


Publisher's Review

The text balances theory and applications, adding mathematical support where needed to give engineers and scientists the proper mathematical context for statistical tools and methods.
Mathematical level: this text assumes one semester of differential and integral calculus as a prerequisite.
The use of calculus is confined to elementary probability theory and probability distributions (Chapters 2–7).
Matrix algebra is used modestly in coverage of linear regression material (Chapters 11–12).
Linear algebra and the use of matrices are applied in Chapters 11–15, where treatment of linear regression and analysis of variance is covered.
Compelling exercise sets challenge students to use the concepts to solve problems that occur in many real-life scientific and engineering situations. Many exercises contain real data from studies in fields such as biomedicine, bioengineering, business, and computing.
Real-life applications of the Poisson, binomial, and hypergeometric distributions generate student interest using topics such as flaws in manufactured copper wire, highway potholes, hospital patient traffic, airport luggage screening, and homeland security.
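To give a flavor of the kind of application described above, here is a minimal sketch of a Poisson calculation for flaws in manufactured copper wire; the rate used is a made-up illustrative value, not a figure from the book.

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    """P(X = k) for a Poisson random variable with mean lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Hypothetical rate (not from the book): on average 2 flaws per 100 m reel
# of manufactured copper wire.
lam = 2.0
p_none = poisson_pmf(0, lam)                              # flawless reel
p_at_most_2 = sum(poisson_pmf(k, lam) for k in range(3))  # 0, 1, or 2 flaws

print(f"P(no flaws)        = {p_none:.4f}")
print(f"P(at most 2 flaws) = {p_at_most_2:.4f}")
```

The same pattern applies to the binomial and hypergeometric examples the blurb mentions, with the pmf swapped accordingly.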
Statistical software coverage in the following case studies includes SAS® and MINITAB®, with screenshots and graphics as appropriate:
Two-sample hypothesis testing
Multiple linear regression
Analysis of variance
Use of two-level factorial experiments
Interaction plots provide examples of scientific interpretations and new exercises using graphics.
Topic outline
Chapter 1: Elementary overview of statistical inference
Chapters 2–4: Basic probability; discrete and continuous random variables
Chapters 2–10: Probability distributions and statistical inferences
Chapters 5–6: Specific discrete and continuous distributions, with illustrations of their use and the relationships among them
Chapter 7: Optional chapter covering the transformation of random variables
Chapter 8: Additional material on graphical methods; an important introduction to the notion of a sampling distribution
Chapters 9–10: One- and two-sample point and interval estimation
Chapters 11–15: Linear regression; analysis of variance
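As a small taste of the Chapters 11–15 material, the closed-form least-squares estimators for the simple linear regression model can be sketched in a few lines; the data points below are illustrative values invented for this sketch, not taken from the text.

```python
# Closed-form least-squares estimators for the simple linear regression
# model y = b0 + b1*x. Data are made-up illustrative values.
def fit_line(xs, ys):
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    sxx = sum((x - xbar) ** 2 for x in xs)
    b1 = sxy / sxx            # slope estimate
    b0 = ybar - b1 * xbar     # intercept estimate
    return b0, b1

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [3.1, 4.9, 7.2, 9.0, 10.8]   # roughly y = 1 + 2x plus noise
b0, b1 = fit_line(xs, ys)
print(f"intercept = {b0:.3f}, slope = {b1:.3f}")
```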

Table of Contents

Preface
1. Introduction to Statistics and Data Analysis
1.1 Overview: Statistical Inference, Samples, Populations, and the Role of Probability
1.2 Sampling Procedures; Collection of Data
1.3 Measures of Location: The Sample Mean and Median
1.4 Measures of Variability
1.5 Discrete and Continuous Data
1.6 Statistical Modeling, Scientific Inspection, and Graphical Methods
1.7 General Types of Statistical Studies: Designed Experiment, Observational Study, and Retrospective Study

2. Probability
2.1 Sample Space
2.2 Events
2.3 Counting Sample Points
2.4 Probability of an Event
2.5 Additive Rules
2.6 Conditional Probability, Independence and Product Rules
2.7 Bayes' Rule
2.8 Potential Misconceptions and Hazards; Relationship to Material in Other Chapters

3. Random Variables and Probability Distributions
3.1 Concept of a Random Variable
3.2 Discrete Probability Distributions
3.3 Continuous Probability Distributions
3.4 Joint Probability Distributions
3.5 Potential Misconceptions and Hazards; Relationship to Material in Other Chapters

4. Mathematical Expectation
4.1 Mean of a Random Variable
4.2 Variance and Covariance of Random Variables
4.3 Means and Variances of Linear Combinations of Random Variables
4.4 Chebyshev's Theorem
4.5 Potential Misconceptions and Hazards; Relationship to Material in Other Chapters

5. Some Discrete Probability Distributions
5.1 Introduction and Motivation
5.2 Binomial and Multinomial Distributions
5.3 Hypergeometric Distribution
5.4 Negative Binomial and Geometric Distributions
5.5 Poisson Distribution and the Poisson Process
5.6 Potential Misconceptions and Hazards; Relationship to Material in Other Chapters

6. Some Continuous Probability Distributions
6.1 Continuous Uniform Distribution
6.2 Normal Distribution
6.3 Areas under the Normal Curve
6.4 Applications of the Normal Distribution
6.5 Normal Approximation to the Binomial
6.6 Gamma and Exponential Distributions
6.7 Chi-Squared Distribution
6.8 Beta Distribution
6.9 Lognormal Distribution (Optional)
6.10 Weibull Distribution (Optional)
6.11 Potential Misconceptions and Hazards; Relationship to Material in Other Chapters

7. Functions of Random Variables (Optional)
7.1 Introduction
7.2 Transformations of Variables
7.3 Moments and Moment-Generating Functions

8. Sampling Distributions and More Graphical Tools
8.1 Random Sampling and Sampling Distributions
8.2 Some Important Statistics
8.3 Sampling Distributions
8.4 Sampling Distribution of Means and the Central Limit Theorem
8.5 Sampling Distribution of S²
8.6 t-Distribution
8.7 F-Distribution
8.8 Quantile and Probability Plots
8.9 Potential Misconceptions and Hazards; Relationship to Material in Other Chapters

9. One- and Two-Sample Estimation Problems
9.1 Introduction
9.2 Statistical Inference
9.3 Classical Methods of Estimation
9.4 Single Sample: Estimating the Mean
9.5 Standard Error of a Point Estimate
9.6 Prediction Intervals
9.7 Tolerance Limits
9.8 Two Samples: Estimating the Difference Between Two Means
9.9 Paired Observations
9.10 Single Sample: Estimating a Proportion
9.11 Two Samples: Estimating the Difference between Two Proportions
9.12 Single Sample: Estimating the Variance
9.13 Two Samples: Estimating the Ratio of Two Variances
9.14 Maximum Likelihood Estimation (Optional)
9.15 Potential Misconceptions and Hazards; Relationship to Material in Other Chapters

10. One- and Two-Sample Tests of Hypotheses
10.1 Statistical Hypotheses: General Concepts
10.2 Testing a Statistical Hypothesis
10.3 The Use of P-Values for Decision Making in Testing Hypotheses
10.4 Single Sample: Tests Concerning a Single Mean
10.5 Two Samples: Tests on Two Means
10.6 Choice of Sample Size for Testing Means
10.7 Graphical Methods for Comparing Means
10.8 One Sample: Test on a Single Proportion
10.9 Two Samples: Tests on Two Proportions
10.10 One- and Two-Sample Tests Concerning Variances
10.11 Goodness-of-Fit Test
10.12 Test for Independence (Categorical Data)
10.13 Test for Homogeneity
10.14 Two-Sample Case Study
10.15 Potential Misconceptions and Hazards; Relationship to Material in Other Chapters

11. Simple Linear Regression and Correlation
11.1 Introduction to Linear Regression
11.2 The Simple Linear Regression Model
11.3 Least Squares and the Fitted Model
11.4 Properties of the Least Squares Estimators
11.5 Inferences Concerning the Regression Coefficients
11.6 Prediction
11.7 Choice of a Regression Model
11.8 Analysis-of-Variance Approach
11.9 Test for Linearity of Regression: Data with Repeated Observations
11.10 Data Plots and Transformations
11.11 Simple Linear Regression Case Study
11.12 Correlation
11.13 Potential Misconceptions and Hazards; Relationship to Material in Other Chapters

12. Multiple Linear Regression and Certain Nonlinear Regression Models
12.1 Introduction
12.2 Estimating the Coefficients
12.3 Linear Regression Model Using Matrices
12.4 Properties of the Least Squares Estimators
12.5 Inferences in Multiple Linear Regression
12.6 Choice of a Fitted Model through Hypothesis Testing
12.7 Special Case of Orthogonality (Optional)
12.8 Categorical or Indicator Variables
12.9 Sequential Methods for Model Selection
12.10 Study of Residuals and Violation of Assumptions
12.11 Cross Validation, Cp, and Other Criteria for Model Selection
12.12 Special Nonlinear Models for Nonideal Conditions
12.13 Potential Misconceptions and Hazards; Relationship to Material in Other Chapters

13. One-Factor Experiments: General
13.1 Analysis-of-Variance Technique
13.2 The Strategy of Experimental Design
13.3 One-Way Analysis of Variance: Completely Randomized Design (One-Way ANOVA)
13.4 Tests for the Equality of Several Variances
13.5 Multiple Comparisons
13.6 Comparing a Set of Treatments in Blocks
13.7 Randomized Complete Block Designs
13.8 Graphical Methods and Model Checking
13.9 Data Transformations in Analysis of Variance
13.10 Random Effects Models
13.11 Case Study
13.12 Potential Misconceptions and Hazards; Relationship to Material in Other Chapters

14. Factorial Experiments (Two or More Factors)
14.1 Introduction
14.2 Interaction in the Two-Factor Experiment
14.3 Two-Factor Analysis of Variance
14.4 Three-Factor Experiments
14.5 Factorial Experiments for Random Effects and Mixed Models
14.6 Potential Misconceptions and Hazards; Relationship to Material in Other Chapters

15. 2^k Factorial Experiments and Fractions
15.1 Introduction
15.2 The 2^k Factorial: Calculation of Effects and Analysis of Variance
15.3 Nonreplicated 2^k Factorial Experiment
15.4 Factorial Experiments in a Regression Setting
15.5 The Orthogonal Design
15.6 Fractional Factorial Experiments
15.7 Analysis of Fractional Factorial Experiments
15.8 Higher Fractions and Screening Designs
15.9 Construction of Resolution III and IV Designs
15.10 Other Two-Level Resolution III Designs; The Plackett-Burman Designs
15.11 Introduction to Response Surface Methodology
15.12 Robust Parameter Design
15.13 Potential Misconceptions and Hazards; Relationship to Material in Other Chapters

16. Nonparametric Statistics
16.1 Nonparametric Tests
16.2 Signed-Rank Test
16.3 Wilcoxon Rank-Sum Test
16.4 Kruskal-Wallis Test
16.5 Runs Test
16.6 Tolerance Limits
16.7 Rank Correlation Coefficient

17. Statistical Quality Control
17.1 Introduction
17.2 Nature of the Control Limits
17.3 Purposes of the Control Chart
17.4 Control Charts for Variables
17.5 Control Charts for Attributes
17.6 Cusum Control Charts


18. Bayesian Statistics
18.1 Bayesian Concepts
18.2 Bayesian Inferences
18.3 Bayes Estimates Using Decision Theory Framework

Bibliography
Index

