Easy payment / credit card billing discounts
Interpark Lotte Card: 5% off (91,490 won)
(maximum discount 100,000 won / requires 400,000 won in spending the previous month)
Bookpinion Lotte Card: 30% off (67,410 won)
(maximum discount 30,000 won / on payments of 30,000 won or more)
NH Shopping & Interpark Card: 20% off (77,040 won)
(maximum discount 40,000 won / on payments of 20,000 won or more)

Bayesian Data Analysis, 2/e [Hardcover]

Eligible for income tax deduction

Figures are cumulative since September 9, 2013.

List price

107,000 won

  • 96,300 won (10% off)

    2,890P (3% reward points; the arithmetic is sketched below)

Discount benefits
Reward benefits
  • S-Points are credited only when you confirm the purchase yourself on My Page.
Additional benefits
Shipping information
  • Scheduled to ship by Friday, May 24 (512 Samseong-ro, Gangnam-gu, Seoul)
  • Free shipping
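As a quick check, the minimal sketch below recomputes the listed sale price and point reward from the 107,000 won list price. Rounding the point total to the nearest 10P is an assumption made for illustration, not a documented store rule.

    # Minimal sketch: how the listed sale price and reward points relate
    # to the 107,000 won list price. Rounding points to the nearest 10P
    # is an assumption for illustration, not a documented policy.

    LIST_PRICE = 107_000   # won
    DISCOUNT_RATE = 0.10   # 10% off
    POINT_RATE = 0.03      # 3% of the sale price

    sale_price = round(LIST_PRICE * (1 - DISCOUNT_RATE))   # 96,300 won
    points = round(sale_price * POINT_RATE / 10) * 10      # 2,890P

    print(f"Sale price: {sale_price:,} won")   # Sale price: 96,300 won
    print(f"Points:     {points:,}P")          # Points:     2,890P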

Table of Contents

List of models (p. xv)
List of examples (p. xvii)
Preface (p. xix)
Fundamentals of Bayesian Inference (p. 1)
Background (p. 3)
Overview (p. 3)
General notation for statistical inference (p. 4)
Bayesian inference (p. 6)
Example: inference about a genetic probability (p. 9)
Probability as a measure of uncertainty (p. 11)
Example of probability assignment: football point spreads (p. 14)
Example of probability assignment: estimating the accuracy of record linkage (p. 17)
Some useful results from probability theory (p. 22)
Summarizing inferences by simulation (p. 25)
Computation and software (p. 27)
Bibliographic note (p. 27)
Exercises (p. 29)
Single-parameter models (p. 33)
Estimating a probability from binomial data (p. 33)
Posterior distribution as compromise between data and prior information (p. 36)
Summarizing posterior inference (p. 37)
Informative prior distributions (p. 39)
Example: estimating the probability of a female birth given placenta previa (p. 43)
Estimating the mean of a normal distribution with known variance (p. 46)
Other standard single-parameter models (p. 49)
Example: informative prior distribution and multilevel structure for estimating cancer rates (p. 55)
Noninformative prior distributions (p. 61)
Bibliographic note (p. 65)
Exercises (p. 67)
Introduction to multiparameter models (p. 73)
Averaging over 'nuisance parameters' (p. 73)
Normal data with a noninformative prior distribution (p. 74)
Normal data with a conjugate prior distribution (p. 78)
Normal data with a semi-conjugate prior distribution (p. 80)
The multinomial model (p. 83)
The multivariate normal model (p. 85)
Example: analysis of a bioassay experiment (p. 88)
Summary of elementary modeling and computation (p. 93)
Bibliographic note (p. 94)
Exercises (p. 95)
Large-sample inference and frequency properties of Bayesian inference (p. 101)
Normal approximations to the posterior distribution (p. 101)
Large-sample theory (p. 106)
Counterexamples to the theorems (p. 108)
Frequency evaluations of Bayesian inferences (p. 111)
Bibliographic note (p. 113)
Exercises (p. 113)
Fundamentals of Bayesian Data Analysis (p. 115)
Hierarchical models (p. 117)
Constructing a parameterized prior distribution (p. 118)
Exchangeability and setting up hierarchical models (p. 121)
Computation with hierarchical models (p. 125)
Estimating an exchangeable set of parameters from a normal model (p. 131)
Example: combining information from educational testing experiments in eight schools (p. 138)
Hierarchical modeling applied to a meta-analysis (p. 145)
Bibliographic note (p. 150)
Exercises (p. 152)
Model checking and improvement (p. 157)
The place of model checking in applied Bayesian statistics (p. 157)
Do the inferences from the model make sense? (p. 158)
Is the model consistent with data? Posterior predictive checking (p. 159)
Graphical posterior predictive checks (p. 165)
Numerical posterior predictive checks (p. 172)
Model expansion (p. 177)
Model comparison (p. 179)
Model checking for the educational testing example (p. 186)
Bibliographic note (p. 190)
Exercises (p. 192)
Modeling accounting for data collection (p. 197)
Introduction (p. 197)
Formal models for data collection (p. 200)
Ignorability (p. 203)
Sample surveys (p. 207)
Designed experiments (p. 218)
Sensitivity and the role of randomization (p. 223)
Observational studies (p. 226)
Censoring and truncation (p. 231)
Discussion (p. 236)
Bibliographic note (p. 237)
Exercises (p. 239)
Connections and challenges (p. 247)
Bayesian interpretations of other statistical methods (p. 247)
Challenges in Bayesian data analysis (p. 252)
Bibliographic note (p. 255)
Exercises (p. 255)
General advice (p. 259)
Setting up probability models (p. 259)
Posterior inference (p. 264)
Model evaluation (p. 265)
Summary (p. 271)
Bibliographic note (p. 271)
Advanced Computation (p. 273)
Overview of computation (p. 275)
Crude estimation by ignoring some information (p. 276)
Use of posterior simulations in Bayesian data analysis (p. 276)
Practical issues (p. 278)
Exercises (p. 282)
Posterior simulation (p. 283)
Direct simulation (p. 283)
Markov chain simulation (p. 285)
The Gibbs sampler (p. 287)
The Metropolis and Metropolis-Hastings algorithms (p. 289)
Building Markov chain algorithms using the Gibbs sampler and Metropolis algorithm (p. 292)
Inference and assessing convergence (p. 294)
Example: the hierarchical normal model (p. 299)
Efficient Gibbs samplers (p. 302)
Efficient Metropolis jumping rules (p. 305)
Recommended strategy for posterior simulation (p. 307)
Bibliographic note (p. 308)
Exercises (p. 310)
Approximations based on posterior modes (p. 311)
Finding posterior modes (p. 312)
The normal and related mixture approximations (p. 314)
Finding marginal posterior modes using EM and related algorithms (p. 317)
Approximating conditional and marginal posterior densities (p. 324)
Example: the hierarchical normal model (continued) (p. 325)
Bibliographic note (p. 331)
Exercises (p. 332)
Special topics in computation (p. 335)
Advanced techniques for Markov chain simulation (p. 335)
Numerical integration (p. 340)
Importance sampling (p. 342)
Computing normalizing factors (p. 345)
Bibliographic note (p. 348)
Exercises (p. 349)
Regression Models (p. 351)
Introduction to regression models (p. 353)
Introduction and notation (p. 353)
Bayesian analysis of the classical regression model (p. 355)
Example: estimating the advantage of incumbency in U.S. Congressional elections (p. 359)
Goals of regression analysis (p. 367)
Assembling the matrix of explanatory variables (p. 369)
Unequal variances and correlations (p. 372)
Models for unequal variances (p. 375)
Including prior information (p. 382)
Bibliographic note (p. 385)
Exercises (p. 385)
Hierarchical linear models (p. 389)
Regression coefficients exchangeable in batches (p. 390)
Example: forecasting U.S. Presidential elections (p. 392)
General notation for hierarchical linear models (p. 399)
Computation (p. 400)
Hierarchical modeling as an alternative to selecting predictors (p. 405)
Analysis of variance (p. 406)
Bibliographic note (p. 411)
Exercises (p. 412)
Generalized linear models (p. 415)
Introduction (p. 415)
Standard generalized linear model likelihoods (p. 416)
Setting up and interpreting generalized linear models (p. 418)
Computation (p. 421)
Example: hierarchical Poisson regression for police stops (p. 425)
Example: hierarchical logistic regression for political opinions (p. 428)
Models for multinomial responses (p. 430)
Loglinear models for multivariate discrete data (p. 433)
Bibliographic note (p. 439)
Exercises (p. 440)
Models for robust inference (p. 443)
Introduction (p. 443)
Overdispersed versions of standard probability models (p. 445)
Posterior inference and computation (p. 448)
Robust inference and sensitivity analysis for the educational testing example (p. 451)
Robust regression using Student-t errors (p. 455)
Bibliographic note (p. 457)
Exercises (p. 458)
Mixture models (p. 461)
Introduction (p. 461)
Setting up mixture models (p. 461)
Computation (p. 465)
Example: reaction times and schizophrenia (p. 466)
Bibliographic note (p. 477)
Multivariate models (p. 479)
Linear regression with multiple outcomes (p. 479)
Prior distributions for covariance matrices (p. 481)
Hierarchical multivariate models (p. 484)
Multivariate models for nonnormal data (p. 486)
Time series and spatial models (p. 489)
Bibliographic note (p. 491)
Exercises (p. 492)
Nonlinear models (p. 495)
Introduction (p. 495)
Example: serial dilution assay (p. 496)
Example: population toxicokinetics (p. 502)
Bibliographic note (p. 512)
Exercises (p. 513)
Models for missing data (p. 515)
Notation (p. 515)
Multiple imputation (p. 517)
Missing data in the multivariate normal and t models (p. 521)
Example: multiple imputation for a series of polls (p. 524)
Missing values with counted data (p. 531)
Example: an opinion poll in Slovenia (p. 532)
Bibliographic note (p. 537)
Exercises (p. 538)
Decision analysis (p. 539)
Bayesian decision theory in different contexts (p. 540)
Using regression predictions: incentives for telephone surveys (p. 542)
Multistage decision making: medical screening (p. 550)
Decision analysis using a hierarchical model: home radon measurement and remediation (p. 553)
Personal vs. institutional decision analysis (p. 565)
Bibliographic note (p. 566)
Exercises (p. 567)
Appendixes (p. 569)
Standard probability distributions (p. 571)
Introduction (p. 571)
Continuous distributions (p. 571)
Discrete distributions (p. 580)
Bibliographic note (p. 582)
Outline of proofs of asymptotic theorems (p. 583)
Bibliographic note (p. 587)
Example of computation in R and Bugs (p. 589)
Getting started with R and Bugs (p. 589)
Fitting a hierarchical model in Bugs (p. 590)
Options in the Bugs implementation (p. 594)
Fitting a hierarchical model in R (p. 598)
Further comments on computation (p. 605)
Bibliographic note (p. 606)
References (p. 609)
Author index (p. 645)
Subject index (p. 653)
Table of Contents provided by Rittenhouse. All Rights Reserved.

About the Authors

Gelman, Andrew (Ed.) / Carlin, John B. (Ed.) / Stern [Author]

No introduction is available for these authors.

A title purchased by many members in the University Textbooks / Professional Books category.

    Seller information

    • Open-market products listed on Interpark Books are the sole responsibility of the seller; Interpark Books assumes no responsibility for the products or their descriptions.

    Company name: Kyobo Book Centre Co., Ltd.

    Representative: Ahn Byeong-hyun

    Business registration number: 102-81-11670

    Contact: 1544-1900

    Email: callcenter@kyobobook.co.kr

    Mail-order business registration number: 01-0653

    Business address: 1 Jong-ro, Jongno-gu, Seoul (Jongno 1-ga, Kyobo Building)

    Exchange / Refund

    How to request a return or exchange

    Request via 'My Page > Cancel/Return/Exchange/Refund', through the 1:1 inquiry board, or by calling the customer center (1577-2555).

    Return/exchange period

    Change-of-mind returns: within 6 business days of shipment.
    If the product is defective or differs from the contract: within 30 days of discovering the problem.

    Return/exchange costs

    Change of mind or ordering error: return shipping is paid by the customer.
    Defect in the product or service itself: return shipping is paid by the seller.

    Cases where return/exchange is not possible

    · The product has been lost or damaged through the consumer's fault
    (packaging opened merely to inspect the product is excluded)

    · The product's value has been significantly reduced by the consumer's use or by opening the packaging
    e.g., cosmetics, food, home appliances (including accessories)

    · The packaging of a reproducible product has been opened
    e.g., music CDs/DVDs/videos, software, comic books, magazines, photo books

    · The product's value has declined over time to the point that resale is impractical

    · The case falls under the restrictions on withdrawal of purchase set out in the Act on Consumer Protection in Electronic Commerce

    Out-of-stock items

    Items may be out of stock or delayed depending on supplier (publisher) inventory.

    Consumer damage compensation
    Compensation for delayed refunds

    · Exchanges, after-sales service, refunds, quality guarantees, and damage compensation for defective products are handled in accordance with the Consumer Dispute Resolution Standards (Fair Trade Commission notice).

    · Refunds, and compensation for delayed refunds, are handled in accordance with the Act on Consumer Protection in Electronic Commerce.

    Confirmation of enrollment in the KG Inicis purchase protection service

    To ensure safe trading for members, Interpark Commerce Co., Ltd. applies the purchase protection service provided by KG Inicis Co., Ltd. to every transaction made through Interpark Commerce, regardless of purchase amount or payment method.

    Shipping guide

    • Kyobo Book Centre items are delivered by courier and typically arrive within 1-2 days of shipment.

    • When items with different dispatch times are ordered together, delivery follows the item with the longest dispatch time.

    • Military bases, prisons, and other special institutions can be served only by Korea Post parcel delivery.

    • Shipping fees follow the seller's shipping-fee policy (sketched below):
      - Books: free shipping on orders of 15,000 won or more; 2,500 won otherwise.
      - If an item has an item-specific shipping fee, that item's policy applies.
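    The minimal sketch below restates the book shipping-fee rule above. The function and parameter names are illustrative only, and it assumes an item-specific fee simply overrides the default rule.

    # Minimal sketch of the stated book shipping-fee rule: free at 15,000 won
    # or more, otherwise 2,500 won; an item-specific fee, if present, takes
    # precedence. Names are illustrative, not part of any real store API.

    def book_shipping_fee(order_total_won: int, item_specific_fee: int | None = None) -> int:
        if item_specific_fee is not None:   # product-specific policy overrides
            return item_specific_fee
        return 0 if order_total_won >= 15_000 else 2_500

    print(book_shipping_fee(96_300))   # 0 -> free shipping
    print(book_shipping_fee(12_000))   # 2500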