Price
List price: 107,000 KRW
Sale price: 96,300 KRW (10% off)
Reward points: 2,890P (3%)
Table of Contents
List of models | p. xv
List of examples | p. xvii
Preface | p. xix
Fundamentals of Bayesian Inference | p. 1
Background | p. 3
Overview | p. 3
General notation for statistical inference | p. 4
Bayesian inference | p. 6
Example: inference about a genetic probability | p. 9
Probability as a measure of uncertainty | p. 11
Example of probability assignment: football point spreads | p. 14
Example of probability assignment: estimating the accuracy of record linkage | p. 17
Some useful results from probability theory | p. 22
Summarizing inferences by simulation | p. 25
Computation and software | p. 27
Bibliographic note | p. 27
Exercises | p. 29
Single-parameter models | p. 33
Estimating a probability from binomial data | p. 33
Posterior distribution as compromise between data and prior information | p. 36
Summarizing posterior inference | p. 37
Informative prior distributions | p. 39
Example: estimating the probability of a female birth given placenta previa | p. 43
Estimating the mean of a normal distribution with known variance | p. 46
Other standard single-parameter models | p. 49
Example: informative prior distribution and multilevel structure for estimating cancer rates | p. 55
Noninformative prior distributions | p. 61
Bibliographic note | p. 65
Exercises | p. 67
Introduction to multiparameter models | p. 73
Averaging over 'nuisance parameters' | p. 73
Normal data with a noninformative prior distribution | p. 74
Normal data with a conjugate prior distribution | p. 78
Normal data with a semi-conjugate prior distribution | p. 80
The multinomial model | p. 83
The multivariate normal model | p. 85
Example: analysis of a bioassay experiment | p. 88
Summary of elementary modeling and computation | p. 93
Bibliographic note | p. 94
Exercises | p. 95
Large-sample inference and frequency properties of Bayesian inference | p. 101
Normal approximations to the posterior distribution | p. 101
Large-sample theory | p. 106
Counterexamples to the theorems | p. 108
Frequency evaluations of Bayesian inferences | p. 111
Bibliographic note | p. 113
Exercises | p. 113
Fundamentals of Bayesian Data Analysis | p. 115
Hierarchical models | p. 117
Constructing a parameterized prior distribution | p. 118
Exchangeability and setting up hierarchical models | p. 121
Computation with hierarchical models | p. 125
Estimating an exchangeable set of parameters from a normal model | p. 131
Example: combining information from educational testing experiments in eight schools | p. 138
Hierarchical modeling applied to a meta-analysis | p. 145
Bibliographic note | p. 150
Exercises | p. 152
Model checking and improvement | p. 157
The place of model checking in applied Bayesian statistics | p. 157
Do the inferences from the model make sense? | p. 158
Is the model consistent with data? Posterior predictive checking | p. 159
Graphical posterior predictive checks | p. 165
Numerical posterior predictive checks | p. 172
Model expansion | p. 177
Model comparison | p. 179
Model checking for the educational testing example | p. 186
Bibliographic note | p. 190
Exercises | p. 192
Modeling accounting for data collection | p. 197
Introduction | p. 197
Formal models for data collection | p. 200
Ignorability | p. 203
Sample surveys | p. 207
Designed experiments | p. 218
Sensitivity and the role of randomization | p. 223
Observational studies | p. 226
Censoring and truncation | p. 231
Discussion | p. 236
Bibliographic note | p. 237
Exercises | p. 239
Connections and challenges | p. 247
Bayesian interpretations of other statistical methods | p. 247
Challenges in Bayesian data analysis | p. 252
Bibliographic note | p. 255
Exercises | p. 255
General advice | p. 259
Setting up probability models | p. 259
Posterior inference | p. 264
Model evaluation | p. 265
Summary | p. 271
Bibliographic note | p. 271
Advanced Computation | p. 273
Overview of computation | p. 275
Crude estimation by ignoring some information | p. 276
Use of posterior simulations in Bayesian data analysis | p. 276
Practical issues | p. 278
Exercises | p. 282
Posterior simulation | p. 283
Direct simulation | p. 283
Markov chain simulation | p. 285
The Gibbs sampler | p. 287
The Metropolis and Metropolis-Hastings algorithms | p. 289
Building Markov chain algorithms using the Gibbs sampler and Metropolis algorithm | p. 292
Inference and assessing convergence | p. 294
Example: the hierarchical normal model | p. 299
Efficient Gibbs samplers | p. 302
Efficient Metropolis jumping rules | p. 305
Recommended strategy for posterior simulation | p. 307
Bibliographic note | p. 308
Exercises | p. 310
Approximations based on posterior modes | p. 311
Finding posterior modes | p. 312
The normal and related mixture approximations | p. 314
Finding marginal posterior modes using EM and related algorithms | p. 317
Approximating conditional and marginal posterior densities | p. 324
Example: the hierarchical normal model (continued) | p. 325
Bibliographic note | p. 331
Exercises | p. 332
Special topics in computation | p. 335
Advanced techniques for Markov chain simulation | p. 335
Numerical integration | p. 340
Importance sampling | p. 342
Computing normalizing factors | p. 345
Bibliographic note | p. 348
Exercises | p. 349
Regression Models | p. 351
Introduction to regression models | p. 353
Introduction and notation | p. 353
Bayesian analysis of the classical regression model | p. 355
Example: estimating the advantage of incumbency in U.S. Congressional elections | p. 359
Goals of regression analysis | p. 367
Assembling the matrix of explanatory variables | p. 369
Unequal variances and correlations | p. 372
Models for unequal variances | p. 375
Including prior information | p. 382
Bibliographic note | p. 385
Exercises | p. 385
Hierarchical linear models | p. 389
Regression coefficients exchangeable in batches | p. 390
Example: forecasting U.S. Presidential elections | p. 392
General notation for hierarchical linear models | p. 399
Computation | p. 400
Hierarchical modeling as an alternative to selecting predictors | p. 405
Analysis of variance | p. 406
Bibliographic note | p. 411
Exercises | p. 412
Generalized linear models | p. 415
Introduction | p. 415
Standard generalized linear model likelihoods | p. 416
Setting up and interpreting generalized linear models | p. 418
Computation | p. 421
Example: hierarchical Poisson regression for police stops | p. 425
Example: hierarchical logistic regression for political opinions | p. 428
Models for multinomial responses | p. 430
Loglinear models for multivariate discrete data | p. 433
Bibliographic note | p. 439
Exercises | p. 440
Models for robust inference | p. 443
Introduction | p. 443
Overdispersed versions of standard probability models | p. 445
Posterior inference and computation | p. 448
Robust inference and sensitivity analysis for the educational testing example | p. 451
Robust regression using Student-t errors | p. 455
Bibliographic note | p. 457
Exercises | p. 458
Mixture models | p. 461
Introduction | p. 461
Setting up mixture models | p. 461
Computation | p. 465
Example: reaction times and schizophrenia | p. 466
Bibliographic note | p. 477
Multivariate models | p. 479
Linear regression with multiple outcomes | p. 479
Prior distributions for covariance matrices | p. 481
Hierarchical multivariate models | p. 484
Multivariate models for nonnormal data | p. 486
Time series and spatial models | p. 489
Bibliographic note | p. 491
Exercises | p. 492
Nonlinear models | p. 495
Introduction | p. 495
Example: serial dilution assay | p. 496
Example: population toxicokinetics | p. 502
Bibliographic note | p. 512
Exercises | p. 513
Models for missing data | p. 515
Notation | p. 515
Multiple imputation | p. 517
Missing data in the multivariate normal and t models | p. 521
Example: multiple imputation for a series of polls | p. 524
Missing values with counted data | p. 531
Example: an opinion poll in Slovenia | p. 532
Bibliographic note | p. 537
Exercises | p. 538
Decision analysis | p. 539
Bayesian decision theory in different contexts | p. 540
Using regression predictions: incentives for telephone surveys | p. 542
Multistage decision making: medical screening | p. 550
Decision analysis using a hierarchical model: home radon measurement and remediation | p. 553
Personal vs. institutional decision analysis | p. 565
Bibliographic note | p. 566
Exercises | p. 567
Appendixes | p. 569
Standard probability distributions | p. 571
Introduction | p. 571
Continuous distributions | p. 571
Discrete distributions | p. 580
Bibliographic note | p. 582
Outline of proofs of asymptotic theorems | p. 583
Bibliographic note | p. 587
Example of computation in R and Bugs | p. 589
Getting started with R and Bugs | p. 589
Fitting a hierarchical model in Bugs | p. 590
Options in the Bugs implementation | p. 594
Fitting a hierarchical model in R | p. 598
Further comments on computation | p. 605
Bibliographic note | p. 606
References | p. 609
Author index | p. 645
Subject index | p. 653
Table of Contents provided by Rittenhouse. All Rights Reserved.
About the Author
Date of birth: -
No introduction is available for this author.
Seller Information
Company name: Kyobo Book Centre Co., Ltd.
Representative: Ahn Byung-hyun
Business registration number: 102-81-11670
Contact: 1544-1900
Email: callcenter@kyobobook.co.kr
Mail-order business registration number: 01-0653
Business address: 1 Jongno, Jongno-gu, Seoul (Jongno 1-ga, Kyobo Building)
Exchanges and Refunds
Return/exchange method: Apply under 'My Page > Cancel/Return/Exchange/Refund', through the 1:1 inquiry board, or via the customer center (1577-2555).
Return/exchange period: Change-of-mind returns are accepted only within 6 business days after shipment is completed.
Return/exchange cost: For returns or exchanges due to change of mind or customer ordering error, return shipping is borne by the customer.
Returns/exchanges are not accepted when:
- The product is lost or damaged due to a cause attributable to the consumer
- The product's value has significantly decreased through the consumer's use or opening of the packaging
- The packaging of a reproducible product has been damaged
- The product's value has decreased over time to the point that resale is impractical
- The case falls under the restrictions on withdrawal of purchase set out in the Act on Consumer Protection in Electronic Commerce
Out-of-stock items: Orders may be out of stock or delayed depending on the supplier's (publisher's) inventory.
Consumer compensation: Exchanges, after-sales service, refunds, quality assurance, and compensation for defective products are handled in accordance with the Consumer Dispute Resolution Standards (Fair Trade Commission notice). Refund payments and compensation for delayed refunds are handled in accordance with the Act on Consumer Protection in Electronic Commerce.
For safe transactions, Interpark Commerce Co., Ltd. applies the purchase protection service provided by KG Inicis Co., Ltd. to all transactions made through Interpark Commerce, regardless of purchase amount or payment method.
Shipping Information
Kyobo Book products are shipped by courier and arrive within 1-2 days of shipment.
When products with different shipping lead times are ordered together, the order ships according to the product with the longest lead time.
Military bases, correctional facilities, and other restricted institutions can only be served by postal parcel delivery.
Shipping fees follow the seller's shipping fee policy.