Foreign Books > College Textbooks / Professional Books > Natural Sciences / Pure Sciences
List price: 99,000 KRW
Sale price: 99,000 KRW
Reward points: 2,970P (3%)
Table of Contents
Introduction
Overview of supervised learning
Linear methods for regression
Linear methods for classification
Basis expansions and regularization
Kernel smoothing methods
Model assessment and selection
Model inference and averaging
Additive models, trees, and related methods
Boosting and additive trees
Neural networks
Support vector machines and flexible discriminants
Prototype methods and nearest-neighbors
Unsupervised learning
Table of Contents provided by Publisher. All Rights Reserved.
About the Book
During the past decade there has been an explosion in computation and information technology. With it have come vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It is a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting---the first comprehensive treatment of this topic in any book.
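As a hands-on illustration of one of the topics highlighted above, the short sketch below fits a boosted ensemble of shallow classification trees on synthetic data. It is not taken from the book; the use of scikit-learn, the synthetic dataset, and the hyperparameters are all illustrative assumptions.

```python
# Minimal illustration of boosted classification trees (one of the book's topics).
# Assumes scikit-learn is installed; data and hyperparameters are arbitrary choices.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# 200 shallow trees combined additively, each one fit to the gradient of the loss
# on the current ensemble's predictions.
model = GradientBoostingClassifier(n_estimators=200, max_depth=2, learning_rate=0.1)
model.fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))
```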
This major new edition features many topics not covered in the original, including graphical models, random forests, ensemble methods, least angle regression & path algorithms for the lasso, non-negative matrix factorization, and spectral clustering. There is also a chapter on methods for "wide" data (p bigger than n), including multiple testing and false discovery rates.
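As one concrete example of the "least angle regression & path algorithms for the lasso" mentioned above, the following sketch traces a full lasso coefficient path with scikit-learn's LARS routine. It is an illustration only, not code from the book; the choice of the diabetes dataset and the plotting details are assumptions.

```python
# Trace the lasso coefficient path with least angle regression (LARS).
# Illustrative only; the dataset and plot styling are assumed, not from the book.
import matplotlib.pyplot as plt
from sklearn.datasets import load_diabetes
from sklearn.linear_model import lars_path

X, y = load_diabetes(return_X_y=True)

# method="lasso" returns the lasso solution path: alphas shrink toward zero
# as more coefficients enter the active set.
alphas, active, coefs = lars_path(X, y, method="lasso")

for coef in coefs:                      # one curve per feature
    plt.plot(alphas, coef)
plt.xlabel("alpha (regularization strength)")
plt.ylabel("coefficient value")
plt.title("Lasso path via LARS")
plt.gca().invert_xaxis()                # path runs from strong to weak penalty
plt.show()
```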
About the Authors
Date of birth: -
No introduction is available for the authors.
Other Books by the Authors
An Introduction to Statistical Learning