Kappa Measurement of Inter-observer Agreement: Cohen's Kappa

g1 = a + b
g2 = c + d
f1 = a + c
f2 = b + d
N = f1 + f2
P_Observed = (a + d) / N
P_Expected = ((f1 * g1) + (f2 * g2)) / sq(N)
Kappa = ( P_Observed - P_Expected ) / ( 1 - P_Expected )
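
The formulas above can be sketched in Python as follows (a minimal illustration, not part of the MedCalc 3000 calculator; the cell counts a, b, c, d follow the 2x2 table under Input Values below):

def cohens_kappa(a, b, c, d):
    # a = both observers answer yes, d = both answer no,
    # b and c = the two possible disagreements (see the 2x2 table below).
    g1 = a + b                                 # row totals
    g2 = c + d
    f1 = a + c                                 # column totals
    f2 = b + d
    n = f1 + f2                                # total number of rated cases
    p_observed = (a + d) / n
    p_expected = (f1 * g1 + f2 * g2) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

print(round(cohens_kappa(20, 5, 10, 15), 2))   # hypothetical counts; prints 0.4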

ÀԷ°ª:


°üÂûÀÚ 1

¿¹ ¾Æ´Ï¿À
°üÂûÀÚ 2    ¿¹  a:   b: 
¾Æ´Ï¿À  c:   d: 

Result Values:

g1
g2
f1
f2
N
P_Observed
P_Expected
Kappa
 


Result Ranger® for Strength of Agreement: Cohen's Kappa

The kappa measurement of inter-observer agreement corrects, or compensates, for the degree of agreement that could occur by chance. Cohen's kappa is probably the most commonly used calculation, but the paper by Byrt et al. cited below reviews a variety of alternative calculations, including the methods of Scott and of Bennett. Cohen's kappa can also be obtained from the equivalent formula: 2 * ((a * d) - (b * c)) / ((g1 * f2) + (g2 * f1))
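
As a quick numerical check of this equivalence, with hypothetical counts a = 20, b = 5, c = 10, d = 15 (chosen only for illustration), both forms give kappa = 0.40:

a, b, c, d = 20, 5, 10, 15                     # hypothetical 2x2 cell counts
g1, g2, f1, f2 = a + b, c + d, a + c, b + d    # marginal totals
n = f1 + f2

p_obs = (a + d) / n                            # 0.70
p_exp = (f1 * g1 + f2 * g2) / (n * n)          # 0.50
kappa_standard = (p_obs - p_exp) / (1 - p_exp)

kappa_alternative = 2 * (a * d - b * c) / (g1 * f2 + g2 * f1)

print(round(kappa_standard, 2), round(kappa_alternative, 2))   # 0.4 0.4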

ÀԷº¯¼ö´Â °üÂûÀÚµéÀÌ È®½ÇÈ÷ µ¿ÀÇÇϰųª(input A), ȤÀº µ¿ÀÇÇÏÁö ¾Ê°Å³ª(input B), ȤÀº °¡´ÉÇÑ µÎ°¡Áö Á¶ÇÕÀÇ µ¿ÀÇÇÏÁö ¾ÊÀ½(input B¿Í C)À» ÀǹÌÇÕ´Ï´Ù


References

MedCalc CrossRef: Kappa Multicalc Page

Cohen JA, A coefficient of agreement for nominal scales, Educ Psych Meas, 1960; 20:37-46

Cicchetti DV, Feinstein AR, High agreement but low kappa: II. Resolving the paradoxes, J Clin Epidemiol, 1990;43(6):551-558

Bennett EM, Albert R, Goldstein AC, Communications through limited response questioning, Public Opinion Q, 1954; 18:303-308

Scott WA, Reliability of content analysis: The case of nominal scale coding, Public Opinion Q, 1955; 19:321-325

Byrt T, Bishop J, Carlin JB, Bias, prevalence and kappa, J Clin Epidemiol, 1993; 46(5):423-429


 


Legal Notice and Disclaimer

All information contained in and produced by MedCalc 3000 is provided for educational purposes only. This information should not be used to diagnose or treat any health problem or disease. It is not intended in any way to substitute for guidance in the care of individual patients or for clinical judgment.
 

MedCalc 3000 is Copyright © 1998-2005 Foundation Internet Services