We describe the reliability of assessing the appropriateness of requested diagnostic tests. We retrospectively drew a random sample of 253 request forms comprising 1,217 requested tests in total. Three experts independently assessed each requested test. Interrater kappa values ranged from 0.33 to 0.44, and intrarater kappa values from 0.65 to 0.68. The reliability coefficient for all three reviewers combined was 0.66. This reliability is insufficient for case-by-case decisions, for example giving individual feedback on the appropriateness of requested tests. Sixteen reviewers would be necessary to obtain a reference standard with a reliability of 0.95.
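The agreement and projection statistics above can be sketched with two standard tools: Cohen's kappa for chance-corrected agreement between two raters, and the Spearman-Brown prophecy formula for projecting how many raters a target reliability would require. This is an illustrative sketch, not the paper's own computation; the paper's figure of 16 reviewers may rest on a different variance decomposition (e.g. a generalizability analysis), so the classical formula below need not reproduce it exactly. The function names and the binary labels in the usage example are invented for illustration.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

def raters_needed(reliability_k, k, target):
    """Spearman-Brown: raters needed for `target` reliability, given the
    reliability of the mean of k raters."""
    # Back out the single-rater reliability from the k-rater value.
    r1 = reliability_k / (k - (k - 1) * reliability_k)
    # Prophecy formula solved for the required number of raters.
    return target * (1 - r1) / (r1 * (1 - target))

# Hypothetical usage: 1 = appropriate, 0 = inappropriate.
kappa = cohens_kappa([1, 1, 1, 0], [1, 1, 0, 0])
n = raters_needed(0.66, 3, 0.95)  # from the three-reviewer coefficient
```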