Reliability

The overall consistency of a measure.
  • Reliability refers to the ability of a test or assessment to produce consistently accurate results, time after time, regardless of who is performing the assessment. If an assessment cannot produce consistent results, then a professional cannot determine the accuracy of a measurement, compare that measurement to normative data, or reassess and compare measurements taken on two separate dates. Although the psychometrics and statistics used to determine whether a test is truly reliable are somewhat complex, the concept behind reliability is fairly simple and falls into two broad categories (a brief numerical sketch follows the list below).
    • Inter-tester reliability - assesses the agreement (or lack thereof) between the scores of two or more testers assessing the same subjects.
    • Intra-tester reliability - assesses the agreement (or lack thereof) between test scores from one test administration to the next, when administered by a single tester.
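
A minimal, hypothetical sketch (in Python) of how these two forms of reliability might be quantified. The subjects, measurement values, and tester labels are invented for illustration, and Pearson correlation is used only as a simple, dependency-light consistency index; reliability studies more commonly report an intraclass correlation coefficient (ICC).

```python
import numpy as np

# Hypothetical shoulder flexion measurements (degrees) for 8 subjects.
# Inter-tester reliability: tester A and tester B each measure the same
# subjects once, and we check how well their scores agree.
tester_a_day1 = np.array([170, 165, 172, 158, 168, 175, 160, 166])
tester_b_day1 = np.array([168, 166, 171, 160, 167, 173, 162, 165])

# Intra-tester reliability: tester A measures the same subjects again
# on a later date, and we check agreement with the first session.
tester_a_day2 = np.array([171, 164, 173, 159, 167, 174, 161, 167])

# Pearson r as a simple stand-in for the ICC typically reported in
# reliability studies (values near 1.0 indicate high consistency).
inter_tester_r = np.corrcoef(tester_a_day1, tester_b_day1)[0, 1]
intra_tester_r = np.corrcoef(tester_a_day1, tester_a_day2)[0, 1]

print(f"Inter-tester reliability (Pearson r): {inter_tester_r:.2f}")
print(f"Intra-tester reliability (Pearson r): {intra_tester_r:.2f}")
```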

Related Terms

No related terms

Synonyms

  • intertester reliability
  • intra-tester reliability
  • intratester reliability
  • inter-tester reliability
  • reliable