Thursday, September 11, 2008

Validating Multiple-choice Assessments

This post originally appeared on the now-defunct Central Texas Instructional Design blog on this date.

I said I would soon talk about how learners tell you what makes a good question, but that was a couple of months before I started this post, and it has taken me a while longer to publish it. “Soon” must be a relative term.

Although assessment validation may have a negative connotation, try to approach it as a collaborative process (Carpenter, 2006). The goal, after all, is to improve the assessment as a tool for measuring training effectiveness: to make sure it is valid, fair, reliable, and effective. In short, validating an assessment helps ensure that it measures what you want it to measure.

So how do your learners tell you whether a question is effective?

Every learner has an opinion on every question in a multiple-choice assessment, but they can’t tell you in words. In fact, no single learner can tell you anything useful. To know whether a question is good, you need to collect response data from a fairly sizable population.

But before getting into the data requirements, let’s talk about assessment validation techniques. The techniques you choose determine the data you need.

The available techniques fall into one of two categories:

  • Internal validation
  • External validation

Internal validation uses only data gathered from the learners taking the assessment. It compares learner performance on individual questions to performance on the assessment as a whole. If any of these conditions occurs, you know you have a problem:

  • Learners who fail the assessment get a particular question right more often than learners who score highly do.
  • Almost everyone gets the question right.
  • Almost everyone gets the question wrong.

Internal validation requires no data gathering outside of the classroom.
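
To make those internal checks concrete, here is a minimal sketch in Python. It assumes responses arrive as a list of dictionaries, one per learner, mapping question IDs to 1 (correct) or 0 (incorrect); the pass mark and the “almost everyone” thresholds are illustrative assumptions, not standards.

    def item_analysis(responses, pass_mark=0.7, easy=0.95, hard=0.05):
        """Flag questions that show the warning signs listed above."""
        questions = sorted(responses[0].keys())
        # Each learner's overall score as a fraction of questions correct.
        totals = [sum(r.values()) / len(questions) for r in responses]
        flags = {}
        for q in questions:
            correct = [r[q] for r in responses]
            difficulty = sum(correct) / len(correct)  # share who got it right
            # Compare failing learners to passing learners on this question.
            fails = [r[q] for r, t in zip(responses, totals) if t < pass_mark]
            passes = [r[q] for r, t in zip(responses, totals) if t >= pass_mark]
            fail_rate = sum(fails) / len(fails) if fails else 0.0
            pass_rate = sum(passes) / len(passes) if passes else 0.0
            problems = []
            if fail_rate > pass_rate:
                problems.append("low scorers outperform high scorers")
            if difficulty > easy:
                problems.append("almost everyone gets it right")
            if difficulty < hard:
                problems.append("almost everyone gets it wrong")
            if problems:
                flags[q] = problems
        return flags

Run against one class’s results, the function returns only the questions worth a second look, which is the whole point: the learners’ answers, in aggregate, do the talking.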

External validation compares learner performance on the assessment to some external metric, usually job performance. It requires you to study learners over a much longer period than internal validation does.
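
For completeness, here is an equally minimal sketch of the external approach: correlate each learner’s assessment score with a job-performance metric gathered later. The variable names and sample figures below are purely hypothetical.

    def pearson_r(xs, ys):
        """Correlation between assessment scores and an external metric."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    assessment_scores = [82, 74, 91, 65, 88]      # hypothetical scores
    job_performance = [3.9, 3.1, 4.4, 2.8, 4.0]   # e.g., later review ratings
    print(pearson_r(assessment_scores, job_performance))

A strong, stable correlation would be evidence that the assessment predicts performance on the job; in my experience, described below, that is rare.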

My experience with external validation and job performance has been that businesses tend to react to external stimuli faster than training materials and assessments can be updated and validated. I have never seen an assessment show a significant correlation to job performance (other than one entry assessment, and that only for two quarters). Because non-disclosure agreements prevent me from citing my data, you have only my word that external validation tends to be too unreliable and too expensive to be useful in a corporate training environment.

References
