Title: Mutual Information Item Selection in Adaptive Classification Testing
Publication Type: Journal Article
Year of Publication: 2007
Authors: Weissman, A
Journal: Educational and Psychological Measurement
Volume: 67
Number: 1
Pagination: 41-58
Abstract:

A general approach for item selection in adaptive multiple-category classification tests is provided. The approach uses mutual information (MI), a special case of the Kullback-Leibler distance, or relative entropy. MI works efficiently with the sequential probability ratio test and alleviates the difficulties encountered when using other local- and global-information measures in the multiple-category classification setting. Results from simulation studies using three item selection methods, Fisher information (FI), posterior-weighted FI (FIP), and MI, are provided for an adaptive four-category classification test. Both across and within the four classification categories, it is shown that, in general, MI item selection classifies the highest proportion of examinees correctly and yields the shortest test lengths. The next best performance is observed for FIP item selection, followed by FI.
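The abstract does not give formulas, but the core idea of MI-based item selection can be sketched: for each candidate item, compute the mutual information between the examinee's (unknown) classification category and the dichotomous item response, then administer the item that maximizes it. The sketch below is illustrative only, not the author's implementation; it assumes a 2PL IRT model, a discretized posterior over ability, and three cut scores yielding four categories, and all names and pool values are hypothetical.

```python
import numpy as np

def irt_2pl(theta, a, b):
    """Probability of a correct response under a 2PL IRT model (assumed here)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def mutual_information(posterior, theta_grid, cuts, a, b):
    """MI between category membership C (theta partitioned at `cuts`)
    and the dichotomous item response X, for one candidate item."""
    p1 = irt_2pl(theta_grid, a, b)          # P(X=1 | theta)
    cat = np.searchsorted(cuts, theta_grid)  # category label per grid point
    n_cat = len(cuts) + 1
    # Joint P(X, C): sum posterior mass within each category
    joint = np.zeros((2, n_cat))
    for c in range(n_cat):
        mass = posterior[cat == c]
        p1c = p1[cat == c]
        joint[1, c] = np.sum(mass * p1c)
        joint[0, c] = np.sum(mass * (1.0 - p1c))
    px = joint.sum(axis=1)                   # marginal P(X)
    pc = joint.sum(axis=0)                   # marginal P(C)
    mi = 0.0
    for x in range(2):
        for c in range(n_cat):
            if joint[x, c] > 0:
                mi += joint[x, c] * np.log(joint[x, c] / (px[x] * pc[c]))
    return mi

# Illustrative setup: a roughly normal posterior over ability and
# three cut scores defining a four-category classification test.
theta = np.linspace(-4, 4, 161)
post = np.exp(-0.5 * theta**2)
post /= post.sum()
cuts = [-1.0, 0.0, 1.0]

# Hypothetical pool of (discrimination a, difficulty b) pairs;
# select the item with maximum mutual information.
pool = [(1.2, -2.5), (1.0, 0.1), (0.8, 2.8)]
scores = [mutual_information(post, theta, cuts, a, b) for a, b in pool]
best = int(np.argmax(scores))
```

As expected under this setup, the item whose difficulty lies near the cut scores and the posterior mode carries the most classification information, illustrating why MI selection targets the decision boundaries rather than simply the current ability estimate.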

URL: http://epm.sagepub.com/content/67/1/41.abstract
DOI: 10.1177/0013164406288164