Diploma Thesis: "Improving Computer-Adaptive Psychological Tests"
"Automatic Item Generation and Alternative Response Formats"
This is an interdisciplinary effort located at the Chair of Industrial and Organizational Psychology (Prof. Dr. Hornke).
The objective is to develop a prototype of a computer-adaptive test (CAT) of analogical reasoning. In contrast to classical fixed-length tests, CATs attempt to tailor themselves to the test taker's ability, making it possible to balance test length against precision of measurement. The downside is that a much larger item pool is required. This problem can be tackled by generating items automatically on demand, based on an empirically validated difficulty model, a theoretical cognition-based framework, or both.
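The core of the adaptive mechanism described above is, in standard CAT practice, to repeatedly pick the item that is most informative at the current ability estimate. The following sketch illustrates this with the Rasch (1PL) model; it is not the thesis's actual implementation, and the function names and the hard-coded difficulty values are illustrative assumptions.

```python
import math

def rasch_probability(theta, b):
    """Probability of a correct response under the Rasch (1PL) model,
    given ability theta and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information of a Rasch item at ability theta: p * (1 - p).
    Information peaks when the item difficulty matches the ability."""
    p = rasch_probability(theta, b)
    return p * (1.0 - p)

def select_next_item(theta, item_difficulties, administered):
    """Maximum-information item selection: among items not yet
    administered, pick the one most informative at theta."""
    candidates = [i for i in range(len(item_difficulties))
                  if i not in administered]
    return max(candidates,
               key=lambda i: item_information(theta, item_difficulties[i]))

# Illustrative pool of item difficulties (logit scale, assumed values).
pool = [-2.0, -1.0, 0.1, 1.0, 2.0]
# For an estimated ability of 0.0, the item with difficulty 0.1 is chosen,
# since its difficulty lies closest to the current ability estimate.
next_item = select_next_item(0.0, pool, administered=set())
```

In a full CAT loop, the ability estimate theta would be updated after each response (e.g. by maximum-likelihood or Bayesian estimation) before the next item is selected; that estimation step is omitted here for brevity.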
Another interesting aspect of tests is the response format used: multiple-choice items, though common and familiar, are biased by guessing and by alternative response strategies such as response elimination. Their major advantage is ease of scoring, which allows immediate feedback in computerized tests. Alternatives to selected-response formats like multiple-choice are constructed-response formats: no response options are provided, and the test taker has to construct his or her own solution. The drawback of constructed-response formats is that scoring usually cannot be automated.
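The guessing bias mentioned above can be quantified with the classical correction-for-guessing formula, R - W/(k-1), where R is the number of right answers, W the number of wrong answers, and k the number of response options per item. This is standard psychometric background, not a method taken from the thesis itself:

```python
def expected_chance_score(num_items, num_options):
    """Expected number of correct answers from pure random guessing:
    each item is guessed correctly with probability 1/k."""
    return num_items / num_options

def correct_for_guessing(num_right, num_wrong, num_options):
    """Classical correction-for-guessing score: R - W/(k-1).
    Omitted items are neither rewarded nor penalized."""
    return num_right - num_wrong / (num_options - 1)

# On a 40-item test with 5 options each, pure guessing yields
# 8 correct answers on average.
chance = expected_chance_score(40, 5)

# A test taker with 30 right and 10 wrong answers receives a
# corrected score of 30 - 10/4 = 27.5.
score = correct_for_guessing(30, 10, 5)
```

Constructed-response formats avoid this correction entirely, since there are no options to guess among, which is one motivation for the mixed formats explored in the thesis.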
In this thesis, we consider three different response formats in an attempt to find a balanced combination of multiple-choice and constructed-response characteristics.
For more information, the proposal is available (see attached file).
The final presentation (190.82 kB) is currently restricted to internal use.