Using scientific research to improve H5P: Creating “Discrete Option Multiple Choice”

Please note: This is not a blog post that’s intended to be self-explanatory; it’s merely a place where I dumped some material that is supposed to accompany a talk. So if you think that something is missing here, you are probably correct 🙂


You will know this one: a traditional multiple choice question created with H5P (TDMC)


The new kid in town: Discrete Option Multiple Choice (DOMC)

See details at https://www.olivertacke.de/labs/2023/04/28/a-new-h5p-content-type-discrete-option-multiple-choice/
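In short: instead of showing all answer options at once, DOMC presents them one at a time, and for each option you decide whether it is correct or not. Once you have made that choice, there is no going back. For those who prefer code to prose, here is a minimal sketch of that answering loop in TypeScript. This is not the actual H5P implementation; the function names and the stopping rule (every option is presented) are my assumptions for illustration.

```typescript
interface Option {
  text: string;
  isCorrect: boolean; // true for the keyed answer, false for a distractor
}

// Presents the options one at a time; the learner answers yes/no to
// each and cannot revisit earlier decisions. The item counts as correct
// only if the keyed answer is accepted AND every distractor is rejected.
async function runDomcItem(
  options: Option[],
  askYesNo: (text: string) => Promise<boolean>
): Promise<boolean> {
  let correct = true;
  for (const option of shuffle(options)) {
    const accepted = await askYesNo(option.text);
    if (accepted !== option.isCorrect) {
      correct = false; // a single wrong decision spoils the whole item
    }
  }
  return correct;
}

// Fisher-Yates shuffle, so the position of the keyed answer varies
// between test takers (keep this in mind for Eckerly et al. below).
function shuffle<T>(items: T[]): T[] {
  const result = [...items];
  for (let i = result.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [result[i], result[j]] = [result[j], result[i]];
  }
  return result;
}
```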


Test-Wiseness: Comparison of TDMC and DOMC

Have a look at the following questions. You will probably find that the traditional multiple choice questions can be solved quite well with a little bit of “test-wiseness”, even if you don’t know anything about the subject. What do you think about this issue when looking at the variants using the discrete option multiple choice format?

Traditional Multiple Choice

Discrete Option Multiple Choice


Some research results

Foster & Miller (2009)

Foster, D., & Miller Jr, H. L. (2009). A new format for multiple-choice testing: Discrete-Option Multiple-Choice. Results from early studies. Psychological Test and Assessment Modeling, 51(4), 355.

The effects were seen in the increased difficulty and discriminative ability of items, reduction of the time needed to complete the assessment, and the improved security of the assessment.
One other effect is worth noting and may deserve a research program all its own. Because it reduces the effects of test-taking skills as well as attempts at testing fraud, the DOMC format improves the fairness of the assessment.

Kingston, Tiemann, Miller & Foster (2012)

Kingston, N. M., Tiemann, G. C., Miller Jr, H. L., & Foster, D. (2012). An analysis of the discrete-option multiple-choice item type. Psychological Test and Assessment Modeling, 54(1), 3.

Results showed that, across all forms, MC items were consistently easier than DOMC items. Mean scores from DOMC item sets were also lower than for MC sets, which could possibly be attributed to a reduced impact of testwiseness in responding to DOMC items or alternatively to the fact that examinees cannot revisit options once they make a choice to select or not select an option.

Based on the results of this study, there appear to be no psychometric reasons for excluding DOMC items from testing programs.

Willing (2013)

Willing, S. (2013). Discrete-option multiple-choice: Evaluating the psychometric properties of a new method of knowledge assessment (Doctoral dissertation).

Taken together, the psychometric properties of DOMC testing did not surpass but were able to match those of the format hitherto considered to be the most valid for an objective assessment of knowledge. In view of some of its unique new features, the sequential answering format therefore seems to offer a promising alternative to the traditional MC format.

Bolt et al. (2020)

Bolt, D. M., Kim, N., Wollack, J., Pan, Y., Eckerly, C., & Sowles, J. (2020). A psychometric model for discrete-option multiple-choice items. Applied Psychological Measurement, 44(1), 33-48.

To the extent that incorrect responses on DOMC items can occur in either of two ways (i.e., not selecting a keyed response OR selecting a distractor), the correctness of a DOMC item score can be viewed as the outcome of conjunctively interacting processes, namely, successes in ruling out distractors AND successes in selecting keyed responses.
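In symbols, one could sketch this conjunctive view like so (my simplification, assuming independent option-level decisions; the actual Bolt et al. model is a more elaborate IRT parameterization):

$$
P(\text{item correct}) = P(\text{select the keyed option}) \cdot \prod_{j=1}^{k} P(\text{reject distractor } j)
$$

A single slip on any of the $k+1$ presented options spoils the whole item, which is one way to see why DOMC items turn out harder than their traditional counterparts (cf. Kingston et al. above).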

Eckerly, Smith & Sowles (2018)

Eckerly, C., Smith, R., & Sowles, J. (2018). Fairness Concerns of Discrete Option Multiple Choice Items. Practical Assessment, Research & Evaluation, 23(16).

We have shown that item difficulty and discrimination varied substantially for the DOMC items in this dataset, depending on key position, leading test takers to see forms of varying difficulty and reliability.

We recommend that testing programs not use DOMC items until a methodology is developed to address the fairness and measurement model fit issues described in this paper. As shown in this study, failing to do so can introduce significant fairness issues with respect to varying item difficulty and discrimination.


H5P would not be H5P without some more options 🙂

Let’s have a look at what you can change!

For instance, you can ask students to state how confident they are in their answer in order to improve self-reflection, self-assessment, and the ability to recognize one’s own strengths and weaknesses (metacognitive monitoring), again backed by scientific research (Barenberg, J., & Dutke, S. (2022). Testen als evidenzbasierte Lernmethode: Empirische und theoretische Gründe für eine Anwendung im Unterricht [Testing as an evidence-based learning method: Empirical and theoretical reasons for its use in teaching]. Unterrichtswissenschaft, 50(1), 17-36.)
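As a sketch of what could be done with such a confidence rating, consider the following. Everything here is hypothetical; the data model and field names are mine for illustration, not the actual parameters of the content type.

```typescript
// Hypothetical record of one answered option plus the learner's
// self-rated confidence (NOT the actual H5P DOMC data model).
type Confidence = 'low' | 'medium' | 'high';

interface RatedAnswer {
  wasCorrect: boolean;
  confidence: Confidence;
}

// Calibration feedback for metacognitive monitoring: highlight the
// answers where confidence and correctness disagree.
function calibrationFeedback(answers: RatedAnswer[]): string {
  const overconfident = answers
    .filter((a) => !a.wasCorrect && a.confidence === 'high').length;
  const hiddenStrengths = answers
    .filter((a) => a.wasCorrect && a.confidence === 'low').length;
  return `Confident but wrong: ${overconfident} time(s). ` +
    `Correct but unsure: ${hiddenStrengths} time(s).`;
}
```

The point is not the arithmetic but the mirror it holds up: answers that were confident but wrong are exactly the ones worth revisiting.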


4 Replies to “Using scientific research to improve H5P: Creating “Discrete Option Multiple Choice””

  1. This is great, I’ll have to read through those papers.

    Meanwhile, I love the UI for a rapid-fire True or False self-assessment. You could call it “lightning round” or something like that.

    Using the option fields for true or false questions, “Present all options” and “Only one item visible at a time”, the only thing missing right now is that wrong answers don’t give you a negative score.

    Great work!

    1. I see. One would need to evaluate whether such a “Speedy True/False” content type would be in high demand. I could at least think of a couple of extra features, such as a time limit, but then again, without point deduction or at least documenting the number of wrong answers, one would test clicking speed rather than knowledge. This is probably something that an instructional designer together with a UX designer should look into first.
