Stop Student Complaints
By Improving Test Question Quality

 

 

Karen R. Young
Assistant Dean and Director of Undergraduate Programs
College of Humanities and Social Sciences

 

  Emily Wicker Ligon
Lead Instructional Designer
Distance Education & Learning Technology Applications (DELTA)

 

  Diane Chapman
Teaching Associate Professor
Department of Leadership, Policy and Adult and Higher Education
Director, Office of Faculty Development

 

  Henry Schaffer
Professor Emeritus of Genetics and Biomathematics
Coordinator of Special OIT Projects & Faculty Collaboration

 

 

OFD Workshop

February 12, 2014

 

 

Introductions

Why is this workshop important?

Learning Objectives for this Workshop

Purpose of Test Items (Formative and Summative)
Examples of bad questions
Item Analysis
Discussion of Strategy in Selecting Questions - difficulty levels, point biserials (pbs), ... (see the sketch after this list)
Item Analysis Services on Campus
Improving test items
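The difficulty level is simply the proportion of students who answer an item correctly, and the point biserial (pb) is the correlation between getting the item right and the total test score. The sketch below is a minimal Python illustration using a small hypothetical 0/1 score matrix, not data from any actual course; the names scores, item_difficulty, and point_biserial are ours, not part of any campus item analysis service.

    import statistics

    # Hypothetical 0/1 score matrix: rows are students, columns are items.
    scores = [
        [1, 1, 0, 1],
        [1, 0, 0, 1],
        [0, 1, 1, 1],
        [1, 1, 0, 0],
        [0, 0, 1, 1],
    ]

    def item_difficulty(item_col):
        # Proportion of students answering the item correctly.
        return sum(item_col) / len(item_col)

    def point_biserial(item_col, totals):
        # Correlation between a 0/1 item and the total test score.
        p = item_difficulty(item_col)
        mean_right = statistics.mean(t for t, x in zip(totals, item_col) if x == 1)
        mean_wrong = statistics.mean(t for t, x in zip(totals, item_col) if x == 0)
        sd = statistics.pstdev(totals)   # population SD of total scores
        return (mean_right - mean_wrong) / sd * (p * (1 - p)) ** 0.5

    totals = [sum(row) for row in scores]
    for i in range(len(scores[0])):
        col = [row[i] for row in scores]
        print("Item %d: difficulty %.2f, point biserial %.2f"
              % (i + 1, item_difficulty(col), point_biserial(col, totals)))

In practice the item being analyzed is often excluded from the total score (a "corrected" point biserial), which this simple sketch does not do.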
In addition to working on improvement after considering Item Analysis, here are some basic MCQ suggestions taken from Testing and Evaluation in the Biological Sciences (Report of the Panel on Evaluation and Testing, Commission on Undergraduate Education in the Biological Sciences, CUEBS Publication 20, November 1967). Available for download at no charge from: http://ofd.ncsu.edu/wordpress/wp-content/uploads/2013/09/testing-and-evaluation-in-the-biological-sciences.pdf
Characteristics that a satisfactory multiple-choice item should possess:
a. the stem sets forth a single precise unambiguous task for the student to do;
b. the stem is followed by a homogeneous set of responses, parallel in construction;
c. no response can be eliminated because of grammatical inconsistency with the stem;
d. the responses contain no verbal associations that provide irrelevant clues to the answer;
e. the correct response is not more elaborate in phraseology than the incorrect ones;
f. to the student who does not perceive the problem or know the answer, each response may appear to be a plausible answer.
Terminology - item/question, stem, correct answer, foils, distractors.

Writing multiple choice questions
Preparing better MCQs
More detailed guidance on writing MCQs

Overview & Looking Forward

Overview - Item Analysis can point out questions which need work.
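As one illustration of how item statistics can flag questions for review: the thresholds below are common rules of thumb, not OFD or DELTA recommendations, and the statistics are hypothetical (e.g., taken from an item analysis report or computed as in the earlier sketch).

    # Hypothetical item statistics.
    stats = {
        "Q1": {"difficulty": 0.95, "point_biserial": 0.05},
        "Q2": {"difficulty": 0.62, "point_biserial": 0.41},
        "Q3": {"difficulty": 0.18, "point_biserial": 0.12},
    }

    for item, s in stats.items():
        reasons = []
        if not 0.30 <= s["difficulty"] <= 0.90:
            reasons.append("difficulty outside 0.30-0.90")
        if s["point_biserial"] < 0.20:
            reasons.append("point biserial below 0.20")
        if reasons:
            print(item + ": consider revising (" + "; ".join(reasons) + ")")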

Immediate goal - better questions covering your course learning objectives

Longer-range goals - annotation of test items ⇒ allowing computerized analysis of student learning (especially important in large classes)
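A minimal sketch of what annotated items could enable, assuming each item is tagged with the learning objective it assesses; the item IDs, objective labels, and results below are hypothetical.

    from collections import defaultdict

    # item id -> learning objective the item assesses (hypothetical tags)
    annotations = {
        "Q1": "LO1: define basic item analysis terms",
        "Q2": "LO1: define basic item analysis terms",
        "Q3": "LO2: interpret a point biserial value",
        "Q4": "LO2: interpret a point biserial value",
    }

    # student -> {item id: 1 if correct, 0 if incorrect} (hypothetical)
    results = {
        "student_a": {"Q1": 1, "Q2": 1, "Q3": 0, "Q4": 1},
        "student_b": {"Q1": 1, "Q2": 0, "Q3": 0, "Q4": 0},
        "student_c": {"Q1": 0, "Q2": 1, "Q3": 1, "Q4": 1},
    }

    # Roll item-level results up to percent correct per learning objective.
    correct = defaultdict(int)
    attempts = defaultdict(int)
    for answers in results.values():
        for item, score in answers.items():
            correct[annotations[item]] += score
            attempts[annotations[item]] += 1

    for objective in sorted(correct):
        print("%s: %.0f%% correct"
              % (objective, 100 * correct[objective] / attempts[objective]))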

Q & A

 

Resources

A nice non-technical overview and justification for the use of Item Analysis. This is part of more general coverage of Item Analysis. A related topic is How to Write Tests. (These are temporary locations - probably to be moved soon.)

A description of Item Analysis and its use, with discussion of the Index of Discrimination (which gives the same type of information as the point-biserial correlation).
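For reference, with notation we are assuming here (not taken from that resource): if $p$ is the item difficulty, $\bar{X}_1$ and $\bar{X}_0$ are the mean total scores of students who answered the item correctly and incorrectly, and $s_X$ is the standard deviation of total scores, then

    r_{pb} = \frac{\bar{X}_1 - \bar{X}_0}{s_X}\sqrt{p(1 - p)},
    \qquad
    D = p_{\text{upper}} - p_{\text{lower}}

where $D$, the Index of Discrimination, is the proportion correct in a high-scoring group minus the proportion correct in a low-scoring group (the top and bottom 27% is a common choice). Both statistics grow when stronger students answer the item correctly more often than weaker students, which is why they convey the same type of information.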

Fall 2013 OFD workshop on using Bloom's Taxonomy of Cognitive Objectives to construct test items at the various cognitive levels.

 

 


http://www.ncsu.edu/it/open_source/Item.html


Copyright 2013, 2014, 2017 by Henry E. Schaffer, Karen R. Young, Emily Wicker Ligon & Diane Chapman. Comments and suggestions are welcome and should go to hes@ncsu.edu
Last modified 3/3/2017
Disclaimer - Information is provided for your use. No endorsement is implied.