Item Analysis and Test Standardization

Item analysis is an important (probably the most important) tool for increasing test effectiveness. Each item's contribution to the test is analyzed and assessed.

To write effective items, it is necessary to examine whether they are measuring the fact, idea, or concept for which they were intended. This is done by studying the students' responses to each item. When formalized, the procedure is called "item analysis". It is a scientific way of improving the quality of tests and test items in an item bank.

An item analysis provides three kinds of important information about the quality of test items.

  • Item difficulty: A measure of whether an item was too easy or too hard.
  • Item discrimination: A measure of whether an item discriminated between students who knew the material well and students who did not.
  • Effectiveness of alternatives: Determination of whether distractors (incorrect but plausible answers) tend to be marked by the less able students and not by the more able students.

Item difficulty, item discrimination and the effectiveness of distractors on a multiple-choice test are automatically available with ParScore’s item analysis. An illustration of ParScore’s “Standard Item Analysis Report” printout is attached.
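To make these three statistics concrete, the following Python sketch computes them for a single multiple-choice item. The response data, the option labels, the keyed answer, and the simple upper/lower split by total score are all illustrative assumptions; this is not ParScore's actual procedure (formal item analyses often contrast the top and bottom 27% of scorers rather than halves).

    # A minimal sketch (not ParScore's algorithm) of the three item statistics
    # described above, for one multiple-choice item with invented response data.
    # Each tuple is (total test score, option chosen); "B" is the keyed answer.
    responses = [
        (38, "B"), (36, "B"), (35, "B"), (33, "C"), (31, "B"),   # higher scorers
        (22, "B"), (20, "C"), (18, "A"), (17, "D"), (15, "C"),   # lower scorers
    ]
    key = "B"

    # Item difficulty: proportion of all examinees answering correctly.
    p = sum(1 for _, choice in responses if choice == key) / len(responses)

    # Item discrimination: split examinees into upper and lower halves by total
    # score, then D = (proportion correct in upper) - (proportion correct in lower).
    ranked = sorted(responses, key=lambda r: r[0], reverse=True)
    half = len(ranked) // 2
    upper, lower = ranked[:half], ranked[half:]
    p_upper = sum(1 for _, c in upper if c == key) / len(upper)
    p_lower = sum(1 for _, c in lower if c == key) / len(lower)
    D = p_upper - p_lower

    # Effectiveness of alternatives: each distractor should draw more lower-group
    # than upper-group examinees.
    for option in ("A", "C", "D"):
        n_up = sum(1 for _, c in upper if c == option)
        n_lo = sum(1 for _, c in lower if c == option)
        print(f"distractor {option}: upper group {n_up}, lower group {n_lo}")

    print(f"difficulty p = {p:.2f}, discrimination D = {D:.2f}")

With these invented numbers the difficulty works out to 0.50 and the discrimination to 0.60, and every distractor attracts more lower-group than upper-group examinees, which is the pattern a well-functioning item should show.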

GENERAL GUIDELINES FOR ITEM WRITING:

Writing items is a matter of precision. It is perhaps more like computer programming than writing prose. The task of the item writer is to focus the attention of a large group of examinees, varying in background experience, environmental exposure, and ability level, on a single idea. Such a situation requires extreme care in the choice of words. The item writer must keep in view some general guidelines which are essential for writing good items. These are listed below:

CLARITY OF THE ITEM:

Clarity in writing test items is one of the main requirements for an item to be considered good. Items must not be written as "verbal puzzles". They must be able to discriminate between those who are competent and those who are not. This is possible only when the items have been written in simple and clear language. The items must not be a test of the examinee's ability to understand the language. The item writer should be particularly cautious when writing objective items, because each such item provides a more or less isolated bit of knowledge, and there the problem of clarity is more serious. If an objective item is vague, it will create difficulty in understanding and the validity of the item will be adversely affected. Vagueness in writing items may arise from several causes, such as unclear thinking or incompetence on the part of the item writer.

NON-FUNCTIONAL WORDS SHOULD BE AVOIDED:

Non-functional words must not be included in the items as they tend to lower the validity of the item. Non-functional words refer to those words which make no contribution towards the appropriate and correct choice of a response by the examinees. Such words are often included by the item writer in an attempt to make the correct answer less obvious or to provide a good distractor.

AVOID IRRELEVANT ACCURACIES:

The item writer must make sure that irrelevant accuracies are not unintentionally incorporated into the items. Such irrelevant accuracies reflect poor critical thinking on the part of the item writer. They may also lead the examinees to think that the statement is true.

DIFFICULTY LEVEL SHOULD BE ADAPTABLE:

The item must not be too easy or too difficult for the examinees. The level of difficulty of the item should be adaptable to the level of understanding of the examinees. Although an exact decision regarding the difficulty value of an item can be made only after statistical techniques have been employed, an experienced item writer is capable of controlling the difficulty value beforehand and making it suitable for the examinees.

In certain forms of objective-type items, such as multiple-choice items and matching items, it is very easy to increase or decrease the difficulty value of the item. In general, when the response alternatives are made homogeneous, the difficulty value of the item is increased; but when the response alternatives are made heterogeneous, the correct alternative stands out, the examinee is likely to choose it quickly, and thus the level of difficulty is decreased. The item writer must keep in view the characteristics of both the ideal examinees and the typical examinees. If he keeps only the ideal examinees (who are fewer in number) in view and ignores the typical examinees, the test items are likely to be unreasonably difficult.

STEREOTYPED WORDS SHOULD BE AVOIDED:

Use of stereotyped words, either in the stem or in the alternative responses, must be avoided because these help rote learners guess the correct answer. Moreover, such stereotyped words fail to discriminate between those who really know and understand the subject and those who do not; thus, stereotyped words do not provide an adequate index of discrimination. The most obvious way of getting rid of such words is to paraphrase them in a different manner so that only those who really know the answer can pick up the meaning.

IRRELEVANT CLUES MUST BE AVOIDED:

Irrelevant clues must be avoided. These are sometimes provided in several forms, such as clang association, verbal association, the length of the answer, keeping a dissimilar foil among homogeneous foils, placing the correct answer in the same position, etc. In general, such clues tend to decrease the difficulty level of the item because they provide an easy route to the correct answer. The common observation is that examinees who do not know the correct answer pick up any of these irrelevant clues and answer on that basis. The item writer must, therefore, take special care to avoid such irrelevant clues. Specific determiners like never, always, all, and none must also be avoided because they, too, are irrelevant clues to the correct answer, especially in two-alternative items.

INTERLOCKING ITEMS MUST BE AVOIDED:

Interlocking items must be avoided. Interlocking items, also known as interdependent items, are items that can be answered only by referring to other items. In other words, when responding correctly to an item depends upon the correct response to some other item, the item constitutes an example of an interlocking or interdependent item. For example:

  • Sociometry is a technique used to study the affect structure of groups. (True/False)
  • It is a kind of projective technique. (True/False)
  • It was developed by Moreno et al. (True/False)

The above examples illustrate interlocking items. Answers to items 2 and 3 can be given only when the examinee knows the correct answer to item 1. Such items should be avoided because they do not give all examinees an equal chance to answer each item.
