What is EXAM?
EXAM is a general-purpose system for scoring OPTICAL MARK SENSE answer sheets (specifically the GENERAL PURPOSE – NCS-ANSWER SHEET available through the Bookstore).

Features of EXAM

 

  • Custom output – only those pages that are actually desired will be printed.
  • Score scaling – a variety of scaling methods are available and are easily chosen.
  • Correction for guessing – an amount may be subtracted for all wrong answers.
  • Individual student sheets – these sheets may be printed with only correct responses marked, only incorrect responses marked, both marked, or not printed at all.
  • Instructor listings – score lists for instructors may be printed in student name, score, and/or ID order.
  • Multiple sections – many sections of a course taking the same exam can be graded at one time.
  • Reusable scan sheets – student scan sheets may be used again by using different question numbers. This can turn into huge savings for departments.
  • Multiple key sheets – each question may have one to five correct responses.
  • Item analysis – the proportion responding correctly and the percentage choosing each response are available for each question.

 

How Do I Use EXAM?
EXAM may be used with as few as two scan sheets in addition to the student response sheets. One is the OPTIONS sheet and the other is the KEY sheet. (“Marking” as it is used below means placing the letter in a column and filling the associated bubble.)

The OPTIONS sheet is the first sheet of the stack and must have the INSTRUCTOR NAME marked in the NAME field, a “1” marked in SPECIAL CODES column L, and either an “A” or a “B” marked in ITEM #1 of the ANSWERS section. An “A” in this item requests standard exam scoring with individual student sheets printed. A “B” requests standard exam scoring but does not print individual student sheets. Other options may be marked in this area as well; see the following section for details. This sheet may be re-used each time an exam is scored. If more than one exam is submitted at one time, each must have its own OPTIONS sheet with a unique INSTRUCTOR NAME (e.g., precede the name with an “A” on the first exam, a “B” on the second exam, etc.).

The KEY sheet is the second sheet in the stack, and must have the correct response marked in the answer portion of the sheet for each item to be checked. It is a good idea to mark the name of the exam in the NAME field.

STUDENT sheets come next. The student should put his/her STUDENT NAME in the NAME field (with the last name first) and STUDENT ID in the ID NUMBER columns A – I. Responses to exam questions should be marked in the items in the ANSWER section.


These are the minimal requirements to have exams scored using EXAM. It is recommended that EXAM users start with these configurations. If customization is desired, please refer to the following sections.

Exam stacks are accepted by the University Computing Services (fourth floor of the Drinko Library) 8:00 am – 4:30 pm, Monday through Friday. The service request form should specify exam scoring. A space is provided on the form if someone else is to pick up the output. Turnaround during non-peak times should be between 15 and 60 minutes.

Scan sheets, like all machine readable materials, should not be “Folded, Spindled, or Mutilated” or stapled, rolled, paper-clipped, or marked on except in specified places.

Customization Options

The easiest way to use EXAM is with a standard set-up. We currently have two, with the difference between them being the printing of individual student sheets. We will defer comment on each portion of the “standard” set-up to the appropriate section below. The default set-up is (a). No other option needs to be selected unless you desire to change part of the standard set-up; then select only the options you want altered.

1. Standard set-up:
a. Standard exam scoring with:
test score analysis printed
distribution chart of scaled score printed
test score listing by name printed single-spaced
student sheets with correct & incorrect key items printed
item analysis printed
“percent correct” as scaling method
no “letter grade”
one possible correct response per item
responses printed as letter
no correction for guessing

b. Standard exam scoring with:
test score analysis printed
distribution chart of scaled score printed
test score listing by name printed single-spaced
student sheets not printed
item analysis printed
“percent correct” as scaling method
no “letter grade”
one possible correct response per item
responses printed as letter
no correction for guessing

The test score analysis page gives statistics on the scores after EXAM has checked and graded each student’s response sheet. Test score analysis is divided into four sections. The SUMMARY section shows information about the scoring process (e.g., number of student sheets, number that had at least one response bubble filled or no response bubbles filled, number of questions checked, etc.). The RAW-SCORE section shows descriptive statistics for the students’ raw scores. The raw score is the number correct minus any correction amount for guessing. The SCALED-SCORE section includes statistics for the students’ raw scores after they’ve been scaled according to the chosen scaling method. A “FIRST-GLANCE” section at the top of the page gives important information on the scaled scores in an easy-to-find place. The three statistics sections count only sheets that had at least one response bubble filled. This prevents students who turn in a sheet but don’t answer any questions from skewing the score statistics. This page is printed unless specifically excluded by filling the (b) circle for response number 2 on the options sheet.

2. Test score analysis
a. Printed
b. Not printed
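
For those who want to reproduce these figures from their own records, the following sketch (illustrative Python, with made-up field names, not EXAM’s actual code) computes the kind of summary, raw-score, and scaled-score statistics described above, counting only sheets with at least one response bubble filled.

    # Sketch of the statistics EXAM reports on the test score analysis page.
    # Illustrative only; field names are made up, and the raw and scaled
    # scores are assumed to have been computed already.
    from statistics import mean, median, pstdev

    def score_analysis(sheets):
        """sheets: list of dicts like {"responses": 47, "raw": 42.0, "scaled": 84.0}"""
        answered = [s for s in sheets if s["responses"] > 0]   # at least one bubble filled
        blank = [s for s in sheets if s["responses"] == 0]     # turned in, nothing answered
        summary = {"sheets read": len(sheets),
                   "sheets with responses": len(answered),
                   "blank sheets": len(blank)}
        if not answered:
            return summary, {}, {}

        def stats(xs):
            return {"mean": mean(xs), "median": median(xs),
                    "std dev": pstdev(xs), "high": max(xs), "low": min(xs)}

        raw_stats = stats([s["raw"] for s in answered])         # RAW-SCORE section
        scaled_stats = stats([s["scaled"] for s in answered])   # SCALED-SCORE section
        return summary, raw_stats, scaled_stats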

The test score distribution chart graphically shows the distribution of scores as well as the frequency and percentile of each score. By looking at the CUMULATIVE FREQUENCY column beside each score, you can see the number of students getting that score or higher. Likewise, the CUMULATIVE PERCENT (also called PERCENTILE) shows that number of students as a percent of all students taking the exam. This can be used, for example, to find the percent getting a grade of B or C (by subtracting the percent getting an A from the percent getting a C or better). By default, only the distribution of the scaled scores is printed.

3. Test score distribution chart
a. Distribution of scaled score only printed
b. Distribution of raw score only printed
c. Distribution of scaled score and raw score printed
d. Neither printed
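
As a rough illustration of how the chart’s columns relate, this sketch (illustrative Python, not part of EXAM) tallies the frequency, cumulative frequency (that score or higher), and cumulative percent for a list of scaled scores.

    # Sketch: frequency, cumulative frequency, and cumulative percent per score.
    from collections import Counter

    def distribution(scores):
        counts = Counter(scores)
        total = len(scores)
        running = 0
        rows = []
        for score in sorted(counts, reverse=True):    # highest score first
            running += counts[score]                  # students at this score or higher
            rows.append((score, counts[score], running, 100.0 * running / total))
        return rows   # (score, frequency, cumulative frequency, cumulative percent)

The percent getting a B or C, for instance, is the cumulative percent at the lowest C score minus the cumulative percent at the lowest A score.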

Listings of all students taking the test are available in three orders. The listings all show scaled score, raw score, number correct, number incorrect, number of no responses, number of multiple responses, and a grade if a “Grading breakpoints” option is chosen. The “no response” and “multiple response” fields are handy for checking whether a sheet might have been read incorrectly by the scanner, either from light marks or incomplete erasures. The “listing by name” and “listing by score” also include the students’ names and student ID numbers. For score posting purposes, the “listing by student ID” does not list the students’ names. If multiple course sections are being graded at the same time, a separate sheet will be printed for each instructor/course/section. By default, a listing in student name order is printed single-spaced.

4. Test Score listing by student name
a. Printed single-spaced
b. Printed double-spaced
c. Not printed
5. Test Score listing by score
a. Printed single-spaced
b. Printed double-spaced
c. Not printed
6. Test Score listing by student ID
a. Printed single-spaced
b. Printed double-spaced
c. Not printed
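
The three listings amount to the same records sorted three different ways; a minimal sketch, with hypothetical field names:

    # Sketch: the same score records ordered three ways, as in options 4-6.
    students = [{"name": "DOE JOHN", "id": "901234567", "scaled": 84.0, "raw": 42.0},
                {"name": "ROE JANE", "id": "901765432", "scaled": 92.0, "raw": 46.0}]
    by_name = sorted(students, key=lambda s: s["name"])
    by_score = sorted(students, key=lambda s: s["scaled"], reverse=True)
    by_id = sorted(students, key=lambda s: s["id"])   # names left off for posting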

Individual student response sheets can be printed so that each student can have a record of the questions missed and/or answered correctly. Much of the same information is available on these sheets as is available on the test score listings. Aware that the instructor may not want correct answers to exam questions made public, we have allowed for any combination of correct and incorrect answers to be printed on these sheets. If multiple course sections are being graded at the same time, the sheets are printed already collated in instructor/course/section order. The default for this option depends upon the standard set-up chosen.

7. Individual student response sheets
a. Printed with correct & incorrect key items
b. Printed with only correct key items
c. Printed with only incorrect key items
d. Not printed
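
Conceptually, option 7 simply selects which graded items appear on a student’s sheet. A minimal sketch, assuming each item has already been marked right or wrong:

    # Sketch: which graded items appear on a student sheet under option 7.
    def items_to_print(graded_items, option):
        """graded_items: list of (item_number, is_correct); option: "a", "b", "c", or "d"."""
        if option == "a":
            return graded_items                             # correct & incorrect
        if option == "b":
            return [i for i in graded_items if i[1]]        # only correct key items
        if option == "c":
            return [i for i in graded_items if not i[1]]    # only incorrect key items
        return []                                           # "d": sheet not printed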

The item analysis section gives a detailed look at each question of the exam. For each item it shows the key answers, the percent of respondents choosing each alternative, the proportion that answered the question correctly, and the proportion of each quartile answering the question correctly. The item analysis can be used as an aid in determining a question’s or an alternative’s effectiveness. If individual student sheets are not printed, this is the only way to get a listing of the key used to score the exam. The item analysis is normally printed.

8. Item analysis
a. Printed
b. Not printed
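
To make the reported quantities concrete, here is a rough sketch (not EXAM’s actual computation) of the per-item figures: the percent choosing each alternative, the proportion correct overall, and the proportion correct within each quartile of the class ranked by scaled score.

    # Sketch: item-analysis figures for a single question.  Illustrative only.
    def item_analysis(responses, key, scaled_scores, choices="ABCDE"):
        """responses: one answer per student; key: set of correct letters;
        scaled_scores: the students' scaled scores, in the same order."""
        n = len(responses)
        pct_each = {c: 100.0 * responses.count(c) / n for c in choices}
        prop_correct = sum(r in key for r in responses) / n
        # Quartiles: rank the class by scaled score, then take the proportion
        # answering correctly within each quarter (top quarter first).
        ranked = sorted(range(n), key=lambda i: scaled_scores[i], reverse=True)
        quarter = max(1, n // 4)
        by_quartile = []
        for q in range(4):
            idx = ranked[q * quarter:(q + 1) * quarter] if q < 3 else ranked[3 * quarter:]
            by_quartile.append(sum(responses[i] in key for i in idx) / len(idx) if idx else 0.0)
        return pct_each, prop_correct, by_quartile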

The scaling method determines how the raw score becomes the scaled score. Currently there are four methods: (a) dividing the raw score by the number of questions, designated “% Correct” on EXAM output (sometimes a misnomer, since this is a true percent correct only if the raw score is simply the number correct); (b) letting the raw score be the scaled score; (c) adjusting the raw scores so that they fit a “standard” distribution with a mean of 50 and a standard deviation of 10 (called a t score distribution); and (d) a t score distribution modified so the mean is 75 (allowing a 90-80-70-60 grading scale to be used). Standard set-ups use the “% Correct” method (since no correction is subtracted).

9. Scaling method
a. raw score as a percentage of number of questions
b. raw score
c. t score distribution (mean =50, std dev = 10)
d. modified t score distribution (mean = 75, std dev = 10)
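
The four methods reduce to simple arithmetic on the raw scores. A sketch, assuming the mean and standard deviation are taken over all sheets with at least one response:

    # Sketch: the four scaling methods of option 9 applied to one raw score.
    from statistics import mean, pstdev

    def scale(raw, all_raw, n_questions, method):
        """all_raw: raw scores of every sheet with at least one response."""
        if method == "a":                       # "% Correct"
            return 100.0 * raw / n_questions
        if method == "b":                       # raw score used as-is
            return raw
        m, sd = mean(all_raw), pstdev(all_raw)  # assumes the scores are not all identical
        t = 50 + 10 * (raw - m) / sd            # t score: mean 50, std dev 10
        return t if method == "c" else t + 25   # "d": modified t score, mean 75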

Grade breakpoints allow EXAM to equate letter grades to the scaled score for printing purposes. The default is to not equate to a letter grade.

10. Grading breakpoints
a. Do not equate score to letter grade
b. A=90-100, B= 80-89, C=70-79, D=60-69, F=0-59
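
With option (b), the letter grade is just a lookup against the scaled score, roughly:

    # Sketch: option 10(b) breakpoints applied to a scaled score.
    def letter_grade(scaled):
        breakpoints = [(90, "A"), (80, "B"), (70, "C"), (60, "D")]
        for cutoff, grade in breakpoints:
            if scaled >= cutoff:
                return grade
        return "F"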

EXAM is set up to handle multiple correct responses for exam items. This does not allow a “choose all of the above that are correct” type of question, but rather gives the instructor the opportunity to let “either A or C” be correct. This makes it unnecessary to throw an entire question out if two alternatives end up being appropriate. It can also be used to give everyone credit for a question (leaving an item blank on the key instead drops the question from the key, i.e. turns a 50-question exam into a 49-question test). The standard is to use only one key sheet.

11. Number of key sheets
a. 1
b. 2
c. 3
d. 4
e. 5
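
In effect, each item’s key becomes the set of letters marked for it across the key sheets, and a response is counted correct if it matches any of them. A minimal sketch, assuming the key sheets have already been read:

    # Sketch: scoring one student against up to five key sheets (option 11).
    def number_correct(student, keys):
        """student: list of responses; keys: one list per key sheet, same length.
        An item left blank ("") on every key sheet is dropped from the key, so it
        neither earns credit here nor counts toward the number of questions."""
        correct = 0
        for i, answer in enumerate(student):
            acceptable = {k[i] for k in keys if k[i]}   # non-blank key entries for item i
            if acceptable and answer in acceptable:
                correct += 1
        return correct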

The GENERAL PURPOSE NCS ANSWER SHEET has responses marked as both letters and numbers. EXAM normally uses letters but allows numbers as an option.

12. Responses
a. Printed as letters
b. Printed as numbers

In order to offset the possibility of guessing correct answers, a correction amount for each wrong response (but not for items left blank) can be subtracted from the number correct to get the raw score. This raw score is then scaled using the selected scaling method. The amount to be subtracted depends on the number of alternatives for each exam question and should be 1/(N-1), where N is the number of alternatives. The same amount will be used for all questions, so the number of alternatives cannot be varied (i.e. mixing four-alternative and five-alternative questions on the same exam). Standard set-ups do not subtract a correction for wrong answers.

13. Correction for guessing
a. Do not subtract correction amount for guessing
b. 1 per wrong answer
c. 1/2 (.50) per wrong answer
d. 1/3 (.33) per wrong answer
e. 1/4 (.25) per wrong answer
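
For example, on a five-alternative exam the correction is 1/(5-1) = 0.25, so a student with 40 correct, 8 wrong, and 2 blank gets a raw score of 40 - (8 × 0.25) = 38. A sketch of the arithmetic:

    # Sketch: raw score with a correction for guessing (option 13).
    def raw_score(n_correct, n_wrong, n_alternatives):
        correction = 1.0 / (n_alternatives - 1)    # e.g. 0.25 for five alternatives
        return n_correct - n_wrong * correction    # items left blank are not penalized

    # Example: raw_score(40, 8, 5) -> 38.0 on a five-alternative exam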