ORIGINAL ARTICLE
Year : 2021  |  Volume : 24  |  Issue : 4  |  Page : 172-177

The use of excel spread sheet and printer in marking medical science objective questions: An alternative to manual marking method


Okolo RU1, Kolawole OV2

1 Department of Anatomical Sciences, Faculty of Basic Medical Sciences, College of Health Sciences, University of Abuja, Abuja, Nigeria
2 Department of Human Physiology, Faculty of Basic Medical Sciences, College of Health Sciences, University of Abuja, Abuja, Nigeria

Date of Submission: 01-Jun-2020
Date of Decision: 29-Aug-2020
Date of Acceptance: 31-Aug-2020
Date of Web Publication: 11-Feb-2022

Correspondence Address:
Dr. Raymond Ugochukwu Okolo
Department of Anatomical Sciences, Faculty of Basic Medical Sciences, College of Health Sciences, University of Abuja, Abuja, Nigeria

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/smj.smj_61_20

  Abstract 


Background: The multiple-choice question (MCQ) is an effective form of written assessment of knowledge acquisition. An optical mark reading (OMR) scanner is commonly used for grading MCQs, but its major setback is the cost of acquiring it. Consequently, several manual methods, such as conference marking and stencil overlay, are used to mark MCQs. This paper highlights the use of a Microsoft Excel spreadsheet and a printer to mark MCQs and compares the method with other manual methods of marking. Materials and Methods: Five sets of 100 stems of objective questions (true or false, one in five, and matching types) were administered to students. The answer script was designed using an Excel spreadsheet. Marking was done in three groups: directly on the question paper, with a stencil overlaid on the answer script, and with the Microsoft Excel soft copy and a printer. At the end of the marking process, the mean marking time, mean collation time, mean counting time, and mean marking error were collated and compared. The InStat computer statistical software was used to calculate means, standard deviations, and ANOVA. P < 0.05 was regarded as statistically significant. Results: The results showed a statistically significantly lower mean marking time of 0.68 h (P < 0.0001) for the spreadsheet/printer marking method. The marking error was significantly lower with this method (P < 0.001) than with the other manual marking methods. The collation time and counting time were not significantly different across the groups. Conclusion: The Excel spreadsheet and printer marking method significantly reduced marking time and marking errors in marking MCQs.

Keywords: Excel spreadsheet, manual marking, multiple-choice questions, printer


How to cite this article:
Okolo RU, Kolawole OV. The use of excel spread sheet and printer in marking medical science objective questions: An alternative to manual marking method. Sahel Med J 2021;24:172-7





  Introduction


Assessments are fundamental in the process of knowledge acquisition.[1] Forms of assessment include (1) oral assessments, where candidates respond orally to a series of questions; (2) practical assessments, which assess skills acquired over a period of training; and (3) written assessments, in which candidates respond in writing to questions. Written assessments could take the form of long essays, short essays, short answers, and objective questions. Objective questions are questions whose answers are predetermined. Such answers could be supplied by the candidate, derived from a given set of choices (multiple-choice question [MCQ], true or false, matching), or arrived at through the identification of objects or positions on graphs, pictures, or maps.[2] Unlike essay questions, whose scoring remains largely subjective despite the use of marking schemes, MCQs have only one correct answer, which makes their scoring objective.[3]

MCQs are a form of objective question. An MCQ comprises a “stem,” usually the question, for which a number of answers or options (four or five) are provided. One of these options best answers the question and is referred to as the key; the other options are referred to as distracters.[4] The “true or false” and “matching” types of objective questions are technically MCQs because, in the “true or false” type, two options, “true” or “false,” are provided to choose from and, in the “matching” type, several options are provided to choose from. A major drawback of the “true or false” type is its inability to discriminate between high- and low-ability candidates.[5]

MCQs are effective assessment tools for testing factual information.[6] However, if well constructed, they can be employed to test all forms of factual knowledge and cognitive skills as effectively as essays.[7],[8],[9]

One of the advantages of MCQs as assessment tools lies in their use to assess large numbers of students. They facilitate the rapid generation of measures of student performance and competence through detailed analysis of the multiple-choice items.[10],[11],[12] This is done with an automated Optical Mark Reading (OMR) scanner, generally referred to as a test scoring machine. The test scoring machine uses specially designed machine-readable answer sheets on which students enter their responses. These are fed into the OMR scanner, which uses specially written software to score and analyze the multiple-choice items. These test scoring machines provide students' total scores, group statistics, the distribution of students' scores, analysis of the items on the test, and the overall structure of the questions and the options (key and distracters).[12]

With the test scoring machine, rapid feedback and analysis of how well the test has discriminated among students can be performed.[12] However, the major drawbacks of these machines are that they are expensive to acquire and require specially designed machine-readable answer sheets as consumables. These may not be within the reach of departments and institutions in resource-limited countries, and so the demands of regular formative assessments, and of the rapid provision of results of both formative and summative assessments, as required in a medical school, may not be realizable.

Nevertheless, departments still adopt the use of MCQs but resort to manual methods of grading students' performance. Such manual methods include dictating the correct responses while a good number of staff engage in the marking of the scripts. Alternatively, the marking scheme is provided on stencils which are overlaid on students' scripts to facilitate marking. A second stencil may be provided to tally students' wrong responses. This is particularly so in the true or false type of MCQ, where a penalty for wrong choices is implemented. This manual process is time-consuming, tiresome, and error laden, thereby marring the objectivity of the MCQ assessment tool.

Ramesh et al.[13] proposed a marking method using a Microsoft Excel spreadsheet for a moderate number of candidates (fewer than 150) and about 25 MCQs. This method involved the use of a special software program. They proposed the use of a computer-based method for more than 700 candidates answering more than 75 stem MCQs. They suggested that the choice of any marking method should be determined by the number of candidates, the number of questions, the duration of the test, and how quickly the result is required. Furthermore, the relative cost of each method of marking should be taken into consideration.

Catalan[14] proposed a marking method using a less sophisticated OMR and a spreadsheet to process the students' answer scripts. This process has a high degree of accuracy and is slower and less expensive than the more sophisticated OMR machines, but it could still be quite expensive for resource-limited units.

The increasing number of students admitted to various science and other related courses demands the use of MCQs for assessment. Using MCQs for such assessments places heavy demands on the lean available resources and workforce if regular student evaluation and feedback are to be undertaken. The aim of this paper is to compare several manual methods of marking MCQs with a method using an Excel spreadsheet and a printer.


  Materials and Methods


This is a retrospective study comparing several manual methods of marking MCQs with the Excel spreadsheet and printer method.

Materials used for the study

The MCQ anatomy assessment answer sheets of the students of the Anatomical Sciences Department, Faculty of Basic Medical Sciences, College of Health Sciences, University of Abuja, were used to compare the traditional manual methods of marking MCQs with the method using a Microsoft Excel spreadsheet and a printer. The MCQ examinations contained 250 items (with the options numbered instead of the stems): 150 of the “true or false” type, 50 of the “1 in 5” type, and 50 of the matching type.

On the question paper, the students were instructed to enter their responses in the appropriate true or false column or, for the “1 in 5” and matching questions, in the appropriate column: a, b, c, d, or e.

The MCQ answer script was designed to be marked using the printer.

Design of the soft copy of the answer script using Microsoft Excel Spreadsheet on the computer

Provision was made for entry of the identity of the examination (department, name of the examination, date, location, and identity of the student). Columns of cells were created for the serial numbers of the MCQs and for candidate entry of options. The number of columns for entry of options depends on the number of options to choose from in the MCQs (vide infra).

Answer script for “True” or “False” type of multiple-choice questions

This design is based on numbering the options of the MCQs instead of the stem.

Two columns of cells for candidates' entry of options were provided. Each cell in the columns for entry of options measured as follows: row height, 27 pixels (20.25 points); column width, 44 pixels (5.57 points).

Six groups of three columns were accommodated on an A4 page. The columns were continuous. Adequate numbers of answer scripts were printed out for the assessment. This design was saved as the MCQ Answer Script.
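For illustration, the layout described above can also be generated programmatically rather than by hand. The following is a minimal sketch in Python using the openpyxl library (an assumption; the authors worked directly in Excel), reproducing the stated cell dimensions and the six continuous groups of three columns (serial number, TRUE, FALSE). The helper name build_tf_answer_script is hypothetical.

```python
# A minimal sketch, assuming openpyxl is available (pip install openpyxl).
# It lays out a "true or false" answer script: six groups of three columns
# (No., TRUE, FALSE), with the cell sizes stated in the text, for items 1-150.
from openpyxl import Workbook
from openpyxl.styles import Border, Side
from openpyxl.utils import get_column_letter

def build_tf_answer_script(n_items=150, groups=6, path="MCQ_Answer_Script.xlsx"):
    wb = Workbook()
    ws = wb.active
    thin = Side(style="thin")
    box = Border(left=thin, right=thin, top=thin, bottom=thin)

    per_group = n_items // groups         # 25 questions per column group
    header_rows = 4                       # space for the exam/student identity
    ws.cell(row=1, column=1, value="Department / Examination / Date / Student ID")

    for g in range(groups):
        first_col = g * 3 + 1             # each group: No., TRUE, FALSE
        for offset, label in enumerate(["No.", "TRUE", "FALSE"]):
            col = first_col + offset
            ws.column_dimensions[get_column_letter(col)].width = 5.57
            ws.cell(row=header_rows, column=col, value=label)
        for i in range(per_group):
            row = header_rows + 1 + i
            ws.row_dimensions[row].height = 20.25   # 27 pixels, per the text
            q = g * per_group + i + 1               # continuous numbering
            ws.cell(row=row, column=first_col, value=q)
            ws.cell(row=row, column=first_col + 1).border = box  # TRUE box
            ws.cell(row=row, column=first_col + 2).border = box  # FALSE box
    wb.save(path)

build_tf_answer_script()
```

Printing enough copies of the saved sheet then yields the physical answer scripts.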

On the question paper, students were instructed to enter their response to every question, against the number of that question, in the appropriate box on the answer script. If the student's response to a particular question (e.g., Question 4) was false, the student entered “X” in the box against the number of that question (Question 4), under the “FALSE” column. The arms of the “X” should, as much as possible, extend diagonally across the box provided [Figure 1].
Figure 1: Part of the answer script. Observe the following: the top part of the answer script contains information about the examination and the student's identity. The arms of the “X” run diagonally across the cell



The answer script had three pages, each on a separate sheet, with a provision for the students to enter their admission number on each page.

Preparation of the marking template

An exact copy of the MCQ answer script was made on another Excel sheet and named the MCQ Marking Sheet. The correct answers (key) were entered in the MCQ Marking Sheet as capital “O,” in font size 16, bold. The upper part of the MCQ Marking Sheet, containing information about the examination and the student's identity, was highlighted and deleted. The serial numbers were highlighted and deleted. All entries on the MCQ Marking Sheet apart from the “O” were highlighted and deleted. The borders of the cells were removed. [Figure 2] depicts the MCQ marking template.
Figure 2: Marking template. Note that only the entered correct responses (key) appear on a plain sheet. All the borders and grid lines have been removed. The entered responses (key), “O,” are in font size 16
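The template preparation can likewise be scripted. Below is a minimal sketch, again assuming openpyxl and the layout of the earlier sketch; the key dictionary is hypothetical illustration data. It writes a bold, font size 16 “O” at each key position on an otherwise blank sheet, mirroring the manual steps just described.

```python
# A minimal sketch of the marking template, assuming the same layout as the
# answer script above: six groups of (No., TRUE, FALSE), 25 questions each.
# The key dictionary passed in at the bottom is hypothetical illustration data.
from openpyxl import Workbook
from openpyxl.styles import Font

def build_tf_marking_template(key, per_group=25, header_rows=4,
                              path="MCQ_Marking_Sheet.xlsx"):
    wb = Workbook()
    ws = wb.active                       # blank sheet: no headers, borders, or numbers
    mark = Font(size=16, bold=True)
    for q, answer in key.items():        # answer is True (TRUE box) or False
        g, i = divmod(q - 1, per_group)  # which column group and row the item sits in
        row = header_rows + 1 + i
        col = g * 3 + (2 if answer else 3)   # group column 2 = TRUE, 3 = FALSE
        cell = ws.cell(row=row, column=col, value="O")
        cell.font = mark
    wb.save(path)

# Hypothetical key: question 1 is true, 2 is false, 3 is true, 4 is false.
build_tf_marking_template({1: True, 2: False, 3: True, 4: False})
```

Printing this sheet onto the students' completed scripts, as described below, overlays the key on their responses.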



Answer script for “1 in 5” and “matching” types of multiple-choice questions

The MCQ answer script for the “1 in 5” and matching types of MCQs followed a similar pattern. Against the serial number of a particular question, five boxes were created, labeled a, b, c, d, and e. The boxes were continuous, i.e., in columns [Figure 3].
Figure 3: Multiple-choice question answer script for the “1 in 5” and matching types of multiple-choice questions. The instructions for entry of responses were similar to those of the “true or false” type, i.e., the entry of a response was with an “X” in the appropriate box. Suppose the response to question “6” is option “c”; “X” is inserted in box “c” against question number “6”



The preparation of the marking scheme or template followed a similar process to that of the “true or false” type [Figure 2].

Marking of the students' multiple-choice question answer script

After the examination, which lasted 3 h, the answer scripts were copied in triplicate.

The original copy was graded, and the results were used for the assessment examination. The three other copies were used for the study. Each set was serially numbered. The students' answer scripts were properly arranged; similar pages were collected and arranged together.

A total of nine good students (who had scored above 65% in previous assessments) were recruited for the study and divided into three groups (A, B, and C) of three each.

Group A was presented with the answer scripts of 50 students and the key to the MCQs. The group proceeded to mark the answer scripts directly: one member of the group dictated the responses while the other two members marked the scripts. The researchers timed the marking process.

Group B was presented with similar materials and, in addition, two blank hard copies of the answer script. This group cut a stencil from one blank answer script such that the boxes for the correct responses were cut open so that, when overlaid on a student's script, the student's correct responses would appear. On the second blank answer script, the boxes for the wrong responses were cut open so that, when overlaid on a student's answer script, the wrong responses made by the student to each question would appear. This was done only for the “true or false” section of the MCQs and was used to apply the penalty in that section. The marking was then carried out by overlaying the stencil on each student's answer script and ticking the correct responses. The wrong responses were also ticked, and the scores were recorded.

Group C was presented with similar materials as Group A and, in addition, a computer with the soft copy of the answer script and a printer (HP 1005; Serial No. VNCAL 61406; Date of Manufacture: February 2008; Product No. CB410A), with a print speed of 15 pages per minute.

This group entered the MCQ key in the appropriate boxes on the Microsoft Excel soft copy of the answer script and prepared the Marking Template as described above [Figure 2].

The students' answer scripts were then fed into the printer, and the marking scheme was printed on them. The printer printed “O” in the boxes for the correct responses to every question [Figure 4].
Figure 4: Student's script with the printed key/marking scheme. The candidate scores points where the “O” and “X” coincide. Observe that the responses to questions 4, 7, 33, etc., in the “true or false” section do not coincide with the “O”; these are consequently wrong responses and should attract a penalty, if applicable



Collation

At the points where the “O” printed on the student's response “X,” the candidate scored the requisite point. The candidate was penalized at the points where the candidate's response “X” fell outside the correct response “O” (“true or false” section).

After the printer had printed the correct responses (key) on the students' answer scripts, the correct and wrong responses were counted, the requisite credits and penalties were applied, and the scores for the test were recorded. The process was timed.

Each group collated its marking by counting the total number of correct responses. The wrong responses in the “true or false” section were also counted, and the penalty was deducted from the total score of correct responses. This collation process was also timed.
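As a worked illustration of the collation arithmetic, the sketch below computes a candidate's total under the assumption (hypothetical, since the paper does not state the penalty weight) of one mark per correct response and a configurable deduction per wrong response in the “true or false” section.

```python
# A minimal sketch of the collation arithmetic described above. The penalty
# weight and the example counts are hypothetical; the paper does not give them.
def collate_score(correct_total, wrong_tf, penalty_per_wrong=1.0):
    """Total = correct responses minus the penalty applied to wrong
    responses in the 'true or false' section only."""
    return correct_total - penalty_per_wrong * wrong_tf

# Example: 180 correct responses overall, 20 wrong in the true-or-false section.
print(collate_score(180, 20))   # -> 160.0
```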

Five sets of mock professional MCQs were treated similarly. The groups were also rotated so that each group had the opportunity to use a different method of marking. The average times for each group were recorded.

All the groups carried out the exercise in a comfortable air-conditioned room, with adjoining conveniences and sufficient refreshments.

At the end of marking and collation, the marks were entered into a grade sheet and compared across the groups. Any discrepancies were verified, and the affected scripts were retrieved to trace the errors.

Two sources of errors were identified: marking errors and counting errors. The total marking errors and counting errors were recorded.

Statistical analysis

All data collected were statistically analyzed. The InStat computer statistical software (Informer Technologies, Inc.; www.informer.com) was used to calculate means ± standard deviations and one-way ANOVA for significance. P < 0.05 was regarded as significant.
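The same comparison can be reproduced with standard tools. The sketch below uses Python with scipy (an assumption; the authors used InStat) to run a one-way ANOVA on per-trial marking times for the three methods; the sample values are hypothetical placeholders, not the study's data.

```python
# A minimal sketch of the analysis, assuming scipy is available.
# The per-trial marking times (hours) below are hypothetical placeholders.
from statistics import mean, stdev
from scipy import stats

direct  = [6.2, 6.9, 6.8, 6.6, 6.7]   # direct marking, one value per trial
stencil = [4.3, 4.6, 4.7, 4.4, 4.6]   # stencil overlay marking
printer = [0.6, 0.7, 0.7, 0.6, 0.8]   # Excel spreadsheet/printer marking

for name, grp in [("direct", direct), ("stencil", stencil), ("printer", printer)]:
    print(f"{name}: {mean(grp):.2f} ± {stdev(grp):.2f} h")

f_stat, p_value = stats.f_oneway(direct, stencil, printer)
print(f"one-way ANOVA: F = {f_stat:.1f}, P = {p_value:.2g}")  # significant if P < 0.05
```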


  Results


The results of the study are shown in [Table 1] below. The mean marking time for the spreadsheet/printer method was 0.68 ± 0.3 h, significant at P = 0.0001, with zero marking error, which was highly significant (P = 0.001) when compared with the other manual marking methods in this study.
Table 1: The results of comparing two manual methods of marking multiple-choice questions with the method using a Microsoft Excel spreadsheet and a printer




  Discussion


In this study, a mean marking time of 0.68 ± 0.3 h was used to mark 50 MCQ scripts with the printer and Microsoft Excel spreadsheet. This included the time spent on preparing the soft copy of the marking template. When compared with the mean marking times for direct marking (6.64 ± 0.4 h) and stencil marking (4.52 ± 0.3 h), the mean marking time with the printer and Excel spreadsheet was significantly smaller (P = 0.0001). The mean marking time could be further reduced with faster printers.

The mean collation time and mean counting time were not significantly different among the marking methods. However, collation was easy with the printer-marked scripts and could be carried out by any member of staff other than the lecturer, further lessening the lecturer's workload.

However, the mean marking error was significantly lowest (P = 0.001) for the printer/Microsoft Excel spreadsheet marking method. Every marked script had the marking scheme printed on it, making it easy to verify any script for errors, unlike what obtains with the direct and stencil marking methods. The significant marking error of the direct and stencil marking methods is of great concern, since such errors are usually very difficult to verify because, quite often, the marking template is not readily available. The printer marking method, with its negligible marking error, becomes a tool that should replace the other manual marking methods.

But for the comparison of the results of marked scripts across the groups, the significant marking errors with the direct and stencil marking methods could not have been discovered. The only time such discoveries are made would be when candidates raise issues related to their scores in an assessment. Even in such circumstances, verification is laborious because it entails re-marking the entire script. The amount of undetected and unverified error committed with the direct and stencil manual marking methods constitutes an enormous concern for the objectivity of MCQs. In the case of printer-marked scripts, the marking scheme is printed on the answer script, and verification is by mere inspection. Quite often, if errors occur, they are due to the counting rather than the marking process.

This method does not require the use of any special software program or consumables, compared with the methods proposed by Catalan[14] and Ramesh et al.,[13] and could be adopted for the assessment of 300 candidates and above, provided there is enough workforce to do the collation.

In terms of the cost of materials, the direct marking method does not require the printing of answer sheets for each student separate from the question paper; however, the correct keys to the questions must be printed. In the stencil and Excel methods, answer sheets for each student must be provided. The cost of producing the answer sheets, in terms of paper, ink, and storage space for the extra papers, is negligible when compared with the advantages of zero marking error and the time saved.

During printing, alignment errors could occur, arising from the poor arrangement of the papers when they are fed into the printer or from the picking of the paper by the printer for marking. When this occurs, there will be misalignment of the printed answers and consequent overlapping of the printed correct responses. In such circumstances, the affected answer sheet(s) should be retrieved, the correct response symbol changed from “O” to “*” (asterisk) or any other symbol to differentiate it from the symbol used for marking, and the sheets fed back into the printer for re-marking.

Since the printer marking method is less laborious, many more formative MCQ assessments could be carried out during the period of study. This will motivate students' learning, encourage students to set higher learning goals, and provide good feedback and program planning to drive learning. The end result will be the building of competency.


  Conclusion


Marking MCQ assessments with a printer and Microsoft Excel spreadsheet provides a veritable, time-saving alternative to the direct and stencil manual marking methods for departments in resource-scarce areas that cannot afford the OMR machine. It is convenient and less laborious, with minimal errors, and is easily verifiable for errors when compared with the direct and stencil marking methods. In institutions where the OMR is not available, or where it develops a fault (technical, mechanical, or depletion of consumables), as it usually does, the method of using a printer and an Excel spreadsheet becomes very handy.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.



 
  References

1. Wiliam D. What is assessment for learning? Stud Educ Eval 2011;37:3-14.
2. McKenna C, Bull J. Designing Effective Objective Test Questions: An Introductory Workshop. Loughborough: Computer Assisted Assessment Centre, Loughborough University; 1999. p. 1-15.
3. Bush M. Alternative Marking Schemes for Online Multiple Choice Tests; 1999. p. 1-4. Available from: http://caacentre.lboro.ac.uk/dldocs/BUSHMARK.pdf. [Last accessed on 2010 Sep 05].
4. Clegg VL, Cashin WE. Improving Multiple-Choice Tests. Idea Paper No. 16. Manhattan, KS: Center for Faculty Evaluation and Development, Kansas State University; 1986. p. 1-3.
5. McCoubrie P. Improving the fairness of multiple-choice questions: A literature review. Med Teach 2004;26:709-12.
6. Wood EJ. What are extended matching sets questions? BEE-j 2003;1:1-2.
7. Fuhrman M. Developing good multiple-choice tests and test questions. J Geosci Educ 1996;44:379-84.
8. Carneson J, Delpierre G, Masters K. Designing and Managing Multiple Choice Questions. Australia: Southrock Corporation Ltd; 2003. p. 1-7.
9. Harper R. Multiple choice questions – A reprieve. BEE-j 2003;2:2-6.
10. Ory JC, Ryan KE. Tips for Improving Testing and Grading. Vol. 4. Newbury Park: Sage Publications; 1993. p. 141.
11. Edward M. Multiple Choice Questions: Their Value as an Assessment Tool. Vol. 14, No. 6. Philadelphia, PA: Lippincott Williams & Wilkins, Inc.; 2001. p. 661-6.
12. Hammond EJ, McIndoe AK, Sansome AJ, Spargo PM. Multiple-choice examinations: Adopting an evidence-based approach to exam technique. Anaesthesia 1998;53:1105-8.
13. Ramesh S, Manjit Sidhu S, Watugala GK. Exploring the potential of multiple choice questions in computer-based assessment of student learning. Malays Online J Instruct Technol 2005;2:1-15.
14. Catalan JA. A framework for automated multiple-choice exam scoring with digital image and assorted processing using readily available software. In: DLSU Research Congress 2017. Manila: De La Salle University; 2017. p. 1-5.

