Wednesday, October 21, 2009

Online Assessments

Introduction:
Today most learning content management systems (LCMSs) offer tools for online assessments. The assessments included in LCMS software are typically various forms of multiple-choice (MCQ) quizzes. These are marked by the software application, which minimises issues such as assessor fatigue and bias, thereby strengthening the assessment principles of fairness, reliability, validity and practicality (McMillan 2001:20; SAQA 2001:16-19).

Some quiz tools also offer essay questions; these are marked by a human facilitator (e.g. in Moodle and Blackboard).

The example of an LCMS assessment below shows different types of questions: an MCQ question with more than one correct choice (square check boxes) and a question with only one correct selection (radio buttons). The example is from the Basic Anaesthesiology module in the medical curriculum at the University of the Free State.


In the question with multiple selections, students must select all the correct answers to receive the full mark; simply ticking every box awards zero marks. The second question (radio buttons) has only one correct choice. Various forms of feedback are possible with these quizzes, and as seen here the feedback strategy is one of formative, constructive reference to the study guide.

The online quizzes in the LCM system are easily set, and students are positive about these assessments. A further benefit is that, depending on the size of the question bank, assessments may be randomised for each student and across different assessments. One can also use the question bank to set formative assessments in preparation for summative assessments drawn from the same bank; this affords greater fairness and prepares students for the same structure in both formative and summative assessments. Reliability of LCMS assessments is very high, since marking is done by the software.
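The two mechanisms described above (all-or-nothing marking of multi-select questions, and randomised drawing of questions from a bank) can be sketched in a few lines. This is a minimal illustration, not the actual marking logic of any particular LCMS; the question bank, field names and one-mark-per-question scoring are all assumptions.

```python
import random

# Hypothetical question bank: each entry lists the option labels and the
# set of correct options. Structure and contents are illustrative only.
QUESTION_BANK = [
    {"id": "q1", "options": ["A", "B", "C", "D"], "correct": {"A", "C"}},
    {"id": "q2", "options": ["A", "B", "C", "D"], "correct": {"B"}},
    {"id": "q3", "options": ["A", "B", "C", "D"], "correct": {"A", "B", "D"}},
    {"id": "q4", "options": ["A", "B", "C", "D"], "correct": {"D"}},
]

def draw_quiz(bank, n, seed=None):
    """Randomise a quiz per student by sampling n questions from the bank."""
    rng = random.Random(seed)
    return rng.sample(bank, n)

def mark_multi_select(question, selected):
    """All-or-nothing marking: the full mark is awarded only when the
    selected set matches the correct set exactly, so ticking every box
    (or any other wrong combination) scores zero."""
    return 1 if set(selected) == question["correct"] else 0
```

Seeding the random draw per student would let an assessor reproduce any individual's quiz later, which matters when results are queried.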

A negative of LCMS-based assessments is their demand on computer resources. The School of Medicine at the UFS is fortunate to have a computer room with 70 computers dedicated to the School's programmes, where online assessments may be administered. This leads to greater practicality of LCMS assessments in the School.

Other possibilities for online assessments exist and may be more innovative than LCMS-based online assessments, but they usually require more skill and time from the facilitators.

At the School of Medicine, UFS, the module on general skills uses an electronic portfolio to assess general skills development in first-year students. Skills assessment through a portfolio has definite challenges; time and feedback are two of the biggest (Driessen, Overeem, van Tartwijk, van der Vleuten & Muijtjens).

The module on general skills overcomes these challenges by scoring a portion of the portfolio indirectly. Students are required to complete assignments in modules other than the skills development module. The assessment criteria for these assignments include a rubric for the general skills associated with the specific assignment, and the marks obtained in the assignments are included in the portfolio scoring system. Students must also maintain their portfolio actively, and a rubric is used to score this maintenance. The rubrics used to score the portfolio ensure fairness and reliability, while assignments on authentic tasks ensure validity of the assessments. The decentralisation of assignment marking keeps the portfolio practical as an assessment tool in the skills development module.
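The indirect scoring described above amounts to combining rubric marks gathered in other modules with a rubric score for portfolio maintenance. The sketch below shows one plausible way to weight these components; the weights, function name and percentage scale are assumptions, not the module's actual scheme.

```python
# Hypothetical weights for combining the two portfolio components:
# rubric marks from assignments in other modules, and a rubric score
# for how actively the student maintained the portfolio.
ASSIGNMENT_WEIGHT = 0.7
MAINTENANCE_WEIGHT = 0.3

def portfolio_score(assignment_marks, maintenance_mark):
    """Combine decentralised assignment rubric marks (percentages) with
    the portfolio-maintenance rubric mark (a percentage) into a single
    weighted portfolio score."""
    if not assignment_marks:
        # No assignment marks yet: only the maintenance component counts.
        return MAINTENANCE_WEIGHT * maintenance_mark
    avg_assignments = sum(assignment_marks) / len(assignment_marks)
    return (ASSIGNMENT_WEIGHT * avg_assignments
            + MAINTENANCE_WEIGHT * maintenance_mark)
```

Because each assignment is already marked in its own module, the portfolio facilitator only records the totals, which is what keeps the workload manageable.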

The facilitator’s view of the 2009 portfolio may be seen by following this link.