
UPDATED | New result format for PEP students

Published: Wednesday | June 12, 2019 | 12:00 AM | Paul Clarke/Gleaner Writer
(Photo: Dr Grace McLean)

A week before the release of results from the first sitting of the Primary Exit Profile (PEP) examinations, several education ministry officials are struggling to understand and explain the new format in which the students’ performances are being assessed and presented.

At a press conference yesterday, Karl Samuda, the minister overseeing the education portfolio since Ruel Reid’s departure, said students will be provided with a detailed report on their performance with a scaled score, a shift from the percentage-based system used in PEP’s predecessor, the Grade Six Achievement Test (GSAT).

“We have moved away from reporting raw scores. It will allow for a more accurate reporting of a student’s performance. Similar to CSEC (Caribbean Secondary Education Certificate) and CAPE (Caribbean Advanced Proficiency Examinations), you will not see the percentages. Instead, parents will receive a detailed four-page report which will clearly outline your [child’s] performance,” said Samuda.

However, CSEC and CAPE have well-established grading protocols – with I to III in CSEC and I to V in CAPE as metrics of proficiency.

Samuda said that the shift in reporting test results is in keeping with regional and international best practices, and that the new reporting system will help schools and teachers craft educational plans designed specifically for students.

“We are confident that this is a better system which will benefit our students,” Samuda said.

However, what is not clear is how the scale scoring works.

As The Gleaner sought clarification on the new system, several senior ministry officials could shed no light on the matter, saying they themselves did not fully understand it.

Stacy Witter, acting senior education officer in the ministry’s Test Development Section, sought to explain.

“What this is,” she noted, “especially when you have various subjects testing, for example, all the total marks on each subject are different. Because of this, we say it sits on different scales.

“So if you can imagine 80 per cent for a test that’s out of a hundred is different than one that’s out of 80. So for an overall picture and for comparison, it’s best to use scale scores. Ultimately, all it means is that we are converting the raw scores to sit on the same scale and it is this that gets you the scale score,” she said.

She said this method provides more consistency and better accuracy in facilitating a direct comparison of students’ performance across subject areas.
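The ministry has not published its actual conversion formula, but the idea Witter describes, mapping raw scores from tests with different maximum marks onto one common scale, can be sketched with a simple linear transformation. The scale range of 0 to 500 below is purely illustrative, not the ministry's scale:

```python
def to_scale_score(raw: float, max_marks: float,
                   scale_min: float = 0.0, scale_max: float = 500.0) -> float:
    """Map a raw score out of `max_marks` onto a common [scale_min, scale_max] scale.

    This is a hypothetical illustration of the principle only; the actual
    PEP scaling method has not been published.
    """
    return scale_min + (raw / max_marks) * (scale_max - scale_min)

# A score of 80 out of 100 and a score of 64 out of 80 represent the same
# proportion of the available marks, so both land on the same scale point,
# making performance directly comparable across differently marked subjects.
print(to_scale_score(80, 100))  # 400.0
print(to_scale_score(64, 80))   # 400.0
```

On a common scale, an 80-mark paper and a 100-mark paper no longer produce incomparable numbers, which is the comparison problem Witter's example highlights.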

Delayed explanation

However, despite the explanation, the ministry will not give a detailed outline of the scale until two weeks after next week’s release of the results. 

Permanent Secretary Dr Grace McLean said it is important that parents, teachers and the wider society understand the marking process, assignment of scores, and how they affect placement.

“We should note that data capture and scoring of all selected responses are done by computers. The performance tasks are marked by specially selected and trained teachers,” said McLean.

She noted, too, that as a safeguard, the marking process is anonymous and is done in a sterile environment at a central location.

“I want also to emphasise that there is no change to the approach to be used for the placement of students who sat the PEP this year as against the GSAT of previous years,” noted McLean.

A total of 41,617 grade six students sat the test. Some 1,219 chose not to sit the test for various reasons, Samuda said.


(EDITOR'S NOTE: A previous version of this article stated that, without a detailed outline of the scale, parents might not be able to readily tell the proficiency of their children. However, parents should be able to determine some level of proficiency from the results as published next week.)