Annual reports on CXC: More facts to consider

Published: Sunday | June 9, 2013 | 12:00 AM
In this June 2012 photograph, Teacher of the Year Dageanna Spencer-Hull, from Holland High in Trelawny, is caught in action as she teaches a sixth-form social studies class. Spencer-Hull is credited with having 100 per cent passes annually for students she sends up for CSEC and CAPE in social studies, religious education and sociology. - PHOTO BY Barrington Flemming
Tripp Johnson, GUEST COLUMNIST

On June 2, 2013, The Gleaner published a response by Dr Alfred Sangster to the annual CXC study conducted by Johnson Survey Research (JSR) in partnership with The Gleaner. The column offered a myriad of critical comments and suggestions, many quite valid and deserving of a response.

Dr Sangster's critical points are summarised as follows:

The education system cannot be judged on the performance of high schools alone.

The top 10 rankings obscure the excellence to be found in all categories.

English and maths are inadequate for the purposes of judging an educational system - there are many other curricular and extra-curricular dimensions to account for.

Higher-performing schools for CXC typically enrol students with higher GSAT scores.

Extenuating social circumstances in and around the life of students, such as violence and financial insecurity, impinge on a student's ability to perform.

'Upgraded high schools' are still relatively new, and face administrative/academic challenges unknown to 'traditional high schools'.

In addition, Dr Sangster, president emeritus of the University of Technology, Jamaica, reiterated an argument that has often been expressed in the annually published report on CXC results - that in the interest of comprehensiveness and utility, the focal point of this study should be expanded beyond a singular analysis of the CXCs to encompass a comparative examination of the respective GSAT scores for each cohort.

The clarity and understanding offered by that comparative analysis would be significant. We could determine how much 'value added' each school supplies to its students, as well as gain an understanding (albeit relative) of how much aid - financial or otherwise - these schools will require if they are to offer their students the educational and academic opportunities enjoyed by those attending the most prestigious secondary schools.

That being said, as a researcher, my ability to research a given topic is severely circumscribed by the availability of relevant data. In this case, such data would come from the Ministry of Education. Indeed, those data have been requested since the inception of these studies but, unfortunately, have not been forthcoming. Therefore, one must work with what is accessible.

RATIONALE

The grade of F- was provisionally given to the educational system as a whole "on the basis of the CXC scores" for 2012. Secondary schools, while no doubt being but one link in the chain of education, contain the products of basic and primary schools. They also serve as the point of convergence between past educational history/academic performance, and either future education at a tertiary institution or employment in the workforce.

Absent the eventual acquisition of GSAT scores, the use of secondary schools for this study is the best option available. It should also be noted that, beginning with the CXC results from 2011, Johnson Survey Research and The Gleaner have begun assessing CXC scores for information technology, biology, chemistry, human and social biology, integrated science, and physics. Of course, such subjects have nowhere near the enrolment rates of foundational subjects such as English and mathematics and, as such, paint a picture in a different hue.

After all, the goal of our study is to depict the likely distribution of secondary-school graduates in Jamaica after they finish high school. There is burgeoning evidence that demonstrates not just the functionality of CXC results in this matter, but also the perception of this functionality among students.

Remember, it is unlikely that a potential employer will have either the inclination, or the luxury of the time, to consider the plethora of extenuating circumstances that may or may not have crystallised around a student's poor performance in an academic setting. Many employers require between five and 10 (or more) CXC passes in order to even consider an applicant for an interview, let alone a job.

Bearing this in mind, it really can't be described as surprising that some individuals will go so far as to purchase forged CXC certificates. This, of course, is not to condone the fraud in any way. It is obvious that such behaviour both undermines and diminishes the effort put forth by students who did actually pass on their own merits.

Rather, the purchase of forged CXC certificates expresses the high symbolic value that CXC passes hold as a (quasi-)currency. Coupling that symbolic value with the aforementioned material value CXC passes carry with respect to employment makes clear the importance of these examinations.

Other potentially relevant dimensions of a school, such as sports, location, social activity, etc, are unfortunately beyond the scope of this study and, in some cases, simply unquantifiable.

As Dr Sangster eloquently put it, 'Upgraded' schools face a multifold "Herculean task". However, future employers tend to turn a blind eye towards the travails of applicants, no matter how legitimate such justifications may be. In this way, but also with respect to the purpose and role of selectivity embodied in GSAT scores, the schism between students of (especially high-performing) 'traditional high schools' and their counterparts in the rest of the student corpus continues to grow.

PURSUIT OF PERFECTION

It was with such a process in mind that our currently deployed measure of 'quality' was developed. The intent was never to condemn schools or disempower students faced with extraordinary challenges. Rather, it was to depict trends in CXC results so that we may more fully understand both how we arrived at this place and what could be done to progress to a better one.

Such an ethos necessarily motivates a continuously evolving report. In his column, Dr Sangster proposed the use of a quartile grading system of A/B/C/Ungraded, which would navigate the differentiated challenges faced by some students and schools. It would also laud those students and institutions who received good CXC scores. His proposal was to take the quality score of the students in a given school and divide it by what would constitute a perfect score (4,000).

Those schools with a quality-score rank of 60 per cent or higher would receive a grade of 'A', those with a quality-score rank from 30 per cent to 59 per cent would receive a 'B', and those with a quality-score rank from 15 per cent to 29 per cent would receive a 'C'. Those schools with a quality-score rank of less than 15 per cent would be marked 'Ungraded'.
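To make the arithmetic concrete, here is a minimal sketch in Python. The perfect score of 4,000 and the percentage bands are taken from Dr Sangster's proposal as described above; the function name and the sample score are illustrative only.

```python
def proposed_grade(quality_score, perfect_score=4000):
    """Map a school's CXC quality score to the proposed A/B/C/Ungraded bands.

    The score is first expressed as a percentage of a perfect score of 4,000,
    then bucketed: 60%+ -> 'A', 30-59% -> 'B', 15-29% -> 'C', below 15% -> 'Ungraded'.
    """
    pct = 100.0 * quality_score / perfect_score
    if pct >= 60:
        return "A"
    if pct >= 30:
        return "B"
    if pct >= 15:
        return "C"
    return "Ungraded"


# Illustrative only: a hypothetical school scoring 1,400 of a possible 4,000 (35%) earns a 'B'.
print(proposed_grade(1400))
```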

This is a good idea; however, it is currently unclear how those numerical spans were determined. A modification of such a method would be to devise a separate quartile ranking system for each type of school (traditional, upgraded, or technical) on the basis of the previous three years of CXC quality scores, and to then utilise those ranges for determining a grade of A/B/C/UG.

This would compensate for some of the stark contrasts between conditions in traditional, upgraded, and technical schools, as emphasised by Dr Sangster. In addition to this, and in keeping with one of the initial objectives of this report, an aggregated quartile system derived from all schools might be developed.
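As a rough sketch of that modification (the scores and cut-offs below are hypothetical; only the idea of deriving separate bands per school type from the previous three years of quality scores comes from the text above), the grade boundaries could be computed from each type's own historical distribution:

```python
from statistics import quantiles

def type_bands(historical_scores):
    """Derive grade cut-offs for one school type (traditional, upgraded or technical)
    from that type's quality scores over the previous three years.

    Returns the quartile cut points (Q1, Q2, Q3); a current score at or above Q3
    grades 'A', between Q2 and Q3 'B', between Q1 and Q2 'C', and below Q1 'Ungraded'.
    """
    q1, q2, q3 = quantiles(historical_scores, n=4)
    return q1, q2, q3

def grade_against_type(score, bands):
    q1, q2, q3 = bands
    if score >= q3:
        return "A"
    if score >= q2:
        return "B"
    if score >= q1:
        return "C"
    return "Ungraded"

# Hypothetical three-year quality scores for one school type (e.g. upgraded schools).
upgraded_history = [420, 510, 575, 630, 700, 760, 820, 900, 980, 1100, 1250, 1400]
bands = type_bands(upgraded_history)
print(grade_against_type(850, bands))
```

Running the same calculation over all schools together would give the aggregated picture mentioned above.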

The methodology utilised in The Gleaner/JSR annual report of CXC results is constantly evolving, and Dr Sangster's lucid comments are certainly welcome; they will receive the serious consideration they deserve.

Tripp Johnson is director of research at Johnson Survey Research. Email feedback to columns@gleanerjm.com and WTG.Tripp.Johnson@gmail.com.