CXC urges parents, teachers to guide ethical AI use as students embrace tech
Dr Eduardo Ali, pro-registrar and deputy CEO of the Caribbean Examinations Council (CXC), is urging stronger involvement from parents and teachers as students report increased use of artificial intelligence (AI) to support their studies.
AI is a branch of computer science that enables machines to simulate human cognitive functions such as learning, reasoning, problem-solving, perception, and decision-making.
CXC currently permits the use of AI in School-Based Assessments (SBAs) for its external examinations under a new policy framework that emphasises ethical use, proper referencing, and academic integrity. The framework allows AI to be used for ideation and enhancement, but not for generating full submissions. Strict penalties are outlined for misuse or cheating. The policy applies to SBAs, projects, and assignments and is designed to help students leverage technology while preserving original thought.
The policy governs assessments under the Caribbean Secondary Education Certificate (CSEC), Caribbean Advanced Proficiency Examination (CAPE), and the Caribbean Certificate of Secondary Level Competence (CCSLC).
SBAs form a flexible, continuous, school-administered evaluation process in which teachers assess students throughout the learning cycle rather than relying solely on final examinations.
Ali made the remarks during Thursday’s panel discussion at the launch of the 2025 Teens and Technology Report, a collaborative initiative between the University of Technology, Jamaica (UTech) and the Broadcasting Commission of Jamaica.
While acknowledging the growing importance of AI, Ali emphasised the need for stronger collaboration among key stakeholders.
“We are not looking at AI in terms of compulsory for the education system because we recognise that while it is being used, it will be used by some and not by all, for example, in the pursuit of their assessments,” he said.
FACILITATE UNDERSTANDING
“Our general guidelines for SBAs are primarily what we have adopted. But in the event that students use AI once, we are saying parents need to primarily work with the schools, primarily teachers, and that dialogue between parents and teachers needs to happen directly through the PTAs. It needs to happen to facilitate an understanding of how it is to be used. There is a critical need for teachers to be trained because they, too, are the ones who are going to be using it, and the self-declaration of using AI needs to be managed in that space,” he added.
The policy promotes the concept of “responsible AI”, ensuring that students remain the primary creators of their work and use AI strictly as a supporting tool rather than allowing it to determine the final outcome.
Meanwhile, Paul Golding, lead researcher and professor of information systems at UTech, presented the study’s key findings, noting that for the first time, the research included blind and deaf students. The data revealed that only six per cent of non-disabled teens reported not using AI, compared with 30 per cent of blind teens and 20 per cent of deaf teens.
In line with the national trend, ChatGPT emerged as the most widely used platform. The study found that 69 per cent of non-disabled teens reported using ChatGPT once or more per day, compared with 20 per cent of blind teens and 54 per cent of deaf teens.
Golding added that most teens believe AI does not impact their creativity or critical thinking – an outlook that contradicts existing research, which suggests that AI use can negatively affect both creativity and critical thinking skills.