Master CXC Exams TT

CXC's AI Rules for SBAs

Everything you need to know for the 2026 May-June Examinations

Updated March 13, 2026 · Includes CXC's March 10 Memo
What's In This Guide
1. What Changed and Why It Matters
2. The 4 AI Scale Levels (What's Allowed)
3. The 20% That Isn't a Standard
4. How to Cite AI in Your SBA
5. Penalties for AI Misuse
6. How CXC Checks Your Work
7. What CXC's March 2026 Memo Clarified
8. Approved Tools and Quick Answers
9. What Parents Need to Know
AI Scale Levels: 4 (CXC limits SBAs to Level 3)
Max Penalty: 0 marks (for full AI submission)
Source Documents: 4 (consolidated in this guide)

This guide consolidates: Standards and Guidelines for the Use of AI in Assessments (April 2025), Responsible Generative AI Policy Framework (November 2024), FAQ for Standards and Guidelines (2025), and Memo: Managing AI Checkers in SBAs (March 10, 2026).

1
What Changed and Why It Matters

Before 2025, CXC had no published rules governing how students could use AI in their SBAs. No scale levels, no penalties, no formal detection process. That changed in three steps.

November 2024
Responsible Generative AI Policy Framework
CXC approved the overarching policy framework for AI use across the regional secondary education system.
April 2025
Standards and Guidelines for the Use of AI in Assessments
The governing document. Specific rules, scale levels, penalties, and compliance requirements for SBAs.
March 2026
Memo: Managing AI Checkers in SBAs
CXC clarified how AI detection will actually work in practice during this first year of implementation.
Caribbean AI Policy Readiness
CXC survey of 16 Caribbean Ministries of Education
  • No AI Policy: 69% (11 of 16 Ministries)
  • Still Developing: 25% (4 of 16 Ministries)
  • Completed Policy: 0% (0 of 16 Ministries)

Most schools in the region are still catching up to rules that are already in effect. These rules apply whether or not your school has communicated them yet.


2
The 4 AI Scale Levels (What's Allowed)

CXC classifies AI use into four levels. Understanding these is key to staying compliant.

"At present, CXC limits the use of AI to Level 3."
— Standards and Guidelines, p. 6

Level 1 — No AI (Exams Only)
No AI used at any point. Applies to all written exams (Paper 1, Paper 2).
Level 2 — AI-Assisted Idea Generation (Allowed)
AI for brainstorming and planning only. No AI content in the final submission. All prompts must be documented in the Appendix.
Level 3 — AI-Assisted Editing (Current Limit)
AI can improve the clarity of work you already created, and can generate visual content (videos, images) for presentation. AI cannot create new written content. All use must be cited.
Level 4 — AI Task Completion (Not Allowed)
AI completes parts of the task and you provide commentary. CXC does not currently allow this level.
What Each Level Allows
Activities permitted at each AI Scale Level
The Line Between Level 3 and Level 4

This is the most common area of confusion. The key question: Did AI create any written content that ended up in your final submission? If yes, that is Level 4, regardless of how much you edited it afterwards.

✓ Level 3 — Safe
  • You wrote your analysis. AI helped you clean up grammar.
  • You created your findings. AI helped make the presentation clearer.
  • You wrote your discussion. AI suggested edits to improve clarity.
✗ Level 4 — Violation
  • AI wrote part of your analysis. You reworded it.
  • AI generated findings that you edited and submitted as yours.
  • You prompted AI to write a section and then adjusted it.
Quick Check: Where Does Your SBA Fall?
One question determines your compliance level.
Did AI generate any written content that ended up in your final submission?
  • NO → Did you use AI at all (brainstorming, grammar cleanup, generating images)?
      • YES → Level 2 or 3. Cite it. Document prompts. Submit originality report.
      • NO → Level 1. No Disclosure Form or Originality Report needed.
  • YES → Level 4 — VIOLATION. 50% deduction or 0 marks. It does not matter how much you edited it afterwards.
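The quick check above boils down to two yes/no questions, which can be sketched as a small decision function. This is an illustrative sketch only: the Scale Level names are CXC's, but the function itself and its return strings are not an official tool.

```python
# Illustrative sketch of this guide's quick-check decision flow.
# The Scale Level names follow CXC's AI Scale Levels; the function is hypothetical.
def quick_check(ai_wrote_final_content: bool, used_ai_at_all: bool) -> str:
    if ai_wrote_final_content:
        # Any AI-written content in the final submission is Level 4,
        # no matter how heavily it was edited afterwards.
        return "Level 4: violation (50% deduction or 0 marks)"
    if used_ai_at_all:
        # Brainstorming, grammar cleanup, image generation, etc.
        return "Level 2 or 3: cite it, document prompts, submit originality report"
    return "Level 1: no Disclosure Form or Originality Report needed"
```

For example, a student who used AI only to clean up grammar would call `quick_check(False, True)` and land in the Level 2 or 3 path: cite it and submit the paperwork.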

3
The 20% That Isn't a Standard

If you have heard that your SBA is safe as long as it scores below 20% on an AI detection tool, that is not what CXC's own documents say. The 20% does appear in CXC's FAQ. It is a real number in a real CXC document. But the way students and teachers are repeating it does not reflect what the FAQ actually says.

What you need to know: The 20% is not what protects your SBA. The compliance process is. Declare your AI use, cite it, document your prompts, run an originality checker, and submit the report. That process comes from CXC's governing document and was confirmed by the March 2026 memo. Follow the process in Section 4 of this guide and you are covered. Everything below explains why.

What the FAQ Actually Says About the 20%

Two questions in the FAQ mention the 20%. The framing is not what most people think, and it matters for how you prepare your SBA.

FAQ Q17: "What is the acceptable AI similarity score?"
"The acceptable percentage similarity is set at 20%."
FAQ Q19: "Is there an acceptable variance on the AI similarity score?"
"Yes, however if a candidate's similarity score is above 20% the teacher should provide a rationale in the AI disclosure form...note however that this does not guarantee that the candidate will not be penalised."
What this means.

The FAQ does not say "under 20% means you are safe." Based on the language in Q19, above 20% triggers a justification requirement from the teacher, and even with that justification, the student can still be penalised. Based on CXC's own wording, the 20% functions as a penalty trigger, not a guarantee of safety.

The concern is that students are hearing "20%" and interpreting it as "under 20% means I'm fine." But Q19 makes clear that even above 20%, with a teacher's written justification, the student can still be penalised. If the override does not guarantee safety, the threshold itself is not solid ground to build your compliance on.

This number does not appear in the Standards and Guidelines. It does not appear in the March 2026 memo. It appears only in the FAQ.

Where the 20% Lives: Document Hierarchy
Not all CXC documents carry the same weight. The 20% comes from the lowest tier.
Governing Document
Standards and Guidelines (April 2025)
Sets the rules, scale levels, penalties, and compliance requirements.
The 20% is NOT in this document.
Clarification
March 2026 Memo (Dr. Nicole Manning)
Confirmed human judgement, not a number, will decide outcomes.
The 20% is NOT in this document.
Subordinate Document
FAQ (2025)
Answers common questions. Does not set standards or policy.
The 20% lives here, and only here.
Why This Matters: Every Other FAQ Answer Aligns with the Policy

This is important context. Look at the rest of the FAQ. Can AI be used in exams? No. That aligns with Level 1. Can AI generate code for IT SBAs? No. That aligns with Level 3. Can AI suggest a topic? Yes. That aligns with Level 2. Can AI generate graphs? Yes, if the rubric does not allocate marks. That is Level 3 in action. Every answer traces back to a scale level in the governing document. Every answer except the 20%.

What This Looks Like in Practice

Students hear "20%" and think: under means safe, over means caught. Neither assumption holds up when you look at how compliance actually works. Here are two real scenarios:

8% on AI Detector — Still a violation. A student had AI write their analysis paragraph, reworded it, and submitted it. That is Level 4, regardless of what any detector says. The detector missed it. CXC's other review layers may not.
30% on AI Detector — Could be perfectly compliant. A student wrote everything themselves. The tool threw a false positive on naturally written work. CXC's own memo acknowledges this happens. That is why they said human judgement, not a number, will decide.

A percentage does not tell CXC whether you crossed the line. The process does.

CXC's March 2026 Memo Reinforced the Process, Not the Number

This is what CXC said most recently. Without ever mentioning the 20%, three lines from the memo made it clear that no single number will decide your SBA's outcome:

1. "Decisions will not be based solely on the AI Originality Report."
If the report does not decide on its own, then any number on it, whether 20% or otherwise, cannot be the threshold.
2. "We recognise the variability of some AI detection tools."
CXC is acknowledging that these tools produce inconsistent results. A number that changes depending on which tool you use is not a standard.
3. "Concerns will be reviewed using human judgement and supporting evidence."
The standard is a human reviewing your process, not a machine generating a percentage.
The FAQ itself confirms this approach

FAQ Q21: "If a candidate's original work is flagged as AI-generated, the consequences can include academic penalties. However, CXC also requires human review and supporting evidence before finalising such accusations, and candidates have the right to appeal or defend their work."

This is CXC, in the same FAQ, confirming that no score on its own determines the outcome. Human review, supporting evidence, and the right to defend your work. That is process-based evaluation.

If You Are Hearing the 20% From Your Teacher

Some teachers are repeating the 20% as a compliance threshold because it is the most concrete number available to them. They are not trying to mislead you. The policy is new, and most schools have no AI policy of their own. But even CXC's own FAQ says that scoring above 20% "does not guarantee that the candidate will not be penalised." The number is a flag, not a finish line. And it comes from the FAQ, not the governing document.

Follow the process in Section 4 of this guide and you are covered.

What the research says

Peer-reviewed studies consistently find that AI detection tools produce false positives at rates that make hard percentage thresholds unreliable, particularly for non-native English speakers. A 2025 evidence synthesis published in MDPI's Information journal, covering research from 2021 to 2024, concluded that these tools "frequently produce false positives and lack transparency." A Stanford study of over 10,000 text samples found false positive rates exceeding 20% for non-native English speakers. Turnitin's own guidance treats scores below 20% as too unreliable to be used as evidence.

Major institutions that initially adopted percentage thresholds are now moving toward process-based evaluation, which is exactly what CXC's Standards and Guidelines already describe. The process-based approach is the stronger position internationally.

Sources: Weber-Wulff et al. (2023), International Journal for Educational Integrity. MDPI Information evidence synthesis (2025). Stanford false positive study (2025). MLA-CCCC Joint Task Force on Writing and AI.

To be clear.
What this guide is saying

The 20% appears in CXC's FAQ. It is a real number in a real document.

But the governing document, the Standards and Guidelines, uses a process to evaluate compliance, not a percentage. The March 2026 memo confirmed this. The FAQ's own Q21 confirms this.

Following the process in Section 4 of this guide is what protects you. That process comes directly from CXC's own published standards.

What this guide is NOT saying

We are not saying ignore the 20%.

We are not saying the FAQ does not matter.

We are not saying CXC got it wrong.

We are saying that a number in the FAQ should not replace the compliance process that CXC's governing document requires. CXC's own memo and FAQ Q21 support this reading.

An FAQ is not a policy, and a percentage is not a standard.


4
How to Cite AI in Your SBA

If you used AI, even just for brainstorming, you must cite it in CXC's required format.

In-Text Citation

Include the name of the AI platform and the year it was accessed.

Format: (Platform Name, Year)
Example: (Copilot, 2025)
Appendix Reference: 5 Required Elements
  1. Platform Name
  2. Year Used
  3. Your Full Name
  4. Exact Prompt (in italics)
  5. Day / Month
Example:
Copilot (2025), John Doe, "Generate three ideas for an agro-processing business targeting young adults", 3 March.
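To make the five-element order concrete, the entry above can be assembled mechanically. This is a hypothetical helper: the field names are mine, the output format is CXC's Appendix A, and note that in your actual Appendix the prompt should be italicised, which plain text cannot show.

```python
# Hypothetical helper: assembles an appendix entry in the 5-element format
# from Appendix A of the Standards and Guidelines. Field names are illustrative.
def appendix_entry(platform: str, year: int, name: str,
                   prompt: str, day_month: str) -> str:
    # Order: Platform (Year), Full Name, "Exact Prompt", Day Month.
    return f'{platform} ({year}), {name}, "{prompt}", {day_month}.'

print(appendix_entry(
    "Copilot", 2025, "John Doe",
    "Generate three ideas for an agro-processing business targeting young adults",
    "3 March",
))
# → Copilot (2025), John Doe, "Generate three ideas for an agro-processing
#   business targeting young adults", 3 March.
```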
Where does this go?

Your AI citations go in your Appendix. Not your bibliography. Not a footnote. Your Appendix.

Source: Appendix A, Standards and Guidelines, p. 22

The Declaration

You may also be required to sign a declaration stating:

"I/We certify that this project is a true reflection of my/the group's own work. I/We have cited all sources of information."

— Appendix E, Standards and Guidelines, p. 26

No AI used at all? CXC's March 2026 memo confirmed: if no AI was used, the Disclosure Form and Originality Report are not required.


5
Penalties for AI Misuse
Violation → Penalty
  • Full use of AI in the completion and submission of the SBA → 0 marks awarded
  • Violating the expected Scale Level (e.g. Level 4 when limited to Level 3) → 50% deduction of the earned mark (example: 50/60 becomes 25/60)

"Violations of the expectations of the use of AI according to the Scale Levels will attract 50% deduction of the earned mark." — Standard 4.1(b), p. 17

What a 50% Deduction Actually Costs You
Example: an Economics SBA worth 20% of your entire CSEC grade.
  • After a 50% deduction: up to 10% of your final result, gone.
  • Full AI submission: 0 marks, the entire SBA contribution wiped.

A 50% deduction on your SBA can cost you up to 10% of your final result from one mistake.

How One Penalty Cascades Through Your Final Grade

A scale level violation does not just reduce your SBA mark. It reduces the percentage that SBA contributes to your entire CSEC result.

SBA Score: 50/60 → After 50% Penalty: 25/60 → CSEC Impact: −8.3% → Could Mean: a grade drop
The takeaway:

The penalty does not stop at your SBA. It pulls down your overall CSEC percentage, and that can be enough to move you from one grade boundary to the next. One violation, one subject, one grade.

SBA Mark Impact Scenarios
Example: Student earns 50/60 on SBA worth 20% of CSEC grade
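The cascade can be checked with a few lines of arithmetic, using this guide's example figures (a 50/60 SBA mark and a 20% SBA weight). This is an illustration of the proportions involved, not an official CXC formula.

```python
# Illustrative arithmetic using this guide's example figures
# (50/60 SBA mark, SBA worth 20% of the CSEC grade); not an official CXC formula.
def csec_contribution(sba_mark: float, sba_max: float, weight_pct: float) -> float:
    """Percentage points the SBA contributes to the final CSEC grade."""
    return (sba_mark / sba_max) * weight_pct

before = csec_contribution(50, 60, 20)   # ~16.7 points of the final grade
after = csec_contribution(25, 60, 20)    # after the 50% deduction: ~8.3 points
print(round(before - after, 1))          # → 8.3 points lost, the −8.3% above
```

The same function also shows the worst case: a perfect 60/60 SBA contributes the full 20 points, so a full-AI submission (0 marks) can wipe out up to 20% of the final grade, and a 50% deduction up to 10%.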

6
How CXC Checks Your Work

CXC uses a multi-layer detection process. No single tool decides your outcome. It is a combination of human review and technology.

👩‍🏫 Teacher Validation — Reviews work before submission. Reports suspected AI misuse to CXC.
📋 CXC Moderation — Moderators review for plagiarism and inappropriate AI use.
🗣️ Oral Questioning — CXC reps can visit schools. If you cannot explain your SBA, that is a red flag.
🔍 Originality Checkers — You must run your SBA through an approved checker and submit the report.
Important

A teacher's approval alone does not guarantee the SBA will pass CXC's moderation. The final decision rests with CXC, not the school.


7
What CXC's March 2026 Memo Clarified

NEW: CXC Memo dated March 10, 2026
"Managing Artificial Intelligence (AI) Checkers in School-Based Assessments (SBAs)" — Dr. Nicole Manning, Director of Operations

Decisions Won't Rely Solely on AI Detection

"We wish to assure you that decisions will not be based solely on the AI Originality Report."

Detection Tools Are Unreliable

"We also recognise the variability of some AI detection tools."

Human Judgement Will Be Used

CXC will rely on the Disclosure Form details, and concerns flagged during marking will be reviewed using human judgement and supporting evidence.

No AI = No Paperwork

If no AI was used, the Disclosure Form and Originality Report are not required.

CXC Will Engage Schools First

If CXC needs more information, they will reach out to Centres and/or the Local Registrar ahead of releasing preliminary results.

Stakeholder Engagement Coming

CXC committed to further engaging stakeholders in focus group sessions during July–August 2026.

Key Takeaway: The Standards and Guidelines outline a process: declare your AI use, cite it, document your prompts, run an originality checker, submit the report. CXC's March 2026 memo confirmed that this process-based approach is how they will evaluate SBAs. Follow the process and you are covered.

The CXC Compliance Process
Declare AI Use → Cite It → Document Prompts → Run Checker → Submit Report

8
Approved Tools and Quick Answers

CXC specifies which AI platforms and originality checkers are approved for use.

Approved AI Platforms: Copilot, Gemini, ChatGPT, Incus
Approved Originality Checkers: Winston AI, Originality.ai, GPTZero, Grammarly

"This list may be updated from time to time through the advisement of Ministries of Education." CXC's March 2026 memo also noted that CoPilot (Microsoft) is one example, not the only option. Students and teachers are encouraged to use any available AI detection tool accessible to them.

Quick Answers from CXC's FAQ

These answers come from CXC's FAQ and are consistent with the Standards and Guidelines.

Q: Can AI be used in exams?
A: No. AI during examinations is strictly prohibited.
Q: Can AI be used to generate code for IT/Computer Science SBAs?
A: No. AI-generated code cannot be considered the student's own work.
Q: Can AI suggest a topic?
A: Yes. AI can be used for ideation, as long as you collect and develop the content yourself.
Q: Can AI generate graphs?
A: Yes, provided the rubric does not allocate marks for the graphs.
Q: Can AI help write analysis?
A: No. Students must conduct their own analysis; AI can only be used to edit or format afterwards.
Q: Will candidates be marked down for using AI to paraphrase?
A: In most cases, no. However, some subjects, such as English Language, do not lend themselves to AI rephrasing of the written material.
Q: What citation format should be used?
A: Use the format in Appendix A of the Standards and Guidelines (see Section 4 of this guide).
FAQ Quick Reference: What's Allowed?
At a glance, what CXC says yes and no to

9
What Parents Need to Know

"Teachers and parents alike play a vital role in ensuring the integrity of the submissions of the candidates for the examinations." — Standard 4.0, p. 16

CXC explicitly names parents in the responsibility chain. This means CXC expects you to be aware of the rules and to help ensure your child's SBA follows them.

Sources
  1. CXC Standards and Guidelines for the Use of AI in Assessments (V1.0, April 2025)
  2. CXC Responsible Generative AI Policy Framework (November 2024)
  3. CXC FAQ for Standards and Guidelines for the Use of AI in Assessments (2025)
  4. CXC Memo: Managing AI Checkers in SBAs (March 10, 2026, Ref: 625-1)