Generative artificial intelligence (AI) in assessments
What is generative AI?
Generative artificial intelligence (AI) describes any type of AI used to create text, prose, formulae, code, images, video or audio. ChatGPT and Google Gemini are two examples of generative AI tools.
AI outputs can be very human-like, potentially increasing the risk of plagiarism.
AI in session 2024-25
Our current position on the use of generative AI in session 2024-25 has been developed to ensure equity and fairness for all learners studying our qualifications.
Learners cannot submit AI outputs as their own work
Learners are not permitted to use generative AI tools to create outputs – for example text, prose, formulae, code, images, video, audio – that they then submit as their own work for assessment tasks that contribute towards an SQA qualification. These tasks include: exams, unit assessments, coursework, and portfolios. Doing so would constitute plagiarism and could result in awards being cancelled.
AI cannot be referenced as a source
Learners must not reference outputs from generative AI tools as sources in assessment tasks that contribute towards an SQA qualification. There are currently significant issues with the reliability and validity of these outputs, which means referencing the tools could be inappropriate or disadvantageous to learners.
Rationale for our current position on AI
Referencing
Learners studying towards SQA qualifications should use valid, reliable and authoritative sources of reference to support their work.
There is evidence that outputs from these tools can be biased or incorrect, and can fabricate information. Outputs can also be inconsistent, even when the same prompts are used, making it difficult for assessors to authenticate sources. For these reasons, outputs from generative AI tools are not currently considered valid or reliable.
Using outputs from generative AI tools as sources may not meet the referencing requirements of specific courses and could impact the number of marks a learner can achieve. For example, some SQA qualifications require sources to be recent and text output from generative AI tools can be difficult, or impossible, to date.
Age restrictions
An important factor, which could impact equity and fairness for learners, is the age restriction that the creators of generative AI tools have placed on their products. For example:
- Google Gemini - users must be over 18 years old
- ChatGPT - users must be over 13 years old, but if they are under 18, written consent from a parent or carer must be provided to OpenAI (the creators of ChatGPT)
Guidance for centres: authenticating learners' work
With the availability of AI chatbots, such as ChatGPT, which can quickly produce human-like text, it is important that you are aware of the appropriate authentication steps to take.
We have produced a guide to support centres in ensuring that learners' work is their own.
Future use of AI
As generative AI advances, the current barriers and flaws within this developing technology may be overcome. We recognise the need to embrace the opportunities that new technology offers and will continue to keep our position under review.
Advances in generative AI offer potential opportunities for education and assessment. Together with our partners in Scotland's education system, we continue to explore those opportunities. We welcome the recommendations for AI use in the recently published Independent Review of Qualifications and Assessment, and we look forward to working with others in the education sector.
We will continue our work on the use of generative AI tools in the assessment context. Any changes to assessment practice will be based on evidence and will include the views of centre staff and learners. We will publish further guidance where appropriate.