The integration of generative AI in university assessments brings significant implications. AI can help students to plan, prepare and enhance their assignments. However, it also poses challenges to academic integrity as tools capable of generating complex responses become more accessible. This means that you need to consider the subject context, the course and unit learning outcomes, acceptable use of AI in the students’ learning journey and the assessments you design.
Deciding how and when your students should use AI is not always straightforward.
When can students use AI?
Students can use generative AI in the developmental process of their work. They must keep a record of how they use it, in line with UAL’s Student Guide to AI.
When using generative AI tools, students should be aware that their use may constitute academic misconduct via plagiarism. Students may not use AI to generate work, leave it unaltered and submit it for assessment as if it were their own. The only exception is where the course content or assessment brief explicitly permits the use of AI-generated work in this way.
If students have any questions or need further guidance about the use of AI they should speak to their tutor.
Traffic light system
Use this system to create a shared understanding between you and your students to clarify what is acceptable use of AI in an assessment.
Red: Not permitted
In these assessments, you must tell your students that the use of AI is not permitted.
Usually, the main reason generative AI is not permitted is that its use would let students bypass the learning process. These assignments typically assess core skills such as remembering, understanding, critical thinking or demonstrating transferable skills.
Examples of assessments where AI tools are not allowed could include:
- a crit
- a written reflective journal, intended to demonstrate a student’s thought process and their ability to think critically, analyse and evaluate
- vivas
- some studio and practical work
- discussion-based assessments
- where spoken and written English language skills need to be assessed
- in-person unseen examinations
- analogue design work (e.g. textiles), where working without digital tools is encouraged for sustainability or environmental reasons
Students believed to have ignored the categorisation will undergo the standard academic misconduct procedure.
Amber: Context specific
In these assignments you will specify how AI tools can be used in an assistive role for the assessment.
You must explicitly communicate to your students how the AI tools can be used in the assignment. For example, you might allow the AI tool to be used in the formation of ideas or concepts in the creative process, but the student must record how the AI generated output was incorporated into the final submission. (Refer to the UAL Student Guidance).
Examples include:
- drafting and structuring content
- as a support tutor, e.g. facilitating peer-like dialogue and debate about real-world problems
- supporting a particular process such as testing and debugging code or translating content
- providing ideas or inspiration to help students get started on an assessment brief
Green: Permitted
In these assignments the use of AI will explicitly form part of the assignment task and learning outcomes.
You should expect your students to use generative AI responsibly and critically to tackle complex problems and generate informed solutions. There may also be occasions where students are given the option not to engage with AI on the basis of their own practical or ethical concerns.
You will need to guide and support students in their use of generative AI and show them how to record and acknowledge the outputs of the tool they have used (Refer to UAL Student Guidance). For speakers of English as an additional language, recording use of translation tools may be an additional burden.
Examples of where generative AI tools could be used as part of the assessment include:
- creating artwork (images, audio and videos)
- developing code
- drafting and structuring content
- generating ideas
- comparing content (AI generated and human generated)
- creating content in particular styles
- producing summaries
- analysing content
- researching and seeking answers
- playing a supportive role and engaging in a conversational discussion
- translating content for speakers of English as an additional language
Updating assessments
Deciding whether to modify your assessments so that they cannot be done easily with generative AI depends on several factors, including the nature of the course, the learning outcomes, and how you want to integrate technology into learning.
Making substantial changes to your summative assignments must be done in accordance with UAL guidance. Waiting for the course to be reapproved, or seeking modifications to an existing assessment method, can take time, but there are several things you can do right now.
Try an AI tool.
We have access to Microsoft Copilot. You must log in to Copilot with your UAL credentials. Using your UAL email to log in grants you protected status, ensuring your chat history is not stored and your data is not used for AI training.
If you want to test exam questions or other assessment tasks, this is the only safe tool to use.
Talk to your students.
Clearly define the boundaries of AI usage for their assignments, specifying what is permissible and what isn't. Discuss the ethical issues of using AI in their assignments. Emphasise to your students the importance of originality in their work and inform them that presenting AI-generated content as their own will be considered academic misconduct.
Increase feedback and formative assessment opportunities.
When students better understand the assessment, they are more capable and motivated to produce their own responses, reducing the temptation to rely on AI-generated content.
Changing your assessment briefs
Before you make any changes, discuss the proposed changes with your teaching team, students and any external regulators.
Here are some strategies to consider:
Emphasise critical thinking and analysis.
Design questions that require students to analyse, synthesise, and evaluate information rather than merely describe or summarise it. Questions that ask for personal interpretation, critique, or a comparison of different viewpoints are less likely to be directly answerable by AI.
Require specific examples and contexts.
Incorporate specifics from your lectures, discussions, or unique course materials that are unlikely to be widely available or well-known. This encourages students to apply what they've learned directly from your course and demonstrates their engagement with the material.
Ask for personal reflection.
Include a component that requires students to relate theories or concepts to their personal experiences or future aspirations. This personalisation makes it difficult for AI to generate relevant and accurate responses.
Apply to unique situations.
Design questions that ask students to apply learned concepts to hypothetical or novel situations. This tests their ability to extend their knowledge beyond familiar contexts and demonstrates genuine understanding.
Facilitate peer reviews and revisions.
Include a component where students must critique or build upon peers' ideas. This interaction adds a layer of complexity that AI tools struggle to replicate authentically.
What you can do in the future
If you are considering changing your assessment method, a good starting point is to review Jisc’s National Centre for AI resource: Assessment ideas for an AI enabled world.
Do we use AI to mark students' work?
No, we do not use Generative AI in the feedback and grading process.
The principal purpose of marking is to provide students with personal feedback on their performance. Tutors marking their students' work can provide personalised, context-aware feedback that considers individual student needs and creates a supportive learning environment, with a depth and understanding that AI currently cannot replicate.
There are a number of reasons why this is our position at the moment:
Privacy concerns
Using AI to grade assignments raises concerns about data privacy and the ethical use of student work. Students' assignments should never be uploaded to a generative AI tool without the students' permission, as this would contravene data protection and copyright legislation.
Lack of nuanced understanding
Generative AI may not fully grasp the nuances of complex arguments, original thought, or creative expression. These aspects are often crucial in educational assignments, particularly in arts-based education. An AI's ability to interpret and evaluate these subtleties can be limited compared to a human teacher.
Feedback quality
Effective learning often requires detailed and constructive feedback, something that AI currently struggles to provide at a human level. AI might be able to assess answers as right or wrong, but it may fail to provide the insightful feedback needed to help students understand their mistakes and learn from them.
However, there may be some exceptions to this position:
Multiple choice and standardised tests
AI is particularly adept at grading assignments with clear right or wrong answers, such as multiple-choice questions, true/false quizzes, and other standardised test formats. These tasks don't require subjective judgement, making them ideal for automated grading systems. These types of questions and tests can also be run using existing tools in Moodle.
Programming and code evaluation
In creative computing education, AI can assess coding assignments effectively. This allows for objective evaluation of code correctness, efficiency, and adherence to specified requirements.
AI grammar and punctuation tools
Tutors may use AI tools to improve the grammar and punctuation of their feedback. Tutors must not input any student names or personal information into any generative AI tool.
If these exceptions are appropriate, you must always:
- inform your students and explain why their work is being marked using Generative AI
- anonymise students' work and ensure it does not contain any personal information
- take responsibility for the overall grade given to the student, checking and confirming any grade generated by an AI tool.
Learn more
AI and Teaching
Learn how we approach using AI more broadly in teaching and learning.
AI and Arts Education
Use our guides and resources to learn how to use AI in your learning and teaching.