Why Assessment Design is Key to Academic Integrity in the Age of AI

As AI tools grow more sophisticated, detecting their use in academic writing has become an increasingly complex challenge for educators. With advancements making detection tools less reliable and more costly to maintain, the question arises: is detection a sustainable solution, or is there a better way forward?

During the OpenLearning Forum 2024, the conversation shifted from AI detection to a broader rethinking of assessment design, one centred on creativity and authentic learning experiences.

This blog post explores the limitations of detection-focused approaches and offers practical strategies for designing assessments that uphold academic integrity while responsibly integrating AI in education.

 

Tackling the Root Causes of AI Plagiarism

Why do learners turn to AI for writing assignments? Common reasons include:

  • Grade-centric courses: Learners prioritise grades over actual learning.
  • Lack of relevance: Assignments seem disconnected from their personal goals or interests.
  • Heavy workloads: Competing priorities push students to seek shortcuts.
  • Unclear expectations: Assessment criteria or instructions are ambiguous.
  • Low engagement: Courses lack interactive or collaborative opportunities.

By addressing these underlying issues, educators can foster trust and authenticity in learning environments, reducing the likelihood of AI misuse.

 

Why Detection Alone Won’t Work

AI models are evolving rapidly, making them harder to identify. Detection software is struggling to keep pace, and even emerging solutions like invisible watermarks face limitations. For example:

  • Watermarking isn’t foolproof: AI tools designed to bypass detection can easily remove or avoid watermarks.
  • High maintenance costs: Detection tools require costly upkeep, yet deliver increasingly inconsistent results as AI capabilities improve.
  • False positives/negatives: Legitimate work may be flagged as AI-generated, or AI-written content may go undetected.
  • Ethical concerns: Detection often involves extensive data collection, raising privacy and surveillance issues.
  • Lack of standards: With no universal benchmarks for AI detection, results are inconsistent and difficult to verify.
  • Focus on policing: Overemphasis on detection creates an adversarial dynamic between educators and students, shifting focus from learning to policing.

Not to mention, detection efforts create a mismatch with real-world AI use: AI tools are becoming integral to professional workflows. Punishing students for using them contradicts the educational goal of preparing learners for real-world contexts where AI is widely accepted.

Relying solely on detection is, therefore, no longer a sustainable solution.

 

Rethinking Assessment Design for Academic Integrity

Assessments can be designed to focus on skills and activities that require uniquely human abilities—like critical thinking, creativity, and personal reflection—which are challenging for AI to mimic effectively. 

The goal is to reduce the likelihood that learners will misuse AI by creating tasks that AI tools cannot easily complete without meaningful human input. Here’s how:

 

1. Focus on ‘How’ Learners Learn, Instead of ‘What’ They’ve Learned

Design assessments that encourage learners to demonstrate their understanding through real-time or iterative activities. By capturing how learners engage over time, educators can gain deeper insights into learning processes, reducing reliance on static, text-based assignments. Examples include:

  • Reflective journals: Learners regularly document their learning journey, tying experiences to course outcomes.
  • Project-based learning: Learners create unique projects or solutions linked to real-world challenges.
  • Iterative activities: Learners refine their drafts, prototypes, or experiments based on feedback.

OpenLearning’s AI Course Builder can generate structured project guidelines, scaffolded tasks, and reflection prompts that align with best practices, making it easier to integrate process-focused activities.

 

2. Personalise the Learning Experience

Design assessments that draw on learners’ unique perspectives, experiences, and interactions. These tasks not only make assignments harder for AI to replicate wholesale but also foster stronger personal connections to the material. Examples include:

  • Personalised assignments: Learners tie theoretical concepts to their own lives or careers with activities such as role-playing scenarios and sharing experiences.
  • Group discussions and peer feedback: Learners co-create knowledge through discussion, crowdsourcing ideas, or critiquing peers’ work.
  • Flexible learning pathways: Learners choose the topics or case studies that resonate with their interests or professional aspirations.

OpenLearning’s Activity Generator suggests collaborative projects, discussion prompts, and scenarios that relate the course material to learners’ own lives.

 

3. Teach Responsible AI Use

Rather than avoiding AI entirely, incorporate assignments that help learners critically evaluate and ethically integrate AI into their workflows. This approach prepares students for professional environments where AI is an essential tool and demands a deeper understanding of its role and limitations. Examples include:

  • AI-assisted tasks: Learners use AI for specific parts of an assignment, such as brainstorming or drafting, and then analyse and refine its contributions.
  • Critical reviews: Learners evaluate AI-generated content, comparing it to human-created outputs to identify strengths, weaknesses, and potential biases.
  • Scenario-based ethics discussions: Create activities where learners debate or reflect on the ethical use of AI in their field.

By shifting the focus to human-centric skills, assessments become less susceptible to being completed by AI alone, reducing the temptation for AI plagiarism and misuse.

OpenLearning’s AI Facilitation Tools suggest reflective comments and prompts, saving educators time while encouraging critical engagement with AI.

 

Conclusion: A Shift in Focus

Through meaningful assessment design, thoughtful course development, and responsible AI integration, educators can create learning environments where AI misuse becomes less effective and less appealing.

By shifting from detection to prevention, educators can uphold academic integrity while equipping learners with skills to navigate an AI-driven world.

Ready to transform your assessments? Discover how OpenLearning’s AI tools can help you design authentic, human-centric learning experiences. Learn more here.

 
