Generative AI in Education: Acceptable Use Policy Outline
As the popularity of generative AI grows, educational institutions need to develop acceptable use policies (AUPs). There are several good reasons to do so:
- Academic Integrity: Educational institutions have a responsibility to uphold academic integrity. A policy helps prevent plagiarism, ensures that learners do not use AI tools to create or submit work without proper attribution or authorisation, and gives learners clear guidance on what is and is not permissible when using generative AI tools.
- Transparent and Accountable Practices: Creating an AUP establishes transparent and accountable practices for generative AI use. It helps ensure that all stakeholders understand their rights and responsibilities.
- Data Privacy and Security: The use of AI often involves data handling. An AUP can delineate policies for handling and securing data, ensuring compliance with data protection laws.
- Ethical and Responsible Use: An AUP sets clear guidelines for the ethical and responsible use of generative AI. It demonstrates the institution's commitment to ethical behaviour, transparency, and fairness in all AI-related activities.
- Alignment with Educational Objectives: By creating an AUP, an institution can ensure that the use of generative AI aligns with its educational goals and objectives and contributes to improving the learning experience.
- Protection of Intellectual Property: Generative AI tools can create content that may have intellectual property rights (see the Copyright discussion on the AI ethics page). An AUP can define ownership, usage, and distribution of AI-generated content, protecting both the institution and individual users.
- Quality Assurance: A policy can help assure validation and accreditation agencies of the credibility of the institution.
- Mitigating Legal and Ethical Risks: An AUP can help mitigate legal and ethical risks associated with the misuse of generative AI. It can define consequences for violations and protect the institution from potential liabilities.
- Stakeholder Confidence: Having a well-defined AUP can increase the confidence of learners, teachers and the broader community in the institution's commitment to ethical AI use.
Creating an Effective AUP
Some tips to help create a good policy:
- Define the scope: what is the policy going to cover? Think about the purpose, applicability, and limitations
- State the objectives: set out clearly what the AUP aims to achieve, such as promoting ethical AI use, protecting academic integrity, and ensuring data privacy
- Emphasise the educational purpose of AI use
- Emphasise the ethical use of AI, including prohibitions on cheating, plagiarism, and creating misleading content, but...
- Accentuate the positive: try to focus more on how generative AI can be used as opposed to stressing how it shouldn't be used
- Keep it simple: write the policy in plain English (like all your other institutional policies) so that it can be easily understood
- Clarify ownership and attribution of AI-generated content. Specify who owns the content and how attribution should be provided when AI tools are used
- Include guidelines for handling and securing data used in AI projects. Address compliance with data protection laws
- Clearly outline the consequences of violating the AUP. Describe the disciplinary actions that may be taken, including academic penalties
- Ensure that the AUP complies with relevant laws and regulations, including copyright, data protection, and privacy laws
- Involve learners: get their input and understand how they are using generative AI. There will be more buy-in if learners understand the policy and the reasons it was created and implemented.
- Review the AUP regularly: generative AI is a fast-moving field, and the policy should be revisited and revised to ensure it covers the capabilities of the AI tools currently in use
Policy Outline
This outline is intended to give a flavour of what might be included in an institutional policy for the acceptable and responsible use of generative AI tools. It is not meant as a comprehensive list of everything that should be considered, and the details and procedures will be specific to each institution.
Additionally, to help build support and ensure compliance, all stakeholders, including staff, learners and managers, should be consulted during the policy development process.
Introduction
- Purpose of the Policy
- Scope and Applicability
- Definitions
General Principles
- Ethical Use
- Academic Integrity
- Responsible Implementation
- Transparency
Guidelines for Generative AI Use
- Permissible Use Cases
  - Learning Enhancement
  - Creative Expression
  - Academic Support
  - Examples of Permissible Use
- Use Restrictions
  - Prohibited Activities
  - Plagiarism Prevention
- Responsible AI Tool Selection
  - Evaluation Criteria
  - Terms and Conditions of Use/Licensing
- Citation of AI Tools
  - Acknowledgement of Tool Use
  - Referencing Conventions
  - Copyright
- Data Privacy and Security
  - Data Handling
  - GDPR Compliance
  - User Consent
- Accountability and Oversight
  - Designated Responsible Parties
  - Monitoring and Reporting
- Training and Awareness
  - Staff Training
  - Student Awareness
Implementation Procedures
- Policy Dissemination and Awareness
- Integration into Curriculum
  - Staff Approval
  - Curriculum Development
- Staff and Student Access
  - Availability of AI Tools
  - Student Accessibility
- Evaluation of AI-Generated Work
  - Marking and Assessment
  - Plagiarism Detection
- Technical Support
  - IT Requirements
  - Assistance for Users
Compliance and Consequences
- Compliance with Policy
  - Compliance Review
- Consequences for Non-Compliance
  - Academic Penalties
  - Ethical Violations
- Appeals and Grievance Procedures
Review and Updates
- Periodic Policy Review
- Horizon Scanning
  - Assessment of Technological Advances
  - Stakeholder Input
Conclusion
- Commitment to Ethical and Responsible AI Use
- Affirmation of Academic Integrity
- Importance of Continued Learning and Adherence