Generative AI tools like ChatGPT, DALL·E, and Google Bard are increasingly used in education, research, and creative work. While these tools offer exciting possibilities, they also raise important copyright, ethical, and academic integrity concerns.
Who Owns AI-Generated Content?
Under UK copyright law (CDPA 1988, Section 9(3)), the author of a computer-generated work is the person who made the arrangements necessary for its creation. However, it is unclear how this definition applies to tools like ChatGPT, where the user's contribution may amount to no more than entering a prompt.
⚠️ There is no clear legal precedent yet, and ownership of AI-generated content may be disputed—especially if similar outputs are generated for multiple users.
Ethical Considerations
According to UNESCO, using Generative AI tools like ChatGPT raises several ethical issues:
| Issue | Description |
|---|---|
| Academic Integrity | Risk of plagiarism or cheating |
| Privacy | No age regulation; user data may be collected |
| Bias & Stereotyping | AI may reflect and reinforce biased or discriminatory content |
| Accessibility | Limited access in some regions |
| Commercialisation | AI tools may use your input to train models for profit |
| Accuracy | AI may generate false or misleading information, including fake citations |
🧠 Tip: Always verify AI-generated content and be prepared to take full responsibility for its accuracy.
Copyright and Citation
If you use Generative AI to create text, tables, figures, or images in your work:
- Check the tool’s terms and conditions—some outputs may not be unique, and some may not be eligible for copyright protection at all.
- Cite the tool properly—e.g., “Generated using ChatGPT, OpenAI, October 2025.”
- Do not assume you own the output unless the tool’s terms explicitly say so.