“As the influence of AI continues to surge across various industries, its presence in education is undeniably growing. While the integration of AI in academic settings offers unprecedented opportunities for efficiency and innovation, the temptation to employ it for school work raises pertinent concerns about ethical implications and the importance of fostering genuine learning experiences.”
Those two sentences might sound like they’ve been carefully crafted for a term paper on academic integrity, but they were actually generated by ChatGPT, an artificial intelligence program, using a simple prompt.
As AI technology continues to develop and gain popularity, it can be hard to know where and when it’s acceptable to use. What constitutes a breach of academic integrity? Are there any acceptable uses of AI software in class work?
We checked in with experts in MacEwan’s Centre for Teaching and Learning and Academic Integrity Office. Here’s what they had to say.
Can I use AI in my classes?
It depends on learning outcomes and professors’ teaching preferences.
In terms of academic freedom, faculty can determine for themselves to what extent – and with which tools – they will allow AI in their classrooms. There may be instances where students need certain AI skills for their specific industry, so some programs and profs may use more AI tools than others.
The university’s Centre for Teaching and Learning has created a document detailing ethical use of generative AI and the Library has developed an Artificial Intelligence – Student Guide, but ultimately you’ll need to check your syllabus and chat with your prof about what is and is not allowed.
What is the penalty for using AI without permission?
According to MacEwan’s Academic Integrity policy, using AI can constitute a breach of academic integrity. Academic misconduct occurs when a student tries to gain an unfair advantage or uses something that’s not permitted – both of which can apply to unapproved AI use.
Depending on the extent of improper generative AI use, it can constitute severe academic misconduct and come with serious consequences.
What programs count as AI?
AI isn’t limited to ChatGPT. Across the creative arts, there are AI tools that generate content – writing, illustrations, designs and even music.
Take caution when using online platforms or services, as AI isn’t always as obvious as you might think. Be careful with things like search engines – Bing has Copilot and Google has Gemini – and with tools like Grammarly or The Quote Bot that paraphrase work. Even Microsoft Office has AI components that will suggest alterations to your work. These are all considered generative AI and, depending on the learning outcomes and your professor’s teaching strategies, using them could be considered academic misconduct.
It seems more AI bots are being released every day. Check with your prof about what tools you can and cannot use in your classes.
Are there limits to AI I should know about?
Though AI is quite new in many of its forms, something called model collapse could be on the horizon. When AI models sample the internet for content, they can end up drawing on other AI-generated content, which degrades both the quality and credibility of future output.
If you are using AI, be sure to think critically about its outputs. Make sure you’re still showcasing the knowledge and skills you’re expected to learn in your courses and assignments, rather than letting the AI demonstrate them for you.
Who can I talk to if I have questions?
The best place to start is always to talk to your prof.
Be proactive and approach your prof with any questions, or even with ideas for ways that AI can be used in course work.
If you still need help figuring out where to draw the line with AI, contact MacEwan’s Academic Integrity Office for resources or one-on-one meetings.