Using generative AI in teaching and learning

What is generative AI?

Generative AI is a type of artificial intelligence that uses machine learning to create new types of media, including text, images, sound, and video. Many tools exist that use generative AI to create new content. Some examples include:

Media: Example tools
Text: ChatGPT, Bing
Images: Bing, Adobe Firefly, Midjourney, DALL-E 2
Sound: Voiceify, texttomusic
Video: D-ID

Tool approval: All third-party tools used for teaching and learning at McGill are subject to the McGill Cloud Directive and must be evaluated to ensure they comply with it. Several generative AI tools are in the process of being evaluated for compliance. This article will be updated when more information about the approval of generative AI tools becomes available.

To provide the most secure, low-risk use of generative AI tools for teaching and learning, it is recommended that instructors use tools that can be accessed anonymously, without logging in, such as:

Generative AI tools are built on large language models (LLMs) that have been trained on enormous corpora of information and are designed to respond to input prompts with well-structured output. At their core, these models perform prediction: generating the most likely word to follow the preceding text, or generating media based on patterns and styles in existing datasets. While the outputs often demonstrate seemingly sophisticated language interaction, they can also contain inaccurate information (e.g., fabricated sources). These tools are constantly improving; nevertheless, it is important to examine any generative AI output with a critical lens. The McGill Library has a page on artificial intelligence with a section on AI Literacy that discusses the importance of critically approaching AI.
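The "most likely next word" idea above can be illustrated with a deliberately tiny sketch. Real LLMs use neural networks trained on enormous corpora; this toy bigram model only counts which word follows which in a three-sentence corpus invented for the example, but the underlying principle — predict the most frequent continuation — is the same.

```python
from collections import Counter, defaultdict

# Toy corpus (an assumption for this illustration, not real training data).
corpus = (
    "generative ai can create text . "
    "generative ai can create images . "
    "generative ai may produce inaccurate text ."
).split()

# Count how often each word follows each preceding word (bigram counts).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def most_likely_next(word):
    """Return the most frequent continuation of `word` in the corpus."""
    return follows[word].most_common(1)[0][0]

print(most_likely_next("generative"))  # -> "ai" ("ai" always follows it)
print(most_likely_next("ai"))          # -> "can" (2 of 3 continuations)
```

Note that the model happily predicts a continuation even where the corpus gives weak evidence — a miniature version of why fluent output is not the same as accurate output.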

Ethical use of generative AI 

The development of generative AI has brought many ethical considerations that must be addressed with its use. The 2018 Montreal Declaration for a Responsible Development of Artificial Intelligence and the 2021 UNESCO Recommendation on the Ethics of Artificial Intelligence provide a number of principles important to the adoption of these new technologies. Both documents offer an excellent framework for how organizations need to consider these new technologies. 

The emergence of generative AI is prompting reflection on how the concept of academic integrity is evolving. For any questions related to disciplinary procedures, contact the Office of the Dean of Students.

Learn more:



How can generative AI be used in teaching and learning? 

In May 2023, the Senate Subcommittee on Teaching and Learning (STL) created a working group to address the use of generative AI at McGill. The working group drafted a report with key recommendations for the use of generative AI at McGill. The report also presents principles for the use of generative AI in teaching and learning.  

Many different strategies exist for using generative AI tools in course teaching. For example, instructors can use them to create sample texts for students to analyze; create images for presentations; design entire presentation materials; and generate sample practice questions. Strategies also exist for students to use generative AI tools to support their learning. For example, these tools can be used to brainstorm ideas; create images to support assignments such as presentations; and summarize documents. These resources offer more ideas for creating instructional materials and supporting student learning: How could AI be used for learning and teaching? and How AI can be used meaningfully by teachers and students in 2023.

Note that students need AI literacy skills to use generative AI tools responsibly. These skills include, for example, fact-checking AI output and prompt engineering.

Avoid AI detection tools  

AI detection tools are designed to identify content that is partially or wholly generated by AI. For generated text in particular, these tools are unreliable and often inaccurate, a limitation corroborated by OpenAI, the creators of ChatGPT. OpenAI reports a false positive rate of 9% for its detector, a rate similar to that of other AI detection tools. A false positive occurs when a tool flags text as AI-generated when it was actually written by a student. False positives misguide instructors and can lead to students being wrongly accused of a violation they did not commit, forcing them to defend work that is rightfully theirs. Such situations can damage classroom climate and create unnecessary stress and anxiety for all involved. McGill therefore discourages the use of AI detection tools.
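To see why a 9% false positive rate matters at classroom scale, consider a quick back-of-the-envelope calculation. The class size below is a hypothetical assumption for illustration, not a figure from this article.

```python
# False positive rate reported by OpenAI for its text detector.
false_positive_rate = 0.09

# Hypothetical course where every submission is genuinely student-written.
honest_submissions = 200

# Expected number of students wrongly flagged as having used AI.
expected_wrongly_flagged = false_positive_rate * honest_submissions
print(expected_wrongly_flagged)  # -> 18.0
```

Roughly one student in eleven would be asked to defend work that is rightfully theirs, which is the risk motivating McGill's recommendation against these tools.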

Course outline statements 

There should be no default assumption about the use of generative AI tools. McGill therefore recommends that instructors explain in their course outline the appropriate use (or non-use) of generative AI tools in the context of that course. The use or non-use of these tools should align with the course's learning outcomes. For this reason, instructors will need to write their own context-appropriate course outline statements. Below are four external examples to draw on:

If you allow students to use generative AI tools in your course, provide guidance in your course outline for how students should acknowledge the use. Monash University provides useful examples, as well as links to APA and MLA citation guidelines. See the section entitled How students acknowledge the use of generative AI. 

AI and assessment of student learning 

Generative AI tools can be used to support many different types of assessment. As with the planning of any assessment of student learning, it is important to be intentional about the strategy. In this case, be intentional about either designing AI into your assessments or designing AI out of your assessments. This reflective process will encourage you to think through your assessments and determine if the integration of AI is appropriate (or not) for allowing students to demonstrate their learning. Designing AI into an assessment could involve having students generate the assignment with an AI tool, then critique the output, and reflect on the process. Designing AI out of the assessment could involve in-class writing exercises, where students create and share their ideas with other students in class during class time. Either way, the decision should align with students’ achievement of the desired learning outcomes. Monash University provides excellent considerations for assessment design.

Learn more:
