Teaching in the Age of AI | U of T Magazine
Illustration depicting an AI prompt field, a student raising their hand, a close-up of an eye, grey rings of circles and three books on philosophy
Illustration by Melanie Lambrick

Teaching in the Age of AI

As machine learning reshapes higher education, U of T is preparing students to become ‘super humans’

This fall, Kyle Smith, a professor in the history of religions program, will ask his students at U of T Mississauga to take a side in the so-called Christmas culture wars. Part of an assignment for his third-year course “Christmas: A History,” it requires students to choose a “war zone” (for example, “Merry Christmas” versus “Happy Holidays” or real versus artificial trees), create a propaganda campaign with a recruitment poster and victory merchandise, and draft a “peace treaty” to win the ideological war.

What makes the assignment – and others he’ll be giving his class – unique is that students will use generative artificial intelligence for almost every task, including gathering intel on the opposing side’s tactics, crafting persuasive messaging and documenting how the technology produces false or prejudiced information to support their cause. Smith says this exercise can help students understand how AI generates results and avoid being misled by them. “In our AI world, every algorithm has a particular agenda and there is no objectivity. For students to recognize propaganda, they have to know how it’s created,” he says.

Smith’s assignment is one example of how U of T is adapting to AI’s rapid reshaping of teaching and learning. Students can use AI to conduct research, summarize concepts, create personalized study aids and produce original text, images, audio, video and computer code. Meanwhile, instructors can use the technology to brainstorm lessons, create presentations and quizzes, and provide feedback on assignments.

To understand the opportunities and risks, the university established an AI task force in spring 2024 consisting of six working groups. Their findings and recommendations are presented in the June 2025 report, Toward an AI-Ready University.

“In our AI world, every algorithm has a particular agenda, and there is no objectivity.”

– Kyle Smith, history of religions professor at U of T Mississauga

The Teaching and Learning Working Group, composed of 11 professors and administrators from across the university, made several recommendations, including: redesigning assessments to emphasize human critical thinking; developing all community members’ ability to use common AI tools and critically evaluate their outputs; creating standards for AI tutoring systems; and sharing effective practices for using AI as a teaching tool.

“We looked at the whole 360-degree process of conceiving and delivering a course – both how we teach and what we teach,” says group co-chair Susan McCahan, associate vice-president and vice-provost, digital strategies. “We considered the kind of literacy students will need going forward, how you might go about designing a syllabus, assessment planning, and using AI for grading and feedback.”

AI’s ability to produce coherent, polished copy about complex topics that is often – but not always – accurately sourced can challenge instructors in assessing the extent to which students understand course concepts. McCahan has observed that, in response, some professors are assigning fewer take-home writing assignments (to avert the risk of AI-generated work) and instead are emphasizing in-class writing tasks, presentations and group work, where it’s possible to limit AI use. They’re also allocating more marks to these projects.

The working group also determined that courses should promote students’ AI literacy within their specific discipline and their ability to detect algorithmic biases in AI outputs, which can omit or stereotype women, racialized people and other minorities. By 2030, the report envisions learning activities where students use both human- and AI-generated content to build their knowledge, interact with AI avatars in group work and critique results produced by an AI.

This last activity is already happening at U of T Scarborough, says Karen McCrindle, associate dean of teaching and learning and the director of the Centre for Teaching and Learning. She recalls a professor assigning students to evaluate the merits of an AI-generated answer about a course topic, grade it and reflect on how they might respond differently.

“The professor’s idea is, let’s see together what we can learn from this tool, what its capabilities are and have a conversation about what it can and can’t do,” says McCrindle, who also sat on the working group.

McCrindle and McCahan both note that how AI is integrated into courses will differ by discipline, and affirm that the institution entrusts professors to apply it as they see fit. They encourage faculty members to experiment with the technology and to take advantage of the university’s extensive AI training resources.

Illustration of an open book surrounded by interconnected horizontal and vertical lines on a purple background
Illustration by Melanie Lambrick

“Not all faculty are using AI or are comfortable using it. Change isn’t easy for any of us,” says McCrindle. But she adds that instructors should become familiar with it so they can “set the ground rules in the classroom for which uses are okay and which aren’t.”

Toward an AI-Ready University also emphasizes the need to build students’ capacity to use AI in ways that advance their own learning. For example, McCahan says, asking an AI chatbot to explain a course concept in a way a 10-year-old would understand is “deskilling,” as it does not support university-level learning. By contrast, a query that elicits a more nuanced response is “upskilling,” because it makes you “mentally struggle” and elevates your comprehension.

For McCahan, the growing prominence of AI in higher education underscores the role of our distinctly human qualities in meaningful learning. She sees a need to encourage students to lean into their critical thinking skills, creativity, ethical judgment, emotional depth, relationship skills and life experiences – elements AI may emulate but cannot authentically replace – to stimulate their intellectual and personal growth.

“We have to help students understand that. We should not underestimate the value of our perspective as humans,” she says. “The more AI makes us superhuman, the more we need to be ‘super humans.’”

Your AI study buddy

U of T is piloting a course-specific chatbot that can guide students through materials and answer their questions

As generative AI chatbots become increasingly common study aids, the University of Toronto is planning to create its own such tool for use across the institution – if a pilot project is successful.

The Centre for Teaching Support and Innovation at the St. George campus is helping to implement a chatbot agent for inclusion in each participating course’s homepage.

Students will be able to ask it questions about course content, while instructors can rely on it to guide students through course materials.

Unlike commercial chatbots, U of T’s version will operate as a secure AI application, protecting instructors’ intellectual property and keeping conversation histories private – while offering students curated course-specific learning materials.

“It’s like a thinking buddy that guides students in learning about a topic using an authoritative source versus random nonsense on the internet,” says Jordan Holmes, one of the project leads and the centre’s senior manager of teaching, learning and technology.

The AI Virtual Tutor has already been integrated with more than 90 courses across the three campuses. With students providing anonymous feedback, the tool is being steadily refined, with the goal of offering it after the current academic year in any course where the instructor believes it would be valuable.



Responses to “Teaching in the Age of AI”

  1. elisa freschi says:

    “Not all faculty are using AI or are comfortable using it. Change isn’t easy for any of us,” says McCrindle.

    Perhaps it's not that we resist change per se, but that we resist bad changes?

  2. Tim Pritchard says:

    Very interesting for a 1960s grad. This leaves me with one question. You say that U of T is planning to create its own generative AI chatbot for use across the institution. In developing this tool, what steps is the university taking to ensure that it does not contain its own biases that could taint the output used in guiding students?

  3. Thomas Hughson says:

    Thank you for this article. U of T is ahead of the curve in dealing with AI in university-wide, comprehensive yet also discipline-specific ways. I hope U of T's findings and curricular innovations become known as widely as possible in higher education in the U.S. and Canada.

  4. University of Toronto Magazine says:

    Dick Swenson writes:

    I wish to congratulate Prof. Kyle Smith, who said, "In our AI world, every algorithm has a particular agenda... there is no objectivity."

    Please don't lose track of this. It is absolutely correct.

  5. Valerie Liske says:

    Wonderful to hear about your plan to teach critical upskilling and AI literacy. It would be great if this approach could be shared with other universities and colleges so a strong, consistent foundation can be developed, tested and used across institutions, including in other countries.

  6. Maria Weber says:

    It's easy to bury our heads in the sand and hope that someone will simply solve the AI conundrum in education. I find it refreshing to learn how some educators are creatively using AI as a learning tool and an opportunity to teach essential critical thinking skills. Kudos to Professor Kyle Smith. We can only hope that more educators will catch on to this type of innovative teaching.