How Generative AI Is Changing College Classes
A sea change is underway in classrooms everywhere this fall. With the ability to generate believable, human-seeming writing and verbal reasoning—not to mention functional lines of complex code—artificial intelligence (AI) tools like ChatGPT and GitHub Copilot are disrupting age-old models of instruction, assignments, and assessment.
This flood of new technology is raising questions about how teachers should prepare their students for a future in which using AI will likely unlock significant gains in creative productivity and offer a competitive career advantage, but could also pose an existential threat to the value of the arts, and to truth itself. Of course, it's impossible to untangle all of these issues just yet; time will tell. However, the pedagogical shifts educators make in the early days of this AI revolution could have a significant impact on how their students learn to live—and hopefully thrive—with these new tools.
With all this in mind, we spoke to four Berklee professors about how they were thinking about the role of AI in their classrooms, how it's changing what and how they teach, and how their students are responding to the new technologies.
Flipping the Classroom
Akito van Troyer, an Electronic Production and Design Department faculty member, teaches courses in creative coding for musicians at Berklee. He says that the coding capabilities of these AI tools have significantly shifted his approach to his assignments. "Any problem set I give to students, ChatGPT will be able to instantly solve it at this point," he says, "so I'm not counting on that." This means he's decided to take a "flipped classroom" approach.
A flipped classroom, as Berklee cultural studies professor Lori Landay explains, means "doing in class what can only be done when we're together in class and keeping presentation of material for homework." Students might watch a recorded lecture and review lesson materials outside of class, and then spend class time applying what they've learned.
Van Troyer is actively incorporating AI into this approach: "Outside of class, they use AI tools to learn to teach the topic to others," he says. "Then they come to class and really have to understand the topic to teach [it] to other students." In the flipped classroom model, the instructor is a facilitator, available to provide help, clarification, and expertise, while students take an active role in both learning and teaching the material.
"AI can really help with that in every single step. Definitely for getting the concepts right, but also for preparing the teaching materials, from the presentation to the exercise you can share with students to work on in class," van Troyer explains.
A Boring Student
Landay, who teaches in Berklee's Liberal Arts and Sciences Department, intends to introduce a similar approach in her media studies classes, as a way to "foreground the thinking and synthesis skills that are applied working in class on tasks that are implied in the writing assignments." However, she continues to stress the importance of high-level conceptual thinking and ideation for her students.
If they want to use generative AI, she says, "they need to bring their ideas to it, rather than relying on it to give them ideas." This is partly because these are important skills for college students and artists. But it's also because—for now, at least—ChatGPT's not as good a writer as Landay's average Berklee student.
"During my sabbatical," she says, "I had the time to put all my assignments into ChatGPT to see what it would generate. Mostly it generated boring work below the capabilities of my students."
A Loyal Assistant
What AI tools can do, however, is serve as valuable creative aids, helping students (and their instructors) dispense with some of the more tedious tasks associated with writing, coding, learning, and teaching, much as calculators have done for long division and square roots in mathematics.
Stephanie Kellar, an associate professor in the Music Business/Management Department, says that she's allowed students to use AI in her marketing classes to "help generate an initial list of product names and taglines." She also cites students in her class who use AI to help compose songs for a music production course, "since they are concerned with developing production skills, not composing skills," or who use it to help them compose professional emails, or who run their essays through Grammarly [an AI writing assistant], which "shows them how to actually improve their writing skills."
Li, who teaches songwriting, says she uses it "for all sorts of random tasks—pairing up my students for cowriting, giving me prompts for song briefs, or helping me search for song titles that fit specific criteria, [for example]: 'Give me 20 hit songs that use a single adjective as the title.' It is a great help for making slides." She also says it's a "good research supplement for Google," though some of its facts need to be checked—"it can definitely fabricate truths." She encourages her students to look into how they can use the technology as well, "from proofreading their résumé [and] writing their artist bios to getting synonyms and rhyme ideas."
Doing More with Less
In van Troyer's classroom, the potential of this new creative assistant could be transformative. "The gap between the imagination and the output is becoming smaller and smaller," he says.
"Students are going to be able to do more with less time while they're at Berklee. . . . AIs can take care of these minor detail programming tasks, and then students can focus on more high-level stuff." That means instead of writing one file in a program, "maybe I can ask them to do multiple programs that work together."
Landay, too, sees immediate potential for AI to help students close the gap between imagination and output—for example, "how they can turn a sketch into an illustration" using image generation AIs, even if they lack visual art skills. In her courses for the Game and Interactive Media Scoring major, she says she's "incredibly excited by how students can use AI tools to fill in a game design document . . . based on a structure and ideas they provide."
Ready or Not
All four professors agree that these tools are going to change the future their students are stepping into—like it or not, ready or not. And while there are tremendous benefits to using AI to our advantage, there are also great risks in being caught unaware or unprepared as the technology becomes widespread.
"AI is here to stay," Kellar says, "but I also caution [my students] not to become too reliant on it. It would be problematic if someone queried your writing and you couldn't answer because it was AI-generated." However, she says her students take the attitude that "you either figure out how it can assist you or you'll be left in the dust. . . . [It's] akin to record labels who thought a little data compression algorithm couldn't hurt them until it turned their hard goods into binary code."
This pace of industry transformation helps explain why Li is particularly focused on ensuring her songwriting and production students don't fall behind the times. "I am concerned about [AI] not making its way into classrooms fast enough," she says. "I think we need to first engage with it before we can understand its risks and downfalls."
Van Troyer echoes the career implications of new generative AI tools: "It's really important for all teachers to think about this. . . . There's no way for us to stop students from using this. In fact, it's better if they use these tools, because it's going to matter for their jobs."
A Calculator for Creativity?
Landay, for her part, acknowledges the positive use cases for this technology, but expresses some reservations as well. She says that she's going to continue to "start from a position of trusting the student" when it comes to issues such as plagiarism—but also that she would "undoubtedly be furious" if she discovered she'd spent her valuable time "carefully grading an essay written by ChatGPT. I feel like ChatGPT should grade what ChatGPT writes, not me!"
More deeply, Landay's reservations concern "a wholesale challenge to truth, in and out of the learning environment." Will authentically human expression continue to matter in a world where machines impersonate people so convincingly in deepfake videos and visual art and writing that we can no longer trust what we hear, see, or read?
Digital policy solutions such as "watermarking and labeling, so that people can see that content has been altered" and citation of all sources, "including AI," might get us part of the way there, Landay says. Ultimately, though, we humans are still left with some of that old-fashioned "thinking and synthesis" she continues to expect of her students.
"The question comes down to this: Is making music, or art, or developing games like math, which a calculator can do? Or is it what only a human can do?"
It's a question ChatGPT won't be able to answer for us.