AI Could Save Education (But Not In the Way You Think)

For those who weren’t paying attention, November 2022 may have come and gone like any other month, but secondary teachers felt the tremors. Researchers at OpenAI had been developing a generative AI system for years, but it only took five days for ChatGPT to reach one million users. It reached 100 million users in two months. In the blink of an eye, students had a new power: the ability to partner with artificial intelligence to craft seemingly informed thought and coherent written work–without doing any of the thinking or the writing themselves.

Since that moment, we have quietly entered an AI arms race. ChatGPT has been integrated into Bing’s search engine. Google launched Bard. Human speech can be deepfaked from just a few seconds of original audio. DALL-E can create novel visual art and photographs from a few language prompts. Just this week, one of my students mentioned that she loved Rihanna’s “new stuff.” It’s been six years since Rihanna released an album, so I was intrigued. It turns out that she wasn’t listening to Rihanna at all, but to an AI-produced track that cloned Rihanna’s voice to cover songs by Drake and Beyoncé. I interjected, “But that’s not Rihanna.” The student looked at me like I didn’t get it. “I love Rihanna,” she said. “It doesn’t matter to me.” This is the world our students are living in. And this is just the tip of the spear.

One of the organizations paying close attention to these developments is the Center for Humane Technology. Building on The Social Dilemma, a documentary about social media’s amplification of societal harm, the Center’s co-founders, Aza Raskin and Tristan Harris, have recently turned their attention to generative AI. In a recent episode of their podcast, Raskin shared that just a few years ago, these tools were novelties. He thought, “‘Oh, this is neat, this is fun. This is a toy.’ And what's really happened in this last year is that it's gone from a cute toy to something that feels very powerful, very usable, very here right now.” Harris, a cogent thinker who isn’t prone to hyperbole, calls this moment “a new paradigm for humanity.”

You had better read this piece quickly. Given that generative AI’s growth curve is exponential, it is likely to be out of date in a matter of months, if not weeks. Raskin and Harris would go further. At a recent convening of AI experts, they explained the mechanics that will likely send the growth curve into “double-exponential growth.” (Their 45-minute presentation is both important and intensely urgent.) Blink, and the tech landscape will have changed. But this is no time to blink. As a teacher, I’m acutely aware that the tasks our institutions have asked students to do for generations can now be completed with a workaround. It certainly seems like the practice of teaching and learning itself is at stake.

Generative AI isn’t infallible, and much has been written about its inaccuracies, hallucinations, and tone deafness. Pieces have explored how it can increase inequality by concentrating the biases that already proliferate in the Wild West of the internet. Others go further and explore its capacity to amplify and accelerate societal breakdown. And yet despite its myriad flaws, it is a powerful, publicly available tool that can supplant student writing. Each new version does that work with greater proficiency. So where does that leave teachers? Where does that leave teaching itself?

Right on the heels of ChatGPT’s release, a vociferous debate unfolded among educators. On one side are those who are deeply unnerved. They argue that AI gives students the wrong kind of power, weaponizing a simple algorithm to distribute a cheat code to the masses. Unless we wall ourselves off from these forms of AI, we will never again know whether the work we are reading is an authentic student product. Practically, many have argued for banning ChatGPT and other generative AI tools outright.

On the other side of the debate are those who believe that we should fully embrace this new tool. Some educators, such as Ethan Mollick from UPenn’s Wharton School, have even gone so far as to require the use of ChatGPT. Why deny ourselves a tool that can enhance our performance and improve our efficiency, he argues? And why play ethical whack-a-mole? Some people will always find ways to cheat. (Before ChatGPT, students could already pay ghost-writers in Kenya to produce their essays; in a twist of fate, the algorithm may put those ghost-writers out of a job.) The best way to ensure a level playing field, those on this side argue, is to welcome and integrate this new power.

While they are seemingly opposed, these two sides actually agree on a fundamental premise: that the way we have structured teaching and learning itself is unassailable. School is a place for students to learn content, and then demonstrate their knowledge. Whether we raise the ramparts to keep out the AI barbarians, or whether we integrate AI’s superpower, these two sides agree about the underlying mechanics of education – mechanics that haven’t fundamentally changed for the past 150 years. 

For the vast majority of students, educational innovation has not trickled down. If a 19th-century educator were transported to a classroom today, most of it would look familiar. New technologies haven’t delivered transformation either; despite their promise, there has been little overall movement in what teachers expect, what students do, and what school is for. The internet makes the Great Library of Alexandria seem like a book cart, but we still ask students to harvest information and then show that they know it. Today’s students may have laptops instead of slates, but they are still writing five-paragraph essays. Paperless classrooms may have streamlined workflow, but students remain on a never-ending treadmill of content and coverage.

To survive the paradigm shift that generative AI will catalyze, the education system needs to evolve, and not merely by adding more technological tools. We need another way. For the past twenty years, I have been curious about how to build resilient, learner-centered systems. In the age of AI, that goal is more vital than ever…which is why it has been gratifying to identify the contexts in which education is safe from this technological storm surge. One of those contexts is the Burlington City & Lake Semester (BCL), a program I co-founded and in which I now teach, where public high school students step away from their conventional schedule and explore the city as both classroom and curriculum.

Another is Kentucky, where I have been working with 2Revolutions to help teachers integrate real-world learning into their practice. In Louisville, teachers are engaged in a generative Community of Practice. That momentum has recently expanded to 15 Ohio Valley counties, where plans for further innovation are currently unfolding. This work can happen anywhere. In a small city in Vermont, in a sprawling urban district, and in a consortium of suburban and rural counties, students’ learning is more and more connected to the world.

Why does it feel like ChatGPT has the potential to upend traditional school models, while in BCL or in 2Revolutions’ Kentucky partnerships it feels irrelevant? The answer is that real-world learning invests in something that no AI can scrape from the internet: lived experience itself. In places where content knowledge remains the primary metric of student learning, ChatGPT, Bing, Bard, and their fellow algorithms are rapidly eroding trust and threatening the status quo. Ironically, however, the rise of artificial intelligence offers schools everywhere a unique opportunity to demote mere knowledge and to elevate something far more powerful: understanding.

This isn’t the moment to ban AI out of fear, nor is it the moment to embrace it in order to amplify students’ ability. Instead, this is the moment to shift educational practices in a way that makes the debate about AI’s threat obsolete. Broadly, we need to transform a massive system built on content delivery into smaller systems based on lived experience. What follows is a collection of approaches that deepen and re-humanize both teaching and learning.

Real-World Experience

Every place has relevant, complex, interdisciplinary issues to explore. Conventional public education curriculum is often completely divorced from the places where students live. Reconnecting learning to the real world shifts the purpose of education from “You’ll use this one day” to “This is happening right now.” Issues are unfolding everywhere, all the time: in small towns, in city neighborhoods, upstream and downstream. What is real is a richer curriculum than any textbook could contain, and more relevant than any teacher-designed simulation. The world itself is far too fascinating for us to spend a decade “preparing students” to enter it. They are ready to learn in it, and from it, right now.

Interestingly, Kentucky law has rolled out the red carpet for this approach for decades. With more recent initiatives, including United We Learn, the sky is the limit for Kentucky’s educators. At their recent Celebration of Learning, Louisville teachers who took part in 2Revolutions’ Community of Practice shared inspiring projects. In one, students applied math concepts to renewable energy design and presented their findings to three engineers and the school district’s Director of Facilities. In another, students studied environmental injustice and then volunteered to help distribute much-needed equipment to Eastern Kentucky communities devastated by flooding. In a third, students collected local oral histories from the era of segregation and mapped them using ArcGIS. In each of these examples, authentic insight is held by the learners. Generative AI can answer questions whose answers live on the web, but it can’t make meaning of real-world experience.


Open-Ended Inquiry

There is one question that can never be answered with a web search. It’s a question that pervades the learning environments of early childhood and kindergarten: “What am I wondering right now?” Studies have shown that although inquisitiveness is a natural human trait, it atrophies over time. However, even though conventional schooling may erode curiosity, human inquiry is irrepressible. When we ask students what they are curious about, their responses are boundless. This natural fuel can be applied to inquiry cycles, in which students begin with open-ended questions and end with different, deeper, more nuanced open-ended questions. Expansive curiosity is profoundly human. AI can make connections, even propositions, but it can’t wonder. No algorithm can replace the generative growth of a curious mind.


Authentic Purpose and Audience

A colleague once said something that has stayed with me for years: “There’s nothing more motivating than hearing someone say ‘We need you.’” Most of the time, students don’t hear these words until after they graduate – if they hear them at all. What a waste. There are innumerable opportunities right here, right now, for young people to be taken seriously as partners. Our task as educators is to identify those opportunities, and then to create the conditions for students to step in and step up. Two of the most powerful levers for deepening learning and real-world impact are easily available. The first is authentic purpose – when learning serves a real need. Local needs are close at hand; to find them, we need to shift from designing backwards from student outcomes to designing backwards from potential impact. The second is authentic audience, which levels up growth even more. Students are used to teachers being their only adult audience…but when professionals, experts, decision-makers, or everyday citizens and neighbors are present, magic happens. Discourse elevates, insights flourish, and the quality of listening deepens. The only people who can make meaning of the experience are the ones who are present. Here, AI has even less to offer, because all of the wisdom is in the room.


Collaboration with Community Partners 

For 99% of human history, if you wanted to raise a young person to be a successful adult, you offered them time with successful adults. Today, students spend their school years in classrooms, and the only successful adults they are in contact with are…us. Collaborating with community partners reconnects students to a world of adult professionals whose jobs, experiences, dilemmas, successes, and questions would otherwise be invisible. We need to create the conditions for authentic connection. No algorithm can meaningfully build a relationship with a community partner. When students integrate insight from those partners into their writing and projects, their learning comes to life. No “natural language processing system” can quote a person whose point of view was gathered in authentic conversation, let alone apply that perspective when synthesizing larger ideas. Human context is too rich to be synthetically processed. By increasing relational richness, we empower learners to share what emerged from those connections.


Assessment for Learning

GPT-4 can score a 1410 on the SAT. It earns 5s on ten different AP exams. It scores in the top 10% on the bar exam. In a feat that baffles even AI experts, it has taught itself research-level chemistry. And yes, it can also write cogent, compelling essays on just about any topic. With an extra prompt, it can do so in the style of Emily Brontë, or Carl Sagan, or Dr. Dre. (Apparently, AI can also sing like Rihanna.) Even before the double-exponential curve kicks in, this technology can already complete the tasks that teachers routinely assign as summative assessments. Others have written about how to adapt assignments to outsmart the algorithm. I will join the small chorus and argue that assessment itself needs to change.

For generations, teachers have evaluated how close students are to a teacher-centered standard. However, it is precisely these standardized and reliable metrics that are the most vulnerable to AI. There has never been a better moment to reinvest in validity. We need to redesign our assessment system to focus instead on learning. This may sound abstract and unattainable, but any of the following high-leverage approaches can easily be used in conventional classroom settings. 

  • First, consider investing in metacognition. In BCL, students frequently pause to reflect on what they are learning, and to surface insights, questions, connections, and tensions. There is no substitute for making learning visible, and it’s best to make it visible as often as possible. When students journal, they share their writing with peers. When a unit ends, we build in time for synthesis. The more opportunities students have to name what’s alive for them, the better. AI may be able to research, synthesize, and report, but it can’t hold up a mirror to itself. There is no greater measure of learners’ growth than learners’ own reflections on that growth. This is a mindset, and no matter how advanced AI’s “mind” becomes, only humans can cultivate self-awareness.

  • When students first join the Burlington City & Lake Semester, they arrive having been taught to write with austere, sanitized objectivity. So it’s countercultural when we introduce a unique form of analytical writing that asks students to actively synthesize both independent research and real-world experience. They are encouraged to make substantive connections between sources (articles, videos, essays) and real-world observations, encounters, interviews, and activities. In short, their experience itself becomes an essential text. Generative AI can synthesize information from hundreds, even thousands, of sources, but it can’t have experiences. ChatGPT could research housing policy, for instance, and produce an interesting analysis. It could even include a few juicy quotes. But only a human could interview someone in their neighborhood who vulnerably shares that the city’s housing crisis is forcing them to move. The closer students’ work products are tied to their actual experiences, the richer the result, and the greater the distance between them and their smart-but-sanitized digital competitors.

  • Finally, it’s worth investing in personal meaning. AI systems can easily describe the what – but humans remain uniquely positioned to get to the so what. Beyond merely explaining facts or details, we should ask students to explore what information or knowledge means. Anyone can write a milquetoast essay about a well-known topic – even ChatGPT. In BCL, students are actively encouraged to integrate their own perspective and values. They are even encouraged to write in the first person. An entire criterion of BCL’s proficiency-based (competency-based) writing rubric is devoted to Personal Voice & Perspective.

    I often tell my students, “The best writing is the piece that only you could write.” After years of removing themselves from their writing, it can be awkward for students to rediscover their voice, but once the stone is rolled away, they feel more and more comfortable making personal connections. This is meaningful for me as well. At the end of the day, what I care most about is how learning has changed the learner. When students wholeheartedly reflect on their learning and share how it has changed their perspective, their values, and their worldview, they are re-humanizing the act of writing. Generative AI doesn’t have a worldview, and that is exactly why we should center our learners’ values.

When the Burlington City & Lake Semester was designed, AI was a novelty, not a competitor. The program wasn’t designed to defy or counter a technological threat; it was designed to increase the depth and complexity of learning and to increase students’ impact and empowerment. These elements were always valuable, but now that our society is staring into the technological abyss, they feel essential. The good news is that they can be applied at a variety of scales. Even if a school district can’t design and implement a full-fledged program like BCL, any investment in relevance and authenticity will pay dividends. The Kentucky teachers engaging in 2Revolutions’ Communities of Practice have seen this firsthand. Even small prototypes can have an immediate impact. When we quilt these smaller efforts together, we can fundamentally change what school can be.

Another option, of course, is to do nothing. We could wait, and see how this all plays out. But the longer we cling to a teacher-centered delivery model and its standardized outcomes, the more likely it is that both student tasks and traditional assessments will become obsolete. Instead, we have an unprecedented opportunity to recenter learning itself. This is within reach. Right inside the source code of place-based, experiential learning are the tools that can help us all weather the coming storm. We just need to make teaching, learning, and assessment so authentic, personalized, meaningful, and purposeful that AI is only useful when it’s time to polish grammar and spelling.

Dov Stucker, 2Rev Coach

Dov Stucker is a teacher, community organizer, and innovator. He is a co-founder and Lead Teacher of the Burlington City & Lake Semester (BCL), a place-based experiential program in which Burlington (VT) High School students use the city as both classroom and curriculum. He also supports district-wide efforts to expand opportunities for deeper learning and flexible pathways.
