Teaching Writing in the Age of AI
Five principles for educators
By Jay Dixit
Roy Lee is an undergrad at Columbia University. Or at least he was — before he got kicked out for cheating.
In a much-discussed article in New York Magazine🔗, Lee describes how he used ChatGPT to write his essays instead of doing the work himself.
“I’d just dump the prompt into ChatGPT and hand in whatever it spat out,” Lee explains. “At the end, I’d put on the finishing touches. I’d just insert 20 percent of my humanity, my voice, into it.”
Why was Lee so unconcerned by the idea of casually delegating his college education to ChatGPT? Because, he says, most college assignments are “not relevant,” since they’re “hackable by AI.”
The underlying assumption is troubling: that if AI can do something for you, then it’s not worth learning to do it yourself.
Lee may be the most unapologetic example of this mindset, but he’s far from the only student using AI to generate writing. As one writing professor told me in the midst of grading final papers: “I’m tired of giving feedback to robots.”
To be fair, students are responding to the world we’ve built for them — a system that rewards output over process, speed over intellectual engagement — even as the messy, iterative work of developing ideas, testing assumptions, and grappling with complexity remains invisible and unrewarded. Stressed, overworked, and desperate to get good grades, students are turning to AI — not to deepen their thinking, but to crank out assignments when time is running out.
To make coursework feel more meaningful and relevant to students’ lives — and harder to outsource to AI — educators urgently need to rethink the way they design assignments. That can mean in-class writing, or assignments that require students to connect course material to current events, apply theory to personal experience, and synthesize insights drawn from in-class discussions. It can also mean having students present their ideas out loud, defend their arguments in real time, and submit brainstorming notes, multiple drafts, and reflections that document the process by which their thinking evolved.
But it all starts with taking students’ concerns seriously — and offering a real answer to what from their point of view is a reasonable question: “Why not outsource my assignments to ChatGPT?”
To put it more bluntly: In an age when machines can write, what’s the point of learning to write at all?
Writing = thinking
It’s a question that deserves an answer. And as a writer, writing professor, and OpenAI’s first Head of the Community for Writers, I believe I have a good one.
Learning to write isn’t just preparation for future writing. We learn to write so we can learn to think.
Writing professor William Zinsser put it best🔗: “Writing is thinking on paper.” Teach students to write clearly and we’ve taught them how to think rigorously.
As Ezra Klein points out🔗, none of us know what society will demand from today’s students 20 years from now — or what kinds of work will even exist. AGI is an event horizon we can’t see beyond, and its arrival makes the future fundamentally unknowable.
But if the future is unpredictable, then surely the solution is to prepare for unpredictability — to help students become agile, adaptable, independent thinkers ready for whatever comes next. No matter what kind of world they step into, these foundational skills will always be valued.
The first day of my freshman year at Yale, then-Dean Richard Brodhead addressed our incoming class with a message I still think about today. We weren’t in college to absorb information, he told us — we were there to develop our capacity to think.
As Yale🔗 puts it:
“The essence of such an education is not what you study, but the result: gaining the ability to think critically and independently and to write, reason, and communicate clearly — the foundation of all professions.”
Whatever the future holds, the ability to think clearly and communicate with precision will always draw people in and create possibilities.
Writes and write-nots
The danger is real, and the cost of failure is high. If students never learn to write, they’ll struggle for the rest of their lives to organize their thoughts, reason through complexity, and articulate nuanced ideas.
That’s what’s at stake if we don’t make sure students develop these capacities. As Y Combinator co-founder Paul Graham warns🔗, “The result will be a world divided into writes and write-nots.”
Is that outcome really so bad? After all, blacksmithing expertise faded with the rise of industrial manufacturing. Could writing be just another outdated skill?
Absolutely not, Graham argues, because writing isn’t just a skill — writing is thinking itself. As computer scientist Leslie Lamport puts it: “If you’re thinking without writing, you only think you’re thinking.”
“A world divided into writes and write-nots is more dangerous than it sounds,” Graham concludes. “It will be a world of thinks and think-nots.”
That’s why it’s still critical to teach students to write — even in a world where ChatGPT can generate vivid and graceful prose.
Students need our guidance
AI has transformed education, and students have had to figure out for themselves when and how to use it. Blanket bans just drive AI use underground, and telling students not to use AI at all doesn’t qualify as guidance.
So far, institutions have focused more on policing academic integrity than on supporting students — and professors are more interested in warning students away from AI entirely than in helping them use it wisely.
As a result, students are left to navigate these decisions without a roadmap. Some worry AI could weaken their thinking skills. Others spend their time scheming how to use it without getting caught. What they don’t have is a framework — a way to think through when AI supports learning and when it replaces it. Students deserve better.
As educators, we have the chance to shape how an entire generation relates to AI. Students need more than rules — they need context, guidance, and a clear framework they can use to make thoughtful choices.
Like any tool, AI can be misused. But used thoughtfully, it can also deepen understanding, sharpen thinking, and improve writing skills. Here are five principles that can help.
1. Look beyond academic integrity
Academic integrity is critical, but it’s just the starting point. When students focus more on “Will this get me in trouble?” than on “Will this help me grow?” we reduce learning to a compliance exercise.
Using AI thoughtfully begins with reframing the question. Instead of asking, “Is this cheating?” we should encourage students to ask themselves: “Does this way of using AI help me build my skills of critical thinking, analysis, and self-expression?”
As educators, our task is to help students move beyond compliance and encourage them to focus on genuine learning.
2. Define your own educational goals
Students everywhere are already using AI. Our job is to help them use it with intention.
College isn’t just about grades and credentials. It’s a unique opportunity to build the thinking, writing, and creative skills that shape who we are and how we contribute to the world.
Invite students to ask themselves:
What do you want to get out of your education?
Who do you want to become as a thinker, writer, and person?
A student might decide they want to learn to communicate complex ideas clearly — or become the kind of person who can walk into a room full of senior leaders and offer bold strategies that shape decisions. These are fundamentally different goals from “I want to get an A on this paper” — and they lead to different choices about how to use AI.
Remind students that college is the one time in their lives when learning is their full-time job. Urge them not to squander the opportunity to become rigorous thinkers and to leave smarter than they arrived.
3. Don’t outsource the thinking
There are ways of using AI that build intellectual skills and ways that bypass learning. It’s not that using AI inherently weakens thinking. The risk is that overrelying on AI can lead students to stop practicing the very skills they’re in school to develop. And practice is the path to mastery.
Every time students practice synthesizing information, organizing ideas, formulating arguments, and articulating their reasoning in ways others can understand, they’re building intellectual muscle.
Jane Rosenzweig, Director of the Harvard College Writing Center, recalls a student who used ChatGPT to generate arguable claims, supporting evidence, and outlines for his arguments. “In other words, he outsources his thinking to ChatGPT and uses it to create a product,” Rosenzweig explains🔗.
Encourage students to ask themselves: “Who’s doing the cognitive work — me or the machine?”
We need to shift students from merely asking AI for answers to using it to interrogate their ideas and deepen their understanding:
Outsourcing your thinking: “Suggest a nuanced thesis statement I can use for my essay.” ❌
Challenging your thinking: “Ask me tough questions to help me clarify my thesis statement.” ✅
Encourage students to think of ChatGPT not as a shortcut but as a thinking partner that questions, challenges, and probes, pushing them to articulate their ideas with greater precision.
4. Encourage students to do the hard parts themselves
Writing doesn’t just train thinking. It also builds grit — the ability to do difficult things even when it’s uncomfortable.
Students use AI to generate writing because it’s a fast and easy solution, explains University of Iowa instructor Sam Williams🔗. “But now, whenever they encounter a little bit of difficulty, instead of fighting their way through that and growing from it, they retreat to something that makes it a lot easier for them.”
Researchers call this “metacognitive laziness🔗.” If students get in the habit of giving up and turning to AI the second things get hard, they miss the opportunity to build crucial skills of persistence and focus.
As educators, we know learning happens in the struggle — that electric moment, standing at the threshold of insight, when students wrestle with uncertainty, search for clarity, and stretch to express their ideas. Psychologists call this “desirable difficulty” — the productive discomfort that fuels growth. Every time you push through that effortful, uncomfortable moment to articulate your thoughts clearly, your intellect grows more powerful.
So encourage students to do the hard parts themselves. Let them stare down the blank page. Let them push through the messy first draft. That’s the kind of work that builds thinking skills — and mental stamina.
5. Understand how your choices affect you
Used thoughtfully, ChatGPT can make you smarter, helping you surface your best ideas, challenge your assumptions, and expose your blind spots.
It’s all about how you use it. Think of AI use as a spectrum, from counterproductive uses that outsource thinking to beneficial ones that actively strengthen it.
Outsourcing the thinking: “Suggest an outline and main points for my essay.” ❌
Automating the grunt work: “Format my citations in MLA style.” ✔️
Expanding your thinking: “In my own words, here’s what I understood from today’s lecture. Please point out gaps in my understanding.” ✅
A recent study in Nature🔗 found that students who use ChatGPT strategically perform significantly better than those who avoid it entirely. The key is using AI for “problem-based learning” — applying knowledge to solve real-world challenges — and for the kind of metacognitive reflection that deepens understanding.
The takeaway is clear: ChatGPT can make you smarter — if you use it to challenge and deepen your thinking.
Encourage students to move beyond “Is this allowed?” and instead ask, “Is this helping me become a stronger scholar, thinker, and writer?” That’s where the real opportunity lies.
AI is already woven into students’ lives. The only question is whether we guide them to use it thoughtfully — or leave them to keep going as they are.
Jay Dixit is former Head of Community for Writers at OpenAI and is the founder of Socratic AI🔗, an AI education consultancy.