Table of Contents

Should I use AI in academic writing?

Writer John Warner on AI in academic writing

How academics are using AI now

ChatGPT as a thinking partner

Even with AI in academic writing, the process remains personal

A new kind of literacy emerging with AI in academic writing

Conclusion

AI is going to change everything, but it’s going to be a partnership between humans and AI, not a competition. ~ Mark Cuban, American entrepreneur
    “Should I use AI in academic writing?” It’s a question I hear often from clients these days, usually in reference to ChatGPT, which many of their students are using. Whether it’s professors struggling to create assignments or facing a revise-and-resubmit on an article, department chairs grappling with faculty evaluations, or administrators drafting a newly required report, all are asking this question and wondering about the legitimacy of using AI tools. Reactions range from skepticism to hopeful curiosity. Academic professionals juggle multiple priorities with limited time, and many hope AI tools can make writing easier and quicker. But there is a deeper question underneath: What does it mean to write in the age of AI?

    Writer John Warner on AI in academic writing

    Academic writer John Warner is a columnist for Inside Higher Ed and the Chicago Tribune, as well as the author of several books. He addresses the question of AI in his most recent book, More Than Words: How to Think About Writing in the Age of AI (2025). Warner is known for his student-centered approach to writing and pedagogy, and he doesn’t reduce the AI conversation to binaries (good/bad, real/fake, helpful/harmful), which seems to be the general tenor of many discussions about AI right now.

    Instead, Warner invites us to think deeply about what writing is for, who it is by, and how tools like ChatGPT can serve, not replace, human thinking. He asks, “Do we just want to be prompt engineers and send AI answers out into the world without critical thought?” In his view, if we choose to do this, we are letting machines do the hard work of thinking for us. We need to understand what he calls the “root value” of learning and use writing to learn, not feed more pablum into the internet ecosphere.

    How academics are using AI now

    Though ChatGPT seems to be the generic name most equated with AI, at least 15 other models do the same kind of generative AI work, including Anthropic’s Claude, Microsoft’s Copilot, Google’s Gemini, and Perplexity. The surveys cited below do not specifically identify which AI tools are being used, but they do provide some statistics about how academics are using AI now:

    • An Intelligent.com survey of 1,000 US high school, undergraduate, and graduate teachers and professors in April 2023 (ChatGPT launched in November 2022) found that 98% frequently or sometimes use ChatGPT to draft lesson plans, student feedback, emails, and recommendation letters.
    • In July 2024, Elsevier/Times Higher Education released a survey of over 2,200 global researchers, which found 31% of faculty using ChatGPT in their professional work, from drafting text to brainstorming ideas. STEM instructors outpace humanities instructors in using AI.
    • Tyton Partners’ “Time for Class” report, released in June 2024 and based on a survey of 1,800 higher education instructors, found that 61% of faculty reported using AI in teaching, but 88% did so minimally. Common uses included generating teaching materials and providing feedback.
    • A global meta survey with a matrix summary of findings by the Digital Education Council released in March 2025 found that 86% of students use AI in their studies, with 54% using it weekly and nearly one in four using it daily.
    • The same DEC survey reported 93% of higher education staff expected to expand their use of AI for work purposes in the next two years.

    These numbers reveal a significant reality: many professors and educators are already using AI tools regularly, especially for administrative and teaching tasks. It’s currently unclear how many academics use AI to write up research results, though the share may be higher than we might assume based on the numbers above. It seems clear, though, that the number using AI in academic writing is going to grow.

    Warner cautions against handing over generative thinking to AI without scrutiny or editing. Regarding the statistics cited here, I personally checked all the sources cited by ChatGPT and excluded the ones I felt were inaccurate or that did not provide a release date for the data. As authors, we should be responsible for questioning where the data come from, what is assumed, and whether the source aligns with our intent.

    ChatGPT as a thinking partner

    I view ChatGPT as a partner in writing, not as a replacement for my own writing. For instance, it was my own idea to write a blog that incorporated some of John Warner’s ideas because I have been reading his work for years, he’d recently spoken at a local public venue, and I ran across an interview between Scott Carlson and John Warner published in February 2025 in the Chronicle of Higher Education. This string of occurrences prompted me to ask, “What would I like to say myself about writing in the age of AI?”

    In working with ChatGPT and other AI programs, I have constructed various prompts. I find AI extremely useful for generating an outline, giving me ideas for good source material, or increasing clarity around the question I am asking. Asking “What perspectives am I overlooking here?” returns concrete suggestions: ideas and angles that may not have occurred to me on my own. In that sense, it functions like a trusted colleague offering fresh eyes on a manuscript.

    Even with AI in academic writing, the process remains personal

    In More Than Words, Warner insists the core thinking remains ours. I echo Warner’s call to return to first principles of writing. AI can spark ideas, but I must still ask myself: “Does this piece reflect my intent? Does it reflect what I believe? Does it align with my voice? Does it serve my academic and general readers?”

    Like Warner, I watch the ads for AI put out by the big tech companies and find myself thinking, “Just how stupid do they think we are?”

    For example, one ad for the iPhone 16’s Apple Intelligence tool shows a sloppy guy with a blank computer screen plopped down in his chair, pulling tape out of a dispenser and swinging a chain of paper clips around, until he must answer an email about a project’s status. He dictates the answer into his phone in mildly offensive, grade-school language, then has Apple Intelligence clean up the email to “professional” style before sending it.

    The resulting email is nothing if not bland, while the user congratulates himself on being a genius for using the AI tool. Really!? I think most writers know they can do better than the ad suggests and would prefer to think for themselves, even for email communication. (As an aside, the Apple Intelligence rollout has not gone well; users have reported software glitches, slow performance, and features that don’t work as well as hyped.)

    Writing, for me and other writers I know, is how you figure out what you believe and how to frame it so others can understand your subject or point of view. In the interview with Scott Carlson, John Warner says he has become less interested in using AI tools over time, because a summary of information from a variety of sources is less useful to his thinking deeply about a subject than a single source espousing a particular point of view. As long as our thoughts remain our own and we are allowed to think freely, we should keep writing.

    A new kind of literacy emerging with AI in academic writing

    In More Than Words, Warner calls this moment an “epistemic disruption,” a shift in how we think about knowledge creation. Learning and writing are not just about creating the best-engineered prompts. We can take AI’s results, spend time reading source material more deeply, and help ourselves think more critically as we embark on a writing project.

    Instead of banning AI, we can model its critical use—teaching ourselves and our students how to ask good questions, evaluate AI suggestions, and claim ownership of our final drafts. For students, revising an AI-generated draft and writing reflective commentary on the changes can be a powerful exercise in critical thinking. Check out my previous post on six strategies to use when letting students use tools like ChatGPT.

    Ultimately, incorporating AI into writing routines isn’t just about efficiency. It’s about a new kind of academic literacy: developing the discernment to know when a tool can help, why it’s being used, and how to stay rooted in our own thinking while using AI tools. If we do not, we risk a circular regurgitation of AI writing that draws only on old material. Computer scientists submitting articles to journals have already run into this problem: AI writes, edits, evaluates, and rewrites material, and that writing comes back into the world as if it were new. The critical thinking of human authors is essential to generating new ideas.

    Conclusion

    Tools like ChatGPT are here to stay. Using AI to brainstorm, to get temporarily unstuck, and to play with perspectives you had not previously considered is an excellent way to explore just how useful these tools can be for you. Our challenge in using AI in academic writing is to integrate these tools thoughtfully and not shortcut the hard work of thinking. Rather than focusing on the product (an essay, an article, a book), we should stay focused on the process of inquiry. We must do the hard work of thinking for ourselves.

If you still need help with your writing, schedule your free 20-minute, no-obligation session with Hillary.

 
