A New York magazine article titled “Everyone Is Cheating Their Way Through College” made the rounds in mid-2025. I think about it often, and especially when I get targeted ads that are basically variations on “if you use our AI tool, you’ll be able to cheat without getting caught.” Suffice it to say it’s dispiriting.
But the problem is not that students are “using AI.” I “use AI,” and it’s something everyone needs to learn how to do. The problem arises when students represent AI’s work as their own. At a fundamental level, the question of academic integrity and the use of artificial intelligence in higher education is not technological. It’s ethical.
I love generative artificial intelligence and use it for many, many things. Workouts. Recipes. Outlining and revising articles and lectures. Multiple-choice questions. Getting the code I need to tell R to turn a spreadsheet into a bunch of graphs. Tracking down citations. And much more. The possibilities are endless. Used wisely, it multiplies productivity. Used foolishly, it multiplies folly. Debates about academic integrity and artificial intelligence force us to really reckon with who we are and what we’re doing.
The debate has split into unhelpful camps. One compares AI to a calculator. Another sees AI as the end of human thought. Both miss the point. The “just a calculator” crowd ignores how calculators and related software tools, as useful as they are, have relieved us of many of the burdens that come with thinking quantitatively. “It’s just like a calculator” is (kind of) true, but it’s not reassuring. Knowing which buttons to press to make a parabola appear is not the same thing as knowing what a parabola actually is and why it’s meaningful. The “end of thought” crowd ignores how generative AI is a powerful tool that can be used wisely. Is it an assistant? That’s great. Is it a substitute? That’s not.
The problem, though, is not the tool. It’s the user. People can use AI wisely or wickedly, just like they can any other tool. In the hands of Manly Dan from Gravity Falls or Paul Bunyan, an axe is a tool used to fell trees and provide shelter. In the hands of Jason Voorhees from the Friday the 13th horror franchise, it’s a tool for something else entirely.
In 2023, just as we were meeting and getting to know our new AI overlords, I wrote an article responding to the cynical student asking, “when am I ever gonna use this?” about the humanities and other studies that aren’t strictly vocational. My answer was (and is) “literally every time you make a decision.” Why? The decisions you make are a product of the person you are, and the person you are is shaped by the company you keep. Studying history, philosophy, literature, economics, and the liberal arts more generally is an exercise in keeping good company and becoming a certain kind of person: one who has spent sufficient time grappling with the best that has ever been thought and written to be trusted with important decisions. It is to become a person who has cultivated the art of judgment.
It’s an art we can practice poorly in a world where it’s trivially easy to outsource our thinking to ChatGPT and Gemini. Here’s an analogy. If you’ve never seen the movie Aliens, drop everything and watch it. It’s a classic among classics. If you have seen it, consider the end of the movie, when Sigourney Weaver’s character, Ellen Ripley, dons a P-5000 power loader suit to defeat the alien queen. She uses a tool that amplifies her strength, enabling her to accomplish what would otherwise be impossible.
The way many students use AI is much like wearing Ripley’s power loader suit to the gym. You might be able to “lift” 5000 pounds in the power loader suit, but it’s a mistake to think the suit is making you any stronger, a laughable self-deception to think you could lift 5000 pounds without it, and a transparent lie to tell anyone else that you can. When you hand in work that’s mostly AI-generated, you’re not building muscle, learning to lift, or getting stronger. You’re racking up huge numbers while your muscles atrophy.
Sometimes, of course, using AI is like having a spotter when you’re doing squats or bench presses. I use AI in the gym as a trainer of sorts that tells me which exercise to do next. That’s one way to use AI, but the way too many students use it is like going to the gym and having the AI tool, the power loader suit, lift the weights for them.
Tools like ChatGPT, Gemini, Grok, and Claude should free up our time and energy to do higher-order work, not hide the fact that we can’t. Technology has made me significantly more productive: I dictated the original version of this essay into Google Docs on my phone using wireless earbuds, and then revised it using Gemini and Grammarly. What’s the difference between that and submitting AI-generated work? Using dictation tools and AI to generate and clean up an essay like this is like using Ripley’s power loader to move heavy stuff. Using AI to create text and trying to pass it off as your own is like using Ripley’s power loader suit to fake a workout.
I thank ChatGPT, Gemini, Grammarly Pro, and GPTZero.me for editorial assistance.
READER COMMENTS
steve
Jan 15 2026 at 12:59pm
The problem is that students can use AI to do their work and, depending upon their course or school, the risk of being caught is low. I have no doubt that many kids are using AI as an aid but are still doing their own work, but we know many are using it to do the entire assignment. At this point the only recourse is to hope that another AI is somehow able to tell that an AI did the student’s work.
I hope people realize that this stuff starts well before college. High school students are doing this and before there was AI students used chat rooms and online “friends” to do their graded homework. Personally, I think graded homework should be minimized.
Steve
Tyler Watts
Jan 21 2026 at 9:47am
Very well said, Art! I was thinking about writing something along these lines, but now I’ll just refer people to this article.
I too am a big fan of AI and use it as a very effective research aid, but fortunately for me I came of age before AI and learned to write well through a rigorous liberal arts curriculum at Hillsdale College. I am encouraging responsible use of AI tools, but I’m warning that over-use of AI, e.g. farming out an entire project, soup to nuts, to AI, is actually stultifying and will hurt you in the long run.
To extend your Aliens metaphor: using AI to create written assignments is like a weightlifter using steroids. Impressive short-term benefits (instant paper, minimal effort!) come with bad long-term side effects: roid rage and myriad health problems for steroids; for AI addicts, a permanent job/career disadvantage once it’s exposed that you are not an independent thinker. The AI abusers will not be able to think on their feet or communicate (written or oral), and will be qualified for little more than AI-assisted clerical duties, which themselves will most likely be supplanted by the next round of AI developments.
Let’s hope for (and work towards) a renewal of true liberal arts, broad-based education! I have a strong sense that one of the things AI will never match or replace is the well-rounded, historically and culturally well-informed, critical-thinking student. That, and plumbers.
🙂