I was recently added to a WhatsApp group full of friends and former colleagues. A lot of folks in there are in the middle of job searches, so the group was set up as a way to share leads, offer support, and help each other out. By the way, reach out to me if you'd like to be added to the group. They're a great bunch of folks, and I'm sure you'll enjoy meeting them.
One of the first things people did was introduce themselves. And that’s where something odd started to happen.
Some intros were clearly polished: thoughtful, well-written, personal. A few mentioned that they used GPT to help shape what they wrote. Then came a string of replies that made a point of saying the opposite. "No GPT here." "Wrote this on my own."
I had already introduced myself by the time all this happened, and I haven't quite been able to articulate why I found the conversation disappointing. Not because I don't know what to say, but because I've been sitting with the weirdness of that moment. It wasn't about who used what tool. It was the tone. A quiet performance of purity. As if using GPT somehow made your words less real.
And to be honest, I think that’s the wrong way to look at it.
The truth is, I am openly using GPT in all sorts of ways. Sometimes to rough out a paragraph. Sometimes to reword something I already wrote but don’t quite like. Sometimes just to get past the blank page. That doesn’t make my writing less mine. If anything, it helps me hear it more clearly.
So when I see people drawing a line, not just saying they didn’t use it, but implying that made their words more authentic, I can’t help but feel a little unsettled. Not because anyone has to use these tools, but because of what it signals when you don’t, and you make sure to say so.
It reminded me of a moment from the early 2000s. There was this whole wave of "I don't watch TV anymore" energy. People weren't just skipping shows; they insisted on announcing it like a badge of honor. Like tuning out made them better. But all it really did was shut down the conversation.
This feels similar. Not using GPT is a choice. But using it doesn’t make you less thoughtful, or less original, or less human. It’s a tool. The work is still yours.
When people talk about using GPT, the conversation too often gets framed as either/or. Either you wrote it yourself, or the machine did it for you. Either you’re original, or you’re outsourcing.
But that’s not how it actually works.
The best use of tools like this is active. You put something in, you shape what comes out. You edit. You redirect. You delete what doesn't sound like you and keep what helps you say it better. That's not outsourcing. That's collaboration. This is the very premise we in the media industry keep repeating about AI as an assistive technology across every kind of work in the supply chain. Why can't we allow the same collaborative spirit in something as simple and fundamental as written communication?
And honestly, most of the time when people say they don’t like what GPT produces, it’s because they haven’t really engaged with it. They tried it once, didn’t love the results, and left it at that. But writing is a process. So is using a tool well. You don’t judge a camera by a bad snapshot.
This isn’t about turning creativity over to a machine. It’s about having a new kind of partner in the room and sometimes in your pocket. One that’s fast, tireless, and occasionally surprising. It doesn’t replace your voice. It helps you hear it more clearly.
That said, I’m no Pollyanna about AI. GPT can absolutely be reductive, repetitive, or just off the mark. And when that happens, it’s my job as the human in the loop to discard what doesn’t work and steer it in a better direction.
There’s also a bigger issue here, one that hasn’t been called out enough. Not everyone in that chat group, or any professional space, is working with the same baseline. English might not be their first language. Or they might be neurodivergent. Or just in a moment where finding the right words feels harder than usual.
For those folks, GPT isn’t just a writing tool. It’s a bridge. It helps them show up, express themselves, be part of the conversation. And when someone frames GPT use as a weakness or as somehow less “authentic”, what they’re really doing is making those people feel like they don’t belong.
That might not be their intent. But it’s still the impact. And if we’re serious about building supportive, inclusive communities, then it’s on us to notice that dynamic and change it.
That's the part I can't get past. You never really know what someone's working through behind the scenes. Maybe they used GPT because they wanted their intro to land well in a group of strangers. Maybe they were anxious. Maybe they just wanted to sound like themselves, but didn't trust the first draft. Whatever the reason, that choice deserves respect, not side-eye.
There’s a quote I’ve come back to many times, even used in my email signature at different points: “Everyone you meet is fighting a battle you know nothing about. Be kind. Always.” That applies here too. Especially here. Tools like GPT can be a lifeline for people trying to find the right words and trying to find their place.
The real issue isn’t who used GPT and who didn’t. It’s the posture. The little performance behind saying “no AI here” like it’s a moral stance. That’s the part that starts to curdle the room.
Once you do that, you’re not just sharing your own process. You are implicitly casting judgment on someone else’s. You’re saying, without quite saying it, that their choice is less valid. That your way is more authentic. That you did it “right.”
But writing isn’t a competition for purity points. Especially not in a space that’s supposed to be about support. If someone found a tool that helped them speak clearly in a moment where they might have felt unsure or overwhelmed, that’s not a shortcut. That’s a win.
Nobody’s asking you to use something you’re not comfortable with. But maybe don’t turn your preference into a posture. It helps no one.
Let’s be honest, in a few years, saying you don’t use AI will probably sound a lot like saying you don’t type. Or don’t use spellcheck. Or don’t watch TV because you’re above it. These tools are already part of how we work, communicate, and create. The question isn’t whether we’ll use them. It’s how thoughtfully we’ll choose to.
At the end of the day, I’m not going to think less of you if you use AI. I’m not going to think less of you if you don’t. But I might think a little less of you if you feel the need to call out someone else who does, like their choice somehow makes you better.
Because that’s not strength. That’s insecurity looking for an audience.
And we don’t need more of that in the room.