Are we slowly writing like AI without realizing it?

Lately I’ve been thinking about something strange.

We use AI every day now — to brainstorm, to outline, to rephrase, to summarize. Even when we don’t copy-paste, we read AI-generated answers constantly. Clean structure. Predictable transitions. Polished vocabulary. Balanced arguments.

And I’m starting to wonder…

Are we slowly adapting our writing style to match AI?

The other day I wrote an article completely on my own. No prompts, no rewriting tools, nothing. Just me and a blank document. It was structured, clear, maybe a bit formal — the way I’ve trained myself to write over the years.

Out of curiosity, I ran it through an AI detector (I tried aitextools just to see what would happen).

It flagged it as AI-written.

That honestly made me pause.

If we’re learning from AI tools, reading AI outputs daily, and absorbing their patterns… aren’t we naturally going to start writing in a similar rhythm? Cleaner sentences. Less randomness. Fewer human “imperfections.”

So here’s the real question:

If humans train AI… and then humans learn from AI… at what point does the distinction blur?

Will strong, well-structured, academic-style writing just start looking “too perfect” to detectors?

I’m not even arguing detectors are bad. I’m just genuinely curious about where this goes long-term. Are we evolving our writing — or standardizing it into something that looks machine-generated?

Has anyone else experienced this? Would love to hear your thoughts.
