A Wired column warns that using AI tools like ChatGPT to write news may save time, but it can also weaken reporting, voice, and trust.
In short: More writers and publishers are letting AI tools draft articles for speed, and critics warn the cost could be lost quality and reader trust.
A Wired column argues that writing is getting “easier” in a new way. Instead of spending hours drafting and rewriting, a journalist can ask tools like ChatGPT or Claude to produce a full first draft in seconds.
This is showing up in newsrooms and for independent writers, often framed as a simple time saver. Editors can also use AI to rewrite, shorten, or polish a piece. In practice, AI can act like a very fast assistant that fills a blank page.
The concern is that this convenience can change what journalism becomes. If the machine does much of the writing, the work may tilt toward producing large volumes of readable text at the expense of the slower parts: calling sources, checking claims, and noticing what is missing. AI text can also sound confident even when it is wrong, so errors can slip into stories unless people double-check.
Publishers may need clearer rules about when AI is allowed to draft text and how those drafts are checked. Readers may also start asking for more transparency, such as labels that say whether AI helped write or edit a story. Tools that focus on catching copied text or AI-written passages, such as Copyleaks, could become more common in editorial workflows, but they are not a substitute for careful reporting.
Source: Wired