In short: The Verge reports that Grammarly’s “Expert Review” feature showed writing feedback that appeared tied to real people, including The Verge staff.
The Verge reported that Grammarly’s “Expert Review” feature can present feedback that looks as though it comes from specific, named experts. In the reporter’s test, some of the experts shown were unexpected, including the reporter’s own boss.
The Verge also said the AI-generated feedback included comments that appeared to come from its own staff: editor-in-chief Nilay Patel, editor-at-large David Pierce, and senior editor Sean Hollister. The Verge’s article additionally cites a Wired report claiming the feature includes “recently-deceased professors,” though that claim is not confirmed in Grammarly’s public documentation.
Grammarly describes Expert Review as feedback “inspired by” subject-matter experts, using publicly available expert content. The company’s documentation says the experts shown depend on what you are writing and what you are trying to do. Users can also pick experts using a “Choose Experts” option, including by typing topic areas or expert names.
This matters because naming real people can make AI-generated advice feel like a personal endorsement, even when it is not. It is like finding a sticky note that looks signed by a well-known editor when the note was actually generated by a machine. For everyday users, the key questions are whether the people named agreed to have their names used, how Grammarly avoids misrepresenting them, and how to tell what is merely “inspired by” a person versus actually written by one.
Source: The Verge AI