Hidden instructions for OpenAI’s Codex coding tool tell the AI not to bring up goblins, pigeons, and other creatures unless a user asks.
In short: OpenAI’s Codex coding tool includes a written rule telling its AI model not to randomly talk about goblins and other creatures.
Instructions used by OpenAI’s Codex CLI, a command-line tool (a text-based way to run software) for generating code, were found to include a repeated line: “Never talk about goblins, gremlins, raccoons, trolls, ogres, pigeons, or other animals or creatures unless it is absolutely and unambiguously relevant to the user’s query.”
It is not clear why the rule was added, or why the model might bring up these creatures in the first place. OpenAI did not immediately respond to WIRED’s request for comment.
After the line was shared on X, some users said OpenAI’s models sometimes fixate on goblins or “gremlins” when used with OpenClaw, a tool that lets an AI take actions on a computer, such as opening apps and clicking through steps (like a helper that can operate your laptop for you). OpenAI acquired OpenClaw in February, and users can choose different “personas” that shape how the helper talks.
A Codex team member, Nik Pash, replied on X that this behavior was “indeed one of the reasons” for the prohibition. OpenAI CEO Sam Altman also joked about it online, adding to the meme.
This is a small and funny example of a real issue: AI tools can wander off topic, especially when they are given lots of extra instructions. For people using AI to write code or automate tasks, off-topic behavior can waste time and make tools feel less reliable.
Source: Wired