Early Friday morning, OpenClaw creator Peter Steinberger shared a stark update with his followers on X: moving forward, keeping OpenClaw compatible with Anthropic’s AI models will only grow more difficult. Attached to the post was a screenshot of a notification from Anthropic alerting him that his account had been suspended over what the company categorized as "suspicious" activity.
The ban did not last long. Just a few hours after Steinberger’s post went viral, he confirmed his account had been reinstated. Hundreds of comments rolled in on the thread, many leaning into conspiracy theories fueled by the fact that Steinberger currently works for OpenAI, Anthropic’s top rival. One notable reply came from an Anthropic engineer, who told the well-known developer that Anthropic has never banned users for using OpenClaw and offered to help resolve the issue.
It remains unclear whether that intervention was what led to the account being restored (we have reached out to Anthropic for additional comment on the incident). Still, the full sequence of events lays bare rising tensions between Anthropic and the popular open-source AI tool.
To recap the recent backstory: this suspension came one week after Anthropic announced that Claude subscriptions would no longer cover "third-party harnesses including OpenClaw". OpenClaw users must now pay for their Claude usage separately, billed by consumption through Claude’s API. Because Anthropic builds its own competing in-house agent, Cowork, the community has dubbed the new policy a "claw tax". Even before the suspension, Steinberger had confirmed he was already complying with the new rule and using the API for his work, which made the ban all the more unexpected.
Anthropic has defended the pricing change, arguing that standard subscription plans were never built to support the unique "usage patterns" of AI harness tools. Harnesses require far more computing power than standard prompts or simple scripts, because they often run continuous reasoning loops, automatically repeat or retry tasks, and integrate with a wide range of external third-party tools.
Steinberger, however, rejects that explanation. Shortly after the pricing change was announced, he noted the suspicious timing of the move: "Funny how timings match up, first they copy some popular features into their closed harness, then they lock out open source." While he did not name specific features, the comment is widely interpreted as a reference to Claude Dispatch, a new tool added to Anthropic’s Cowork agent that lets users remotely control agents and assign them custom tasks. Dispatch launched just two weeks before Anthropic rolled out its new OpenClaw pricing policy.
Steinberger’s long-running frustration with Anthropic was on full display again during Friday’s X thread. One commenter implied the conflict was Steinberger’s own fault for choosing to join OpenAI instead of Anthropic, writing: "You had the choice, but you went to the wrong one." Steinberger’s blunt reply made the bad blood between the two sides plain: "One welcomed me, one sent legal threats."
Many followers also asked why Steinberger even uses Claude at all for his work, when he could rely on models from his employer OpenAI. He clarified that he only uses Claude for compatibility testing, to ensure updates to OpenClaw do not break functionality for users who rely on Claude.
"You need to separate two things," he explained. "My work at the OpenClaw Foundation where we wanna make OpenClaw work great for any model provider, and my job at OpenAI to help them with future product strategy."
Multiple observers also pointed out that Claude remains a far more popular choice for OpenClaw users than OpenAI’s ChatGPT. When that trend was brought up to Steinberger following Anthropic’s pricing change, he simply replied "Working on that"—a small clue that hints at the focus of his product strategy work at OpenAI.
Steinberger did not respond to a request for comment for this report.