The latest wave of AI excitement has brought us an unexpected mascot: the lobster. Clawdbot, a personal AI assistant, went viral a few weeks after its launch, and it is keeping the crustacean theme even as it renames itself Moltbot following a legal challenge from Anthropic. But before you jump on the bandwagon, here’s what you need to know.
According to its tagline, Moltbot (formerly Clawdbot) is “an AI that actually does things” – whether that’s managing your calendar, sending you messages through your favorite apps, or checking you in for a flight. This promise has attracted thousands of users willing to deal with the technical setup required, even though it started as a personal project built by one developer for his own use.
That developer is Peter Steinberger, an Austrian developer and founder known online as @steipete, who actively blogs about his work. After leaving his previous company, PSPDFKit, Steinberger felt burned out and barely touched a computer for three years, he explained on his blog. But he eventually found the spark again, and that led to Moltbot.
While Moltbot is now more than a solo project, the publicly available version still stems from “Peter’s crustacean assistant,” now called Molty, a tool created to help him “manage his digital life” and “explore what human-AI collaboration is all about.”
For Steinberger, this means leaning into the momentum around AI that revived his work. A self-professed “Claude-oholic,” he first named the project after Anthropic’s flagship AI product, Claude. He announced on X that Anthropic subsequently forced him to rebrand for copyright reasons. TechCrunch has reached out to Anthropic for comment. But the “lobster soul” of the project remains unchanged.
For early adopters, Moltbot represents the promise of a genuinely helpful AI assistant. People who are already excited about using AI to quickly generate websites and apps are even more excited to have a personal AI assistant do the work for them, just as Steinberger does.
This explains how Moltbot collected more than 44,200 stars on GitHub so fast. The viral attention has even moved markets: Cloudflare shares rose 14% in premarket trading when the social media buzz around the AI agent re-sparked investor enthusiasm for Cloudflare’s infrastructure, which developers use to run Moltbot locally on their own devices.
Still, Moltbot has a long way to go before it leaves early adopter territory, and that’s probably for the best. Installing it requires technical savvy, as well as an awareness of the security risks involved.
On the one hand, Moltbot is built with safety in mind: it’s open source, which means anyone can check the code for vulnerabilities, and it runs on your own computer or server rather than in the cloud. On the other hand, the premise is inherently risky. As entrepreneur and investor Rahul Sood noted on X, “‘can do anything’ means ‘can execute any command on your computer.’”
What keeps Sood up at night is “prompt injection via content” – where a malicious actor sends, say, a WhatsApp message that leads Moltbot to take an unintended action on your computer without your knowledge or intervention.
These risks can be partially mitigated by careful setup. Since Moltbot supports multiple AI models, users may want to make setup choices based on each model’s resistance to such attacks. But the only way to prevent the problem entirely is to run Moltbot in an isolated environment.
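To make the mitigation idea concrete, here is a deliberately simplified, hypothetical sketch – not code from Moltbot itself. It shows one small layer of defense an agent host could apply: gating any shell command an AI agent proposes behind an explicit allowlist before execution. As the reporting above makes clear, a filter like this is no substitute for running the agent on a fully isolated machine.

```python
# Hypothetical illustration only (not Moltbot's actual code): reject any
# agent-proposed shell command whose executable is not explicitly allowed.
import shlex

ALLOWED_COMMANDS = {"ls", "cat", "echo"}  # assumed allowlist for this sketch

def is_allowed(command: str) -> bool:
    """Return True only if the command's executable is on the allowlist."""
    try:
        parts = shlex.split(command)  # respects shell-style quoting
    except ValueError:
        return False  # malformed quoting: reject outright
    return bool(parts) and parts[0] in ALLOWED_COMMANDS

print(is_allowed("ls -la"))    # a listed executable passes
print(is_allowed("rm -rf /"))  # anything unlisted is blocked
```

Even this toy version hints at why the problem is hard: an allowlist checks only the first token, so a truly safe setup still needs sandboxing around whatever does run.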
This may be obvious to the experienced developers who have been tinkering with the project for weeks, but some of them have been vocal about warning the users now drawn in by the hype: things can go wrong very quickly if you approach Moltbot as casually as you would ChatGPT.
Steinberger himself got a reminder that bad actors are watching when he “messed up” the project’s renaming. He complained on X that “crypto scammers” took over his freed-up GitHub username and created a fake cryptocurrency project in his name, warning his followers that “any project that lists [him] as a coin owner is a SCAM.” He later posted that the GitHub issue had been fixed, but cautioned that the project’s only legitimate X account is @moltbot, not the 20 scam variations.
None of this means you should avoid Moltbot if you want to try it. But if you’ve never heard of a VPS – a virtual private server, meaning a remote computer you rent to run software on – you might want to sit this one out for now. (A VPS is where you want to run Moltbot at the moment. “Not a laptop with SSH keys, API credentials, and a password manager,” Sood warns.)
Currently, running Moltbot securely means running it on a separate machine under a locked-down account, which partly defeats the purpose of having a useful AI assistant. And resolving the security-versus-utility trade-off may require solutions beyond Steinberger’s control.
In the meantime, by building a tool to solve his own problems, Steinberger is showing the developer community what AI agents can do, and how autonomous AI can ultimately be useful rather than just impressive.

