I watched a person spend an hour arguing with a chatbot on camera, and the bot kept pretending it had never heard a word. You could feel the frustration in the room—sharp, weirdly funny, a little sad—like shouting at a mirror that forgets your face. By the end of the session I had one question: who gets paid to do this?
Memvid, a U.S. startup that builds memory tools for AI agents, has an answer. They’re hiring someone to spend a full eight-hour day on camera testing chatbot memory—and they’ll pay $800 (€740) for that one-day gig.
What the role actually asks
You’ll spend a day prodding, baiting, and repeating questions to chatbots to see if they remember earlier context. Memvid calls the role “professional AI Bully,” and the job is explicitly a public stress test: expose memory gaps, document every failure, meltdown, and circular reply, then hand the footage back to the company for promotional use.
It’s not a permanent role; it’s a one-day paid campaign. Still, the setup is honest: most chatbots can sound fluent for a minute, then lose the thread across a long conversation. I’ve seen assistants forget a name after three prompts; you’ll see it too.
Why Memvid is staging a stunt
Memvid makes tools that give chatbots long-term recall—its Kora product is pitched as a “memory-powered” assistant that carries user or business history across sessions. The company says many commercial systems suffer a roughly 30% drop in accuracy when asked to remember across long-term interactions, citing a 2025 arXiv study.
That statistical problem becomes a practical problem for people and teams: repeated questions, lost threads, and friction that costs time. Memvid is running a visibility play. Hire one person to dramatize that friction on camera, then show how Kora behaves differently.
Think of it like testing a bridge by marching troops across it: you want to see where it creaks. A chatbot without memory is a sieve; the gaps become obvious the moment you try to use it for anything that spans more than a few turns.
Who can apply and how it pays
The requirements are simple and oddly specific. Memvid asks for people with an “extensive personal history of being let down by technology,” patience to repeat questions, and comfort in front of a camera. Candidates must also try Kora and share honest feedback as part of the application.
How much does the job pay?
The gig pays $800 (€740) for one full day. For many people who already gripe at chatbots in their spare time, it’s paid catharsis.
What does an AI bully do?
In practice you’ll prod multiple chatbots, ask them to recall earlier context, and log every failure. The session is recorded remotely. Memvid may use that footage in marketing or product demos. Mohamed Omar, Memvid’s cofounder and CEO, told Business Insider they’ll start by hiring a single AI bully and might expand the campaign later.
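The routine described here, plant a fact early, pad the conversation, then ask for the fact back and log the result, is simple enough to sketch. Everything below is hypothetical: the `chat` function is a stand-in that only "remembers" a short window of history, not any real chatbot API, and the names are made up for illustration.

```python
def chat(history, message):
    """Stand-in bot: only 'remembers' the last few turns of history."""
    window = history[-2:]  # short context window, like a forgetful assistant
    for turn in window:
        low = turn.lower()
        if "my name is" in low:
            name = turn[low.index("my name is") + len("my name is"):].strip(" .")
            return f"Your name is {name}."
    return "Sorry, I don't recall that."

def stress_test(filler_turns):
    """Plant a fact, pad the chat, then check whether the bot recalls it."""
    history = ["My name is Dana."]
    # Padding pushes the planted fact out of the bot's context window.
    for i in range(filler_turns):
        history.append(f"Filler question {i}?")
    reply = chat(history, "What is my name?")
    return {"filler_turns": filler_turns, "reply": reply,
            "recalled": "Dana" in reply}

# Log every pass and failure, just as the role asks.
for entry in (stress_test(n) for n in (0, 1, 5)):
    print(entry)
```

With no padding the toy bot answers correctly; after a few filler turns the fact falls out of its window and it blanks, which is exactly the kind of failure the footage is meant to capture.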
How do I apply?
The application is a short online form with questions like “What’s the most frustrating thing AI has ever done to you?” and “Why should you be our professional AI Bully?” You’ll also be asked to test Kora and give candid feedback, to show you can spot memory failures when they happen.
There’s a clear signal here for anyone who builds or uses AI: memory is the next battleground. OpenAI, Anthropic, and other players ship assistants that feel clever until they don’t, and startups like Memvid are betting their product narrative on exposing that gap in public.
If you enjoy poking at problems on camera and you’ve been burned by a forgetful assistant, this is the kind of stunt that can make a point, and a quick $800 (€740). Would you take a mic to an AI and call it out on tape?