I opened the inbox and found an offer: cash for the startup’s Slack history. The founder I was with stared at the number and then at the logout button. Neither felt like a clean ending.
I’ve been watching the scrapyards of startups for years, and the same pattern repeats: founders settle debts, fold legal entities, and try to salvage something. Now that salvage increasingly includes the company’s communications—Slack threads, emails, documents—sold to companies training AI.
A founder in a room with a closing checklist found a contract offering cash for her company’s archives
Startups that are shutting down are being pitched by services that will package internal data and sell it to AI firms. SimpleClosure, which now markets a product called Asset Hub, says it helps founders identify what can be monetized, scrub personally identifiable information, and license code, docs, and internal chats.
SimpleClosure’s CEO Dori Yona told Forbes the company has processed nearly 100 deals over the past year, with payouts ranging from $10,000 (€9,200) to $100,000 (€92,000) per company. For a founder strapped for cash, that can feel like oxygen.
Can companies sell employee data to train AI?
Short answer: yes, if the startup—or whoever controls the assets—has legal authority and the buyers accept the risk. But legal authority and consent aren’t the same as ethical clarity. Platforms like Slack and Gmail are part of the chain, but the transaction usually happens off-platform through intermediaries who claim to scrub identities.
An HR manager discovered that a Slack thread about a birthday party became a training scenario for an AI
AI labs now build “reinforcement learning gyms”—simulated workplaces where agents practice tasks using realistic company data. Anthropic reportedly discussed spending up to $1 billion (€920 million) on such environments, according to The Information. That signals a hunger for context-rich, messy workplace interactions that public web crawls don’t capture.
The rush feels like a morgue auction: assets sold off piece by piece while the people inside those conversations are left exposed.
Are Slack messages private?
Not entirely. Slack terms and a company’s admin controls create gray zones: employers often retain ownership or access to workspace data. Once an administrator exports or hands over archives, privacy controls on the platform no longer protect the content.
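That last step is the whole ballgame. A Slack workspace export unpacks into per-channel folders of daily JSON files, each a plain list of messages; here is a minimal sketch of how trivially readable that is once it leaves the platform (the workspace name and directory path are hypothetical):

```python
import json
from pathlib import Path

def iter_messages(channel_dir: Path):
    """Yield (user_id, timestamp, text) from a channel's daily JSON export files."""
    for day_file in sorted(channel_dir.glob("*.json")):
        for msg in json.loads(day_file.read_text(encoding="utf-8")):
            # Each message is plain JSON: a user ID, a timestamp, the full text.
            yield msg.get("user"), msg.get("ts"), msg.get("text")

# Once the archive is unzipped, no admin setting or retention policy applies:
# for user, ts, text in iter_messages(Path("acme-export/general")):
#     print(user, ts, text)
```

A dozen lines of code, and every retention rule, private-channel boundary, and deletion request the platform enforced is gone.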
A privacy advocate read the contracts and sent letters to the Senate
Marc Rotenberg of the Center for AI and Digital Policy warned that employee privacy is at stake. CAIDP has urged the Senate Commerce Committee to push the FTC to enhance oversight of companies selling and buying this kind of data. Their concern is straightforward: internal messages are often tied to identifiable people, and de-identification is imperfect.
How much do companies pay for old startup data?
Payouts vary widely. Forbes reports deals in the $10,000–$100,000 (€9,200–€92,000) band for entire company archives. Prices depend on the perceived quality and structure of the data, the presence of code or product assets, and the buyer’s intended use—particularly for RL gyms that value sequential interaction logs.
An investor noted buyers are treating workplace chatter as a new commodity
Big names in AI—OpenAI and Anthropic among them—are openly hunting for realistic training material. You can imagine investors and startup wind-down firms viewing archived conversations as an asset class. The sellers promise cleaning and redaction; the buyers promise better models. The archive, however, can act like a Trojan horse, bringing subtle personal signals into opaque training pipelines.
I’ll tell you what to watch for: clauses that assign ownership of communications, promises about anonymization with no verification path, and buyers linked to large labs planning RL environments. If you’re a founder, employee, or investor, read the fine print and ask who is ultimately entitled to say “sell.”
Do you want your old messages to train the next generation of office bots without your say-so?