
how a roblox cheat and one AI tool brought down vercel’s entire platform

    Damn, this is an absolute indictment of the state of security in AI tooling:

    February 2026. An employee at Context.ai, one of those AI productivity tools that promises to "supercharge your workflow," downloads a Roblox cheat. Not a sophisticated zero-day. Not a state-sponsored attack. A Roblox cheat. The download contains Lumma Stealer, an infostealer that grabs session cookies, credentials, everything. That employee had access to sensitive internal systems.

    March 2026. The attacker uses Context.ai's compromised infrastructure to pivot into a Vercel employee's Google Workspace account. This Vercel employee had signed up for Context.ai's "AI Office Suite" using their enterprise credentials and granted "Allow All" permissions. Let that sink in for a second. A Vercel engineer gave a third-party AI tool full access to their corporate Google account.

    April 19. Guillermo Rauch posts the thread confirming everything. Environment variables [...] were stored in plaintext. Accessed. Exfiltrated.
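    The plaintext-storage point is the concrete failure: anything readable from a process environment is one `printenv` away once an attacker has a foothold. A minimal sketch of the kind of check a pre-deploy secret scanner performs (the patterns here are illustrative, not a real rule set; dedicated tools ship hundreds of them):

```python
import re

# Illustrative patterns only -- real scanners use far larger rule sets.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),                    # AWS access key id shape
    re.compile(r"-----BEGIN [A-Z ]*PRIVATE KEY-----"),  # PEM private key header
]

def find_plaintext_secrets(env: dict[str, str]) -> list[str]:
    """Return the names of env vars whose values look like raw secrets."""
    return [name for name, value in env.items()
            if any(p.search(value) for p in SECRET_PATTERNS)]

# A key sitting in plaintext gets flagged; an ordinary setting does not.
assert find_plaintext_secrets({"AWS_KEY": "AKIAABCDEFGHIJKLMNOP"}) == ["AWS_KEY"]
assert find_plaintext_secrets({"PORT": "3000"}) == []
```

    Detection is the cheap half; the fix is keeping secrets in a manager that injects short-lived values at runtime instead of persisting them in the environment.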

    tl;dr:

    1. Context.ai employees should not be using company devices to access Roblox cheats;

    2. exfiltratable environment variables should not be usable to access a customer's Google account. The scope of these credentials was obviously way too broad.
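    Point 2 is a least-privilege check you can actually enforce: before accepting a credential, compare its granted scopes against what the integration needs and reject anything broader. A hypothetical sketch (the scope URLs and the exact-match policy are illustrative assumptions, not how Vercel or Context.ai actually validate grants):

```python
# Hypothetical least-privilege gate: a credential passes only if it
# grants exactly the scopes the service requires -- nothing missing,
# and crucially nothing extra for an attacker to abuse if it leaks.
REQUIRED_SCOPES = {"https://www.googleapis.com/auth/drive.file"}

def is_least_privilege(granted: set[str], required: set[str]) -> bool:
    return granted == required

# An "Allow All"-style grant fails the check.
broad_grant = {
    "https://www.googleapis.com/auth/drive",
    "https://mail.google.com/",
    "https://www.googleapis.com/auth/admin.directory.user",
}
assert not is_least_privilege(broad_grant, REQUIRED_SCOPES)

# A grant limited to exactly what's needed passes.
assert is_least_privilege({"https://www.googleapis.com/auth/drive.file"},
                          REQUIRED_SCOPES)
```

    The same logic applies on the granting side: an OAuth consent screen asking for "Allow All" should be a red flag, not a default.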

    This isn't just a Context.ai issue; it's systemic.

    Tags: security infosec credentials google context.ai roblox fail