How To Gain Code Execution | Prime Reacts
Based on ThePrimeTime's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.
The Cursor installer pointed to ToDesktop's infrastructure, and the compromise path centered on ToDesktop's Firebase-backed deployment workflow.
Briefing
A security researcher traced a chain of weaknesses in the Electron app bundling and deployment workflow behind ToDesktop, the service whose infrastructure the Cursor AI editor's installer pointed to, and demonstrated how it could lead to remote code execution (RCE) on end-user machines. The core finding wasn't a single bug; it was the way multiple components (Firebase-backed storage, a closed-source deployment CLI, exposed source maps, and the deployment pipeline itself) combined into a practical path from reconnaissance to full compromise.
The investigation began with basic recon: the Firebase-backed site exposed source maps, making it easy to enumerate Firestore paths and identify an insecurely configured collection tied to application data. From there, attention shifted to the ToDesktop CLI (distributed via npm) that handled deployments, source uploads, and related build logic. Using source maps again, the researcher extracted the CLI's internal structure and then looked for ways to hijack the deployment process. A key step was an arbitrary S3 upload vulnerability reachable through a Firebase Cloud Function that generated signed upload URLs. Even without direct S3 credentials, the researcher kept probing and found a high-impact weakness in the deployment pipeline: a post-install script, described as unusually large and risky, allowed a reverse-shell payload to run during installation.
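The recon step above works because a `.js.map` file lists the original pre-bundle source paths alongside the minified bundle. A minimal sketch of that enumeration, using a hypothetical inline source map rather than any real ToDesktop asset:

```javascript
// Sketch: enumerating original source files from a JavaScript source map.
// In practice the .map file is fetched from the site (e.g. bundle.js.map);
// the map below is a hypothetical stand-in.
const exampleMap = JSON.stringify({
  version: 3,
  sources: [
    "webpack:///src/firebase/firestore-paths.js", // hypothetical paths
    "webpack:///src/deploy/upload.js",
  ],
  names: [],
  mappings: "",
});

function listSources(rawMap) {
  const map = JSON.parse(rawMap);
  // The "sources" array exposes the pre-bundle file layout, which is
  // what makes recon against an otherwise minified bundle easy.
  return map.sources.map((s) => s.replace(/^webpack:\/\/\//, ""));
}

console.log(listSources(exampleMap));
// → [ 'src/firebase/firestore-paths.js', 'src/deploy/upload.js' ]
```

The same trick applies to the CLI: if its published bundle ships maps, its internal module structure falls out of the `sources` array.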
Once inside the build-container environment, the researcher located where the application build code lived and uncovered an encrypted production config.json. The "encryption" was implemented in JavaScript, and, crucially, the decryption logic and keys were present in the code itself, so the secrets were effectively recoverable. Further digging revealed a hardcoded Firebase admin key with full scope. With those credentials, the researcher could have deployed an auto-update to any app managed through ToDesktop, pushing changes to clients as soon as they restarted and turning the pipeline compromise into RCE.
Impact was framed in terms of scale: auto-updates could reach widely used apps such as ClickUp, Cursor, Linear, and Notion, potentially affecting hundreds of millions of users if exploited broadly. The researcher reported the issue through coordinated disclosure to ToDesktop, and a fix arrived quickly. Compensation followed: the researcher received $5K from ToDesktop and a further $50K from Cursor, one of the affected customers.
The broader takeaway emphasized operational-security failures rather than blaming any single product. The chain highlighted the risks of closed-source third-party deployment tooling, reliance on Firebase with exposed artifacts like source maps, and the danger of shipping "encrypted" configuration when the decryption keys are available to attackers. The incident also drew praise for responsible disclosure and rapid remediation, with Cursor and ToDesktop both described as responsive and willing to compensate the finder.
Cornell Notes
The investigation found that a Firebase-backed deployment workflow used by ToDesktop could be turned into remote code execution on client machines. Source maps exposed internal details, and a risky post-install script enabled a reverse shell during installation. Inside the build environment, an "encrypted" production config.json could be decrypted because the decryption logic and keys were present in the JavaScript shipped with the system. A hardcoded, full-scope Firebase admin key then would have allowed an attacker to push auto-updates to apps managed by ToDesktop, reaching users when they restarted the app. The issue was reported and fixed quickly, with payouts from both ToDesktop and Cursor.
How did the researcher move from reconnaissance to a deployment compromise?
What role did Firebase and signed URLs play in the attack chain?
Why was the post-install script considered a turning point?
How did “encrypted” configuration fail to protect secrets?
How did stolen credentials translate into remote code execution for users?
What was the response and why did it matter?
Review Questions
- What specific weaknesses made source maps and the ToDesktop CLI especially dangerous in this incident?
- Explain how a post-install script can turn a deployment system into a path for reverse shells and later RCE.
- Why does encryption of configuration fail when the decryption keys and logic ship with the client-side or build-side code?
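Relevant to the second question above: npm runs lifecycle scripts such as `postinstall` automatically during `npm install`, with the installing user's privileges, so a pipeline that installs packages executes whatever those scripts contain. A hypothetical, deliberately benign package.json showing the mechanism:

```json
{
  "name": "hypothetical-package",
  "version": "1.0.0",
  "scripts": {
    "postinstall": "node setup.js"
  }
}
```

Here `setup.js` is an arbitrary script of the package author's choosing; in the incident, an equivalent hook in the deployment pipeline was enough to open a reverse shell inside the build container.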
Key Points
- 1. The Cursor installer pointed to ToDesktop's infrastructure, and the compromise path centered on ToDesktop's Firebase-backed deployment workflow.
- 2. Exposed source maps accelerated discovery of Firestore paths and internal CLI deployment logic.
- 3. An arbitrary S3 upload vulnerability was reachable through a Firebase Cloud Function that generated signed upload URLs.
- 4. A risky post-install script in package.json enabled reverse-shell execution during installation, letting the researcher reach the build container.
- 5. An "encrypted" production config.json was effectively decryptable because the JavaScript decryption logic and keys shipped with it.
- 6. A hardcoded, full-scope Firebase admin key would have allowed attackers to deploy auto-updates to any app managed by ToDesktop, yielding RCE when clients restarted.
- 7. ToDesktop and Cursor responded quickly and compensated the researcher, illustrating effective vulnerability disclosure and remediation.