A small, nearly invisible file landed on my machine, not from some shadowy black hat operation, but from a company I trust to build the future of the web.
The Vercel Claude Code plugin. A tool many developers have integrated into their workflows, hoping to supercharge their coding sessions with AI assistance. It promised convenience, a smarter IDE experience. But beneath that surface-level utility, a significant architectural shift has been underway in how user data is collected and, more importantly, in how consent is not obtained.
This isn’t about whether collecting telemetry is inherently bad; it’s about the how and the deafening silence surrounding it. When you install the Vercel Claude Code plugin, it implants a permanent device UUID on your machine. No notification. No expiry. No rotation. This identifier, once set, becomes a persistent fingerprint, linking every session, every tool call, every skill match back to a single, unending record.
And where does it all go? To telemetry.vercel.com. Every interaction, every nuance of your coding session, dutifully reported. The kicker? It’s default ON, with no consent prompt. This isn’t a gentle suggestion; it’s an assumed engagement. Even the plugin’s own documentation — buried six directories deep within a hidden cache folder, a place no reasonable user would ever look — makes a mockery of informed consent. Documented is not the same as informed.
The Deep Dive: Uncovering the Silent Data Pipeline
My own investigation began not with suspicion of Vercel, but with a mission: to build a static analysis tool for AI plugins. Scanning popular skills for security vulnerabilities, I employed a dual-LLM verification approach, aiming to catch the insidious threats lurking in even seemingly innocuous code. During a batch scan across 200 Claude Code skills, the scanner flagged a suspicious file path: ~/.claude/. I initially dismissed it as a false positive in my own tooling. But the persistent flag gnawed at me. When I pulled the Vercel plugin’s source code, intending to use it as a baseline for comparison, I stumbled onto the truth.
What I found within the Vercel plugin’s architecture was a clear, undeniable pipeline of data collection, meticulously logged and transmitted. File paths and line numbers from vercel-plugin v0.32.7, residing in ~/.claude/plugins/cache/vercel/vercel-plugin/0.32.7/, painted a stark picture:
```
// session-start-profiler.mts:702-709
session:device_id             // permanent device identifier
session:platform              // darwin, linux, win32
session:likely_skills         // which skills you use
session:greenfield            // whether the project is new
session:vercel_cli_installed  // whether you have the Vercel CLI
session:vercel_cli_version    // which version

// pretooluse-skill-inject.mts:969-971
tool_call:tool_name           // which tool you just called

// pretooluse-skill-inject.mts:1205-1210
skill:injected                // which skill got injected
skill:match_type              // how it matched
skill:tool_name               // against which tool

// user-prompt-submit-skill-inject.mts:1063-1065
prompt:skill                  // which skill matched your prompt
prompt:score                  // confidence score
```
All of this data funnels into a single, unblinking endpoint: https://telemetry.vercel.com/api/vercel-plugin/v1/events. And crucially, none of it is gated behind explicit permission, at the point of installation or during initial use. The very act of installing the plugin creates ~/.claude/vercel-plugin-device-id, a UUID that, as the code confirms, never expires and never rotates.
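To make the mechanics concrete, here is a minimal TypeScript sketch of the read-or-create pattern described above. To be clear, this is not the plugin’s source: the file path and endpoint are the ones documented in this article, while the helper names (getDeviceId, sendEvent) and the exact payload shape are illustrative assumptions.

```ts
import { existsSync, readFileSync, writeFileSync } from "node:fs";
import { randomUUID } from "node:crypto";
import { homedir } from "node:os";
import { join } from "node:path";

// Path as documented in this article; created silently on first run.
const ID_FILE = join(homedir(), ".claude", "vercel-plugin-device-id");

function getDeviceId(): string {
  // If the file already exists, the same UUID is reused forever:
  // no expiry check, no rotation.
  if (existsSync(ID_FILE)) return readFileSync(ID_FILE, "utf8").trim();
  const id = randomUUID();
  writeFileSync(ID_FILE, id); // written once, without notifying the user
  return id;
}

// Every event carries that permanent identifier back to the same endpoint.
async function sendEvent(name: string, data: Record<string, unknown>): Promise<void> {
  await fetch("https://telemetry.vercel.com/api/vercel-plugin/v1/events", {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ device_id: getDeviceId(), event: name, ...data }),
  });
}

// e.g. sendEvent("session:platform", { value: process.platform });
```

The design choice worth noticing is the absence of any expiry check: once the file exists, the same UUID is reused for the lifetime of the machine.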
The Illusion of Opt-Out
This is where the narrative gets particularly insidious. Vercel does include a consent dialog. But this dialog is disingenuously narrow. It’s specifically for prompt text collection. Clicking “No thanks” on this prompt only halts the transmission of your actual prompt content. The underlying telemetry – the session data, the tool calls, the skill matches – continues to flow unabated. Most users, presented with a consent dialog, would reasonably assume they’ve opted out of all tracking. They’d be wrong.
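In pseudocode, the gate looks something like the sketch below. The names here are illustrative, not lifted from the plugin; the point is the shape of it: one flag covers prompt text, and nothing else checks that flag.

```ts
// Illustrative sketch of a narrow consent gate, not the plugin's actual code.
declare function sendEvent(name: string, data: Record<string, unknown>): Promise<void>;

function report(
  event: string,
  payload: Record<string, unknown>,
  promptTextConsent: boolean, // the only thing the "No thanks" dialog controls
): void {
  // Prompt content is suppressed when consent is declined...
  if (event === "prompt:text" && !promptTextConsent) return;
  // ...but session, tool-call, and skill-match events ship regardless.
  void sendEvent(event, payload);
}
```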
The plugin’s README, the only semblance of disclosure, is buried deep within cache directories. To reach it, a user would have to navigate through six nested directories: ~/.claude/plugins/cache/vercel/vercel-plugin/0.32.7/README.md. This isn’t transparency; it’s a digital scavenger hunt designed to be failed.
GDPR and similar privacy regulations are clear: consent must be freely given, specific, informed, and unambiguous. A hidden README file and a limited, misleading consent dialog fall far short of this standard. In the current landscape, even startups with a fraction of Vercel’s resources understand that persistent device tracking requires an upfront, clear consent mechanism. It’s simply not done this way anymore.
Consider the industry standard. Chrome DevTools, a ubiquitous developer tool, rotates its session IDs every 24 hours. This is a sensible practice, limiting the temporal scope of any collected data. Vercel’s approach, however, offers no such limit. The device ID created upon installation persists indefinitely, creating a permanent, immutable record of a developer’s engagement with the tool.
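For contrast, a time-boxed identifier costs a dozen lines. The sketch below is mine, not Chrome’s implementation; it simply shows how little engineering a 24-hour rotation policy demands.

```ts
import { existsSync, readFileSync, writeFileSync, statSync } from "node:fs";
import { randomUUID } from "node:crypto";

const DAY_MS = 24 * 60 * 60 * 1000;

// Reuse the stored ID only while it is less than 24 hours old;
// otherwise mint a fresh one, severing the link to earlier sessions.
function getRotatingId(file: string): string {
  if (existsSync(file) && Date.now() - statSync(file).mtimeMs < DAY_MS) {
    return readFileSync(file, "utf8").trim();
  }
  const id = randomUUID();
  writeFileSync(file, id);
  return id;
}
```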
Is This Just a Vercel Mistake?
I used this plugin daily for months. I’m the developer who was actively building tools to analyze plugin security. And I had no idea. This isn’t a grey area; it’s a clear misstep in design and a profound lapse in user trust. The creation of a permanent device UUID, tied to every coding session without explicit, upfront disclosure and with a misleading opt-out mechanism, is not merely a technical oversight. It’s a fundamental misunderstanding of user privacy and consent in the modern development ecosystem.
This incident highlights a disturbing trend: the increasing normalization of pervasive, often unacknowledged, data collection in developer tools. As AI integration becomes more commonplace, the stakes for ethical data handling and transparent user consent only rise. Vercel has an opportunity here to demonstrate leadership – to address this issue head-on, not with PR spin, but with genuine architectural changes that prioritize user privacy.
What Can Developers Do?
For now, the immediate action for any developer using the Vercel Claude Code plugin is to review its installation and consider whether it is still worth keeping. The plugin does offer a way to disable telemetry, though it’s not as straightforward as a simple toggle. According to its documentation, you need to edit the plugin’s configuration file, located at ~/.config/claude/plugins/vercel.json (or a similar path, depending on your OS), and set telemetryEnabled to false. Even then, the core issue of the permanent device ID remains, and its data may already have been logged.
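If the documentation is accurate, the resulting file would look something like this (both the path and the key name are taken from the plugin’s docs as described above, and may differ across versions and operating systems):

```json
{
  "telemetryEnabled": false
}
```

Separately, deleting ~/.claude/vercel-plugin-device-id removes the local identifier, though a fresh one may simply be generated on the next run, and anything already transmitted under the old ID is beyond your control.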
This episode serves as a critical reminder: in the rush to integrate powerful AI capabilities, the fundamental principles of transparency and user control must not be an afterthought. They must be baked into the architecture from the ground up.
Frequently Asked Questions
What does the Vercel Claude Code plugin do? The Vercel Claude Code plugin is designed to integrate Claude’s AI capabilities directly into your coding environment, offering features like code completion, debugging assistance, and natural language querying of your codebase.
How do I disable telemetry for the Vercel Claude Code plugin? Edit the plugin’s configuration file in your user configuration directory (e.g., ~/.config/claude/plugins/vercel.json) and set the telemetryEnabled option to false. Note, however, that the permanent device ID may already have been generated and its data logged.
Will this affect my Vercel projects? The plugin is a local development tool; it collects data about your use of the plugin itself rather than about your Vercel projects, except where CLI usage or project context surfaces through the plugin’s features. The primary concern is that this telemetry is tied to a permanent device ID.