GitHub now defaults Copilot Free, Pro, and Pro+ accounts to allowing AI model training on their interaction data: inputs, outputs, code snippets, and context. Effective April 24, 2026. Business and Enterprise plans are not affected. Opt out at github.com/settings/copilot/features before the deadline.
On March 25, 2026, GitHub updated its Copilot data policy. Starting April 24, your Copilot sessions will be used to train GitHub's AI models by default. That includes what you typed, what Copilot returned, and the surrounding code context. You have to turn this off yourself. They do not ask first.
The community response was quick. The GitHub discussion thread announcing the change received 59 downvotes against 3 upvotes. Hacker News ran two separate threads. The Register covered it. Developers called it a dark pattern. The description fits. The setting is named "Allow GitHub to use my data for AI model training" with a default of Enabled. Most people will never see it.
What changed
GitHub's announcement describes the data as "inputs, outputs, code snippets, and associated context" from Copilot Free, Pro, and Pro+ users, collected to train and improve their AI models unless you opt out.
The full scope of what GitHub may collect includes accepted or modified suggestions, the code context surrounding your cursor, comments and documentation, file names and repository structure, and feature interaction patterns. GitHub also says it is not pulling data from your codebase at rest, only from active Copilot sessions. That is a meaningful distinction, but live coding sessions are still logged and used for model training unless you say otherwise.
One carveout: if you had already opted out of the earlier data collection setting for product improvements, that preference carries over automatically and no further action is needed.
Who is affected
This applies to three plans: Copilot Free, Copilot Pro, and Copilot Pro+. If you are on Copilot Business or Copilot Enterprise, you are not affected. Those plans have different data terms for organizational use. Students and teachers with free access through GitHub's education program are also excluded.
Individual developers using Copilot through their personal GitHub account are the target group. If you pay $10/month for Copilot Pro or use the free tier, the change applies to you.
How to opt out
Three ways to reach the setting:
- Go directly to github.com/settings/copilot/features. Under the Privacy heading, find "Allow GitHub to use my data for AI model training" and set it to Disabled.
- Profile picture (top-right on github.com) → Settings → Copilot → scroll to the Privacy section.
- Inside VS Code or JetBrains, GitHub Copilot settings will surface a prompt around April 24. That relies on you seeing it, though. Going directly to the settings URL is faster and more reliable.
The setting takes effect immediately. You do not need to reinstall the extension or log out.
What this means for your proprietary code
The immediate concern for most developers is proprietary code. If you are writing code for a product, client work, or anything under NDA, GitHub will use your session data to train models that other Copilot users, competitors included, benefit from. That is real exposure even if GitHub never reproduces your specific snippets verbatim.
The longer-term concern is precedent. GitHub is owned by Microsoft, and Microsoft has a direct financial interest in Copilot performing well. "We are improving the product for you" is the standard framing for training policies. The question is whether you trust a cloud-dependent tool to honor the boundary between improving their model and using your code.
For teams: the change only applies to individual plans, but individual developers often use personal Copilot accounts on work projects. One person on the team with a personal Copilot Pro account working on a shared codebase is all it takes. Check whether your team has policies around this.
What Bodega One does differently
Air-gap mode is not a privacy policy. It is an architecture. Nine independent enforcement layers block all network access from the coding session. No opt-out setting because there is no data leaving the machine to opt out of. No policy update from Bodega One will change this. The enforcement is in the runtime, not in a terms-of-service document.
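The layering idea is easiest to see with a minimal sketch. This is not Bodega One's actual mechanism; it only illustrates what one such layer, process-level interception of the socket API, can look like in Python:

```python
import socket

# Illustrative sketch only, not Bodega One's implementation.
# Replacing socket.socket with a subclass that refuses to connect
# denies outbound traffic for everything in this process that goes
# through the standard socket API.
class AirGappedSocket(socket.socket):
    def connect(self, address):
        raise PermissionError(f"air-gap: outbound connection to {address} blocked")

    def connect_ex(self, address):
        raise PermissionError(f"air-gap: outbound connection to {address} blocked")

# Install the blocking layer for this process.
socket.socket = AirGappedSocket
```

A single in-process patch like this is easy to bypass, which is exactly why independent layers matter: OS-level network namespaces and firewall rules each catch what the layer above misses.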
BYOLLM means the model running your completions and chat is either running locally on your hardware or connecting to a provider you chose with an API key you control. Bodega One does not sit in the middle of that connection.
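In practice, BYOLLM reduces to the client talking to an endpoint and key that you supply. A minimal sketch, assuming an OpenAI-compatible /v1/chat/completions server of the kind local runtimes such as llama.cpp and Ollama expose; `LLM_BASE_URL`, `LLM_API_KEY`, and `LLM_MODEL` are hypothetical environment variables for this example, not Bodega One settings:

```python
import json
import os
import urllib.request

# Hypothetical configuration: point these at the provider you chose.
BASE_URL = os.environ.get("LLM_BASE_URL", "http://localhost:11434/v1")
API_KEY = os.environ.get("LLM_API_KEY", "")

def build_request(prompt: str) -> urllib.request.Request:
    """Build a chat-completion request against an endpoint you control."""
    body = json.dumps({
        "model": os.environ.get("LLM_MODEL", "local-model"),
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

# Sending is one urlopen call; the point is that the URL and the key
# are yours, with no intermediary in the path:
# with urllib.request.urlopen(build_request("hello")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the endpoint is yours, pointing it at localhost means prompts and completions never leave the machine at all.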
Local-first is not a feature you configure. It is the default. See our full comparison with Copilot, or join the waitlist. Beta opens May 2026.
Ready to own your tools?
Beta opens May 2026. Complete 14 days and earn a $30 promo code.