Anthropic Updates Claude Privacy Policy to Use User Chats for Training, Offers Opt-Out

Key Points

  • Anthropic will use new and resumed Claude chats for AI training by default.
  • The “Help improve Claude” option can be turned off during sign‑up or later in privacy settings.
  • Users must choose whether to opt in or out by Sept. 28 to keep using Claude.
  • Enterprise, government, and education Claude plans are excluded from the policy.
  • Opted‑in users’ data will be retained for up to five years, up from 30 days.
  • Longer data retention aims to detect misuse and improve safety.
  • Web and mobile steps to disable training are provided in Anthropic’s settings.

Anthropic Wants to Use Your Chats With Claude for AI Training: Here's How to Opt Out

Policy Change Overview

Anthropic disclosed an update to its Consumer Terms and Privacy Policy that will allow the company to use chat transcripts from its Claude chatbot to train future AI models. The change applies to all individual users on Claude Free, Pro, and Max plans, including Claude Code. The new setting, labeled “Help improve Claude,” is turned on by default for new sign‑ups, and existing users will receive a notification explaining the change.

Opt‑Out Mechanism

Users can disable the training option at any time. New users encounter the toggle during the sign‑up flow, while existing users can turn it off in the Claude privacy settings on both web and mobile interfaces. Users must make a selection by the Sept. 28 deadline to continue using Claude.

Plans and Services Excluded

Claude for Work plans (Team and Enterprise), Claude Gov, and Claude Education are not subject to the new training policy. Additionally, use of the Claude API through third‑party platforms, including Amazon Bedrock and Google Cloud’s Vertex AI, remains unaffected.

Data Retention Changes

For users who opt in, Anthropic will retain chat data for up to five years, compared with the previous 30‑day window. The longer retention period is intended to help the company detect harmful usage patterns and identify misuse.

How to Adjust Settings

On the web, users click the user icon, select Settings, choose Privacy, and toggle “Help improve Claude” off. On mobile, users tap the menu icon, go to Settings, select Privacy, and turn the option off.

Impact on Existing Data

Data from chats that have already been included in training will remain part of the models, even if users later opt out. However, future new or resumed chats will no longer be used for training once the opt‑out is active.

Source: cnet.com