
Summary

Anthropic updated Claude's consumer terms to allow chat data to be used for model training unless users opt out; users who don't make a choice by September 28, 2025, will lose access to the service.

Key quotes

Starting today, Claude users will be asked to either let Anthropic use their chats to train future AI models or opt out and keep their data private.
If you don’t make a choice by September 28, 2025, you’ll lose access to Claude altogether.
unless you opt out, your data will be stored for up to five years and fed into training cycles to help Claude get smarter.

The policy change applies to the consumer Free, Pro, and Max plans, including Claude Code used under those plans. Business, government, education, and API users are exempt from these training terms.