Summary

A guide to managing data privacy for GPT-5, detailing opt-out mechanisms for different account types and enterprise governance strategies that prevent user data from being used for model training.

Key quotes

Training data opt-out prevents OpenAI from using your conversations to improve future versions of GPT models.
Opting out of training doesn't eliminate all data storage. OpenAI retains data temporarily for abuse monitoring, safety enforcement, and operational purposes.
Enterprise and Team accounts operate under contractual Data Processing Addenda (DPAs) that legally prohibit OpenAI from using organizational data for model training.

The article explains the technical distinction between training data and inference data, outlines step-by-step workflows for opting out of data training, and introduces Zero Data Retention (ZDR) along with enterprise-grade AI governance frameworks.