Confidentiality at the Prompt: Protecting Client Data When Using AI Platforms
When an attorney enters client facts into a commercial AI tool, who can see, store, or train on that data? North Carolina Rule 1.6 imposes strict limits that most standard AI terms of service do not satisfy.
Rule 1.6 of the North Carolina Rules of Professional Conduct prohibits attorneys from revealing information relating to the representation of a client unless the client gives informed consent or an exception applies. Most attorneys understand this in the context of conversations and documents. The same obligation attaches, with equal force, to prompts entered into AI systems.
What Happens to Your Prompts
Consumer versions of general-purpose AI tools frequently retain user inputs for model training and quality improvement. Enterprise agreements often, though not always, exclude customer data from training. Before entering any client-identifying information into an AI platform, an attorney must review the applicable terms of service and data processing agreement; understand what data is retained, for how long, and by whom; and determine whether disclosure is impliedly authorized, requires informed client consent under Rule 1.6(a), or falls within an exception under Rule 1.6(b).
A Practical Framework for NC Attorneys
The safest practice is to use anonymized or hypothetical fact patterns when consulting AI tools for research or drafting assistance. Where client-identifying information is necessary, obtain informed consent where Rule 1.6 requires it, and use only enterprise-tier tools with signed data processing agreements that expressly prohibit training on customer data. Update your engagement letters and privacy policies to disclose your firm's AI practices.
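The anonymization step can be partly mechanized before a prompt ever leaves the firm. The sketch below is a minimal illustration, assuming a hand-maintained mapping of identifying terms to neutral placeholders (the names and mapping shown are hypothetical). Simple substitution like this misses indirect identifiers, so it supplements, rather than replaces, attorney review of each prompt.

```python
import re

def anonymize(text: str, identifiers: dict[str, str]) -> str:
    """Replace client-identifying terms with neutral placeholders
    before text is entered into an external AI tool."""
    for real, placeholder in identifiers.items():
        # Whole-word, case-insensitive replacement of each known identifier
        text = re.sub(rf"\b{re.escape(real)}\b", placeholder,
                      text, flags=re.IGNORECASE)
    return text

# Hypothetical client matter, maintained per-engagement
identifiers = {
    "Jane Doe": "Client A",
    "Acme Manufacturing": "Company X",
    "Wake County": "the county",
}

prompt = "Jane Doe alleges Acme Manufacturing breached a supply contract in Wake County."
print(anonymize(prompt, identifiers))
# Client A alleges Company X breached a supply contract in the county.
```

A firm adopting something like this would keep the identifier map inside its own systems and apply it at the point where prompts are drafted, not after the fact.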