A staff accountant at a 35-person CPA firm is working on a complex Schedule K-1 allocation. It's 9pm, the deadline is tomorrow, and she knows ChatGPT can help her think through the analysis. She pastes the relevant figures — including entity names, EINs, and partner SSNs — into the chat. The AI produces a useful response. She finishes the return. Nobody finds out.
Multiply this scenario across every accountant in every firm using AI tools without governance. The data that left the firm in that interaction — Social Security numbers, Employer Identification Numbers, financial account details, and client business information — has now been processed by a third-party system operating under terms of service that the firm has never reviewed, with data handling practices that may include retention, training use, and third-party sharing that the firm's clients have never consented to.
This is not a hypothetical risk. It is the current operating reality at the majority of CPA firms in the United States.
The AICPA Ethics Obligation You're Likely Violating
AICPA ET Section 1.700.001 — the Confidential Client Information Rule — prohibits a member in public practice from disclosing any confidential client information without the specific consent of the client. The rule applies to all forms of disclosure, including disclosure to service providers and technology vendors.
AICPA's Interpretation 1.700.040 addresses cloud computing and specifically notes that members using third-party service providers must have a reasonable basis to conclude that the service provider will not disclose confidential client information and maintains appropriate security.
The critical question: Does using a third-party AI tool with client data constitute "disclosure" under ET Section 1.700? The answer is yes — when a CPA submits client data to a third-party system, that data leaves the firm's control. If the vendor's terms permit training on that data, retention of it, or access by vendor employees, the client's confidential information has been disclosed without their consent.
The practical implications are significant. A firm that cannot demonstrate that it has evaluated AI vendors against these obligations, implemented controls to prevent unauthorized disclosure, and obtained appropriate client consent where required is operating with an unaddressed professional ethics gap. In the event of a client complaint, a state board investigation, or litigation, the absence of any AI governance program will be difficult to defend.
The Five AI Tools Most Likely in Use at Your Firm Right Now
Shadow AI — AI tools used by staff without formal approval — is the governance problem most CPA firms have not addressed because they do not know the scale of it. Based on industry survey data and typical adoption patterns, the following tools are the most common sources of unauthorized client data exposure at accounting firms:
ChatGPT (OpenAI) — Free and Plus Versions
The free and Plus versions of ChatGPT use conversation inputs for model training by default; individual users can opt out in their data-control settings, but the firm has no way to verify that they have. OpenAI's business offerings (ChatGPT Enterprise, Team, and the API) exclude inputs from training by default. A staff member using a personal ChatGPT account for work tasks is therefore, by default, providing client data for model training with no contractual protections for the firm.
Microsoft Copilot — Consumer vs. Enterprise
Microsoft has two distinct Copilot products with very different data handling practices. Microsoft 365 Copilot (the enterprise product) operates within your Microsoft 365 tenant with data protection commitments. Copilot.microsoft.com (the consumer product, available free through a browser) operates under consumer terms that may allow Microsoft to use inputs for model improvement. Staff using the wrong product may believe they have enterprise protections when they do not.
Grammarly
Grammarly processes the text it checks — including emails, documents, and any other text in applications where the Grammarly extension is active. Grammarly's enterprise plan provides stronger data protections; free and premium personal accounts operate under different terms. A staff member with Grammarly installed will have the extension active in their email client and document editor, processing content that includes client information, unless it is explicitly configured otherwise.
AI Meeting Transcription Tools
Otter.ai, Fireflies, and similar tools record and transcribe meetings. If a staff member uses one of these tools during a client call or a call where client matters are discussed, the transcript — containing client business information and potentially privileged communications — is processed and stored on a third-party platform. Most of these services operate on consumer terms unless explicitly upgraded to an enterprise plan with appropriate data protections.
Tax Software Embedded AI
Major tax software platforms — including products from Thomson Reuters, Wolters Kluwer, and Intuit — are embedding AI features at varying rates. Some of these features are covered by existing business agreements with appropriate data protections. Others are opt-in features with separate terms. Firms must review the AI feature terms in every software platform they use, not assume that the existing agreement covers AI functionality.
A Practical 7-Step AI Governance Framework for CPA Firms
The following framework is designed to be implementable by a CPA firm without a dedicated IT or compliance team. Each step produces a specific, actionable output.
Step 1: Conduct a Shadow AI Discovery Survey
Survey all staff — anonymously — asking what AI tools they use for work and what types of information they enter. The goal is discovery, not discipline. Most firms discover 5 to 10 AI tools in use that leadership had no knowledge of. Axiom Sovereign provides a ready-to-use survey template that takes less than an hour to distribute and analyze.
Step 2: Create an AI Tool Inventory
Document every tool identified: vendor name, plan type (free vs. enterprise), data types typically entered by staff, and whether a Data Processing Agreement is available. This inventory is the foundation of your governance program — you cannot govern what you have not documented.
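For firms that want the inventory in a form they can sort, filter, and hand to a reviewer, the same fields can be kept as simple structured data. The sketch below is illustrative only — the field names and sample entries are assumptions, not a prescribed schema:

```python
import csv

# Illustrative AI tool inventory records. Field names are assumptions,
# not a prescribed schema; "dpa_available" = Data Processing Agreement.
INVENTORY_FIELDS = ["tool", "vendor", "plan", "data_types_entered", "dpa_available"]

inventory = [
    {"tool": "ChatGPT", "vendor": "OpenAI", "plan": "free",
     "data_types_entered": "client figures; entity names", "dpa_available": "no"},
    {"tool": "Grammarly", "vendor": "Grammarly", "plan": "premium (personal)",
     "data_types_entered": "email and document text", "dpa_available": "no"},
]

# Write the inventory to a CSV that can be opened in Excel for review.
with open("ai_tool_inventory.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=INVENTORY_FIELDS)
    writer.writeheader()
    writer.writerows(inventory)

# Tools with no DPA go to the front of the vendor-terms review queue.
needs_review = [r["tool"] for r in inventory if r["dpa_available"] == "no"]
print(needs_review)  # ['ChatGPT', 'Grammarly']
```

A shared spreadsheet works just as well; what matters is that every tool has the same documented fields.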
Step 3: Review Vendor Terms for Each Tool
For each tool in your inventory, review the terms of service and privacy policy. The four critical questions: Does the vendor train on your inputs? How long does the vendor retain inputs? Can you opt out? Who at the vendor can access your data? Document your findings for each tool. This review typically takes 30 to 60 minutes per tool.
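The four critical questions can be captured as one review record per tool so that findings are documented in a consistent shape. A minimal sketch — the field names and the sample answers are illustrative, not a prescribed review form:

```python
from dataclasses import dataclass, asdict
from typing import Optional

# One record per tool; the fields mirror the four critical questions above.
# Field names are illustrative, not a prescribed review form.
@dataclass
class VendorReview:
    tool: str
    trains_on_inputs: bool         # Does the vendor train on your inputs?
    retention_days: Optional[int]  # How long are inputs retained? None = unknown.
    opt_out_available: bool        # Can you opt out?
    vendor_access: str             # Who at the vendor can access your data?

review = VendorReview(
    tool="ChatGPT (free)",
    trains_on_inputs=True,   # consumer plans train on inputs by default
    retention_days=None,     # not stated plainly in consumer terms; record as unknown
    opt_out_available=True,  # user-level opt-out exists, but the firm cannot verify it
    vendor_access="unknown",
)
print(asdict(review))
```

An "unknown" answer is itself a finding — a vendor whose terms leave a question unanswered should be treated as if the answer were unfavorable.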
Step 4: Classify Each Tool and Create an Approved List
Based on your vendor review, classify each tool: Approved (cleared for use with appropriate data types), Conditional (approved with specific restrictions — for example, no client SSNs or financial account data), or Prohibited (not approved for use with client data until vendor protections are confirmed). Create a written Approved AI Tool List and distribute it to all staff.
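The classification amounts to a simple decision rule applied to the vendor-review findings. The sketch below is a simplification for illustration — the criteria shown are assumptions about how a firm might weight its findings, and a real classification should consider the full review:

```python
# Minimal decision rule for the three classification categories in the text.
# The criteria here are illustrative assumptions, not a complete standard.
def classify_tool(trains_on_inputs: bool, dpa_available: bool,
                  retention_limited: bool) -> str:
    """Return "Approved", "Conditional", or "Prohibited" for a reviewed tool."""
    if trains_on_inputs or not dpa_available:
        # No contractual protection against disclosure: unusable with client data.
        return "Prohibited"
    if not retention_limited:
        # Protected, but restrict sensitive identifiers (no SSNs, EINs,
        # or financial account numbers).
        return "Conditional"
    return "Approved"

# A free consumer plan that trains on inputs and offers no DPA:
print(classify_tool(trains_on_inputs=True, dpa_available=False,
                    retention_limited=False))  # Prohibited
```

Whatever rule the firm adopts, writing it down makes classifications consistent across reviewers and defensible after the fact.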
Step 5: Write and Deploy an AI Acceptable Use Policy
The policy must cover: the Approved Tool List by name, specific data types that may never be entered into any AI tool (SSNs, EINs, financial account numbers, client names paired with financial data), consequences for violations, and how to request approval for new tools. Every staff member signs acknowledgment. Axiom Sovereign provides a template specifically drafted for professional services firms.
Step 6: Execute Data Processing Agreements with Approved Vendors
For any AI tool that will process client personal information — including names, contact details, and financial data — execute a Data Processing Agreement. For AI tools that process tax data that could be linked to a specific individual, review whether the vendor qualifies as a tax return preparer under IRC 7216 and whether written consent requirements apply.
Step 7: Train Staff and Establish a Review Cadence
All staff complete AI governance training before using any approved AI tool. Training covers the approved tool list, prohibited data types, professional ethics context (not just rules — the reasoning), and how to report concerns. Establish a quarterly review of the approved tool list, since vendor terms change frequently and new tools emerge constantly.
What "Approved for Use" Actually Requires
Approving an AI tool for use at a CPA firm requires more than confirming the vendor has good security. It requires confirming that the specific data processing the tool performs is consistent with your professional ethics obligations and client agreements. The minimum requirements for an AI tool to be approved for use with client tax data are:
- Vendor confirms in writing (in their terms or in a separately executed DPA) that inputs are not used for model training
- Data is processed and stored within the United States (or the vendor can demonstrate adequate protections for any cross-border processing)
- Vendor will execute a DPA that includes security requirements, breach notification obligations, and sub-processor disclosure
- Vendor has SOC 2 Type II or equivalent independent security audit completed within the past 12 months
- Vendor's terms are consistent with IRC 7216 requirements for disclosure of tax return information to service providers
"The firms that get this right are not the ones with the most restrictive policies — they are the ones with clear, documented policies that staff actually understand and follow. A policy that nobody reads is not a governance program."
Client Disclosure: The Conversation You Need to Have
AICPA guidance on AI use is evolving rapidly. Emerging best practice is to address AI use in engagement letters — disclosing that the firm uses AI-assisted tools in its work, describing the categories of use, and obtaining client acknowledgment. This serves two purposes: it satisfies the disclosure and consent dimension of ET Section 1.700, and it gives clients the opportunity to raise concerns before AI-generated work product is delivered to them.
The disclosure does not need to be complex. A paragraph in your engagement letter stating that the firm uses AI tools for specific categories of work (described at an appropriate level of detail), that client information is handled in accordance with the firm's privacy and security policies, and that AI-assisted work product is reviewed and approved by a qualified professional before delivery is typically sufficient.
Firms that address this proactively in engagement letter templates are better positioned than those that address it reactively after a client raises a concern.
Build Your Firm's AI Governance Program
Axiom Sovereign deploys AI governance programs for CPA firms — survey, inventory, policy, training, and vendor agreements. Most implementations complete in 60 days.
Download the AI Governance Checklist
Schedule a Discovery Call