Tax identification numbers, Social Security numbers, net income figures: CPAs manage a tremendous amount of valuable information for themselves and their clients. Keeping it safe is a serious responsibility.
The field is increasingly turning to artificial intelligence to assist with data management and security, but paradoxically this technology can itself pose security risks. How can a CPA practice use AI tools effectively while remaining accountable for client information that cybercriminals regularly attempt to access? That's where the Federal Trade Commission's Safeguards Rule, whose updated requirements took effect in 2023, comes into play.
Using AI to streamline operations
While AI can perform mundane tasks such as composing emails and providing customer service through a chatbot, its greatest value lies in processing large amounts of information and making it accessible to humans.
Artificial intelligence has countless applications in an accounting practice. When complex tax issues arise, AI can conduct detailed legal research to identify relevant laws and regulations. It can help automate the preparation of tax returns. The list is essentially endless.
Risks to be aware of
However, a tool as powerful as AI comes with risks. One of the biggest risk areas associated with AI in accounting is confidentiality. Information being processed, analyzed, or summarized becomes subject to the AI tool's own cybersecurity vulnerabilities. Users must weigh the value of using AI for a particular application against the possibility of exposing sensitive information.
Users should also remember that AI is not infallible. It has been shown to produce results that are inaccurate or even fabricated, so its output should always be reviewed by a qualified professional before it is relied on.
Responsibilities under the FTC Safeguards Rule
As a business that stores personally identifiable information about its clients, a CPA practice must follow federal regulations regarding cybersecurity. In this area, the Federal Trade Commission has jurisdiction over what it defines as financial institutions, a category broad enough to include accounting and tax preparation firms.
Forming the foundation of an accounting firm's cybersecurity system is a written information security plan, which the Safeguards Rule requires and which documents how the practice protects client data.
Making the transition to AI
Experts offer several tips to help practices make the transition to AI:
- Adoption does not have to happen all at once. Practices can try AI piecemeal, using it for one application and then adding more as staff get used to it. Products from different AI providers can be tested and compared.
- Using clean data is crucial. AI can’t create good reports from bad data, so it’s important to follow good data management practices.
- Training is key. AI is constantly changing, and to make the most of it, employees need ongoing training.
Balancing the rewards and risks of AI tools is critical. Use the Safeguards Rule as a guide to FTC compliance and sound risk management.