ChatGPT and other generative AI chatbots are being used for almost everything these days, from writing emails to planning a vacation, even helping with filing taxes. But that doesn’t mean it’s always a good idea.

Imagine this hypothetical scenario: Lorin, a 42-year-old market researcher who has a side hustle selling products on eBay, hates doing his own taxes and doesn’t want to spend money on a tax professional. So he decides to ask ChatGPT for help with his tax return.

He wouldn’t be the first to turn to AI for help with financial matters.

One in three Canadians (33%) is using AI to manage their finances, according to a 2024 Ipsos poll conducted on behalf of BMO (1). While 45% use AI to learn more about personal finance, Canadians are also using it to create household budgets and update financial plans.

While it’s not clear if (or how often) they’re using AI for help with their taxes, a study in the U.S. found that more than two in five Americans (43%) would trust AI to file their taxes over hiring a tax professional (2).

But should they? In Lorin’s case, ChatGPT said it could get him a $12,000 tax refund. Here’s what Lorin may want to consider before he submits his return.

The pros of using AI for taxes

Let’s face it, the vast majority of people don’t enjoy doing their taxes. Some downright dread it. So it’s no wonder more people are turning to ChatGPT or other AI chatbots for help.

AI could be useful for certain tax-prep tasks, such as helping you identify common deductions or tax credits. It could also be useful for flagging inconsistencies or missing information that requires further investigation on your part.

While AI can also be used to answer tax-related questions, the information you receive may not be up-to-date. For example, it might not catch newly announced tax deductions.

Relying fully on AI without verifying the information yourself (or with a tax professional) could lead to inaccuracies or even false claims.

The cons of using AI for taxes

ChatGPT doesn’t know your “filing status, deductions, retirement goals or risk appetite,” tech journalist Nelson Aguilar writes in CNET (3). “Because its training data may stop short of the current tax year, and of the latest rate hikes, its guidance may well be stale when you hit enter.”

Plus, it might deliver answers “that are biased, outdated, or just plain incorrect, all while sounding like a PhD,” he writes.

That phenomenon is referred to as ‘hallucination.’ OpenAI, the company behind ChatGPT, describes hallucinations as “plausible but false statements generated by language models” (4).

This happens because AI doesn’t ‘understand’ information; rather, it generates responses based on patterns in its training data, making ‘guesses’ to fill in gaps. Some of that training data may itself be biased, inaccurate or outdated. Yet when the AI provides a response, it sounds authoritative, even if the information is fabricated (5).

That’s where another phenomenon comes in: ‘sycophancy,’ the “tendency of AI models to adjust their responses to align with users’ views,” which “can make ChatGPT and its ilk prioritize flattery over accuracy,” according to Axios (6).

One study found that AI assistants will sometimes modify accurate answers in favour of responses “that match user beliefs over truthful ones” (7).

Another major risk is to your privacy. An AI could use your personal data to train its models, accidentally surface sensitive information in its outputs or expose it through a data breach. It’s advisable never to share personally identifiable financial, health or other sensitive information with a chatbot.

What to keep in mind if you use AI for your tax return

If you have a simple return, an AI can help you automate data entry and identify legitimate deductions. But it can also hallucinate data.

An AI isn’t necessarily great at understanding your personal tax situation and accounting for context, especially with complex returns. And it may misinterpret complex tax laws.

Say, for example, you sometimes work from home. The AI might tell you that you qualify for a home office deduction when you don’t. Or, it might tell you that your side hustle losses are fully deductible when they aren’t. If the AI misclassifies deductions, you’re the one responsible if the Canada Revenue Agency (CRA) flags it.

The prompts you use could make a difference, too. For example, if Lorin prompts the AI to reduce his taxes, ‘sycophancy’ could come into play: the AI might surface deductions even if they don’t apply to his situation.

Ultimately, you’re responsible for the accuracy of your tax return. And inaccuracies can be costly, particularly if you end up underreporting your income or overstating your deductions.

If the AI provided you with fabricated information, you could end up unintentionally committing tax fraud — a serious offence that can come with fines or jail time. Inaccurate or incomplete information could also increase your risk of being audited.

That’s not to say you can never use AI in the tax-prep process. Even tax software companies are building AI chatbot assistance into their products (though they tend to be limited in functionality).

But you should always double-check your numbers and verify the truthfulness of the information. Basically, don’t take AI at its word.

Article sources

We rely only on vetted sources and credible third-party reporting. For details, see our editorial ethics and guidelines.

Ipsos (1); Invoice Home (2); CNET (3); OpenAI (4); Coursera (5); Axios (6); Cornell University (7)

This article provides information only and should not be construed as advice. It is provided without warranty of any kind.