Teaching the next generation of CPAs to think with AI

May 05, 2026

Chris Dahlvig, CPA, MBA
Associate Professor
Linfield University, McMinnville, Oregon

Ask any accounting student why they are taking a course on AI, and you will hear something that might surprise you. “I’ve always thought of AI as a tool,” one of my students told me, “and I think staying in the know about it is important to stay adaptable in today’s world.” That student understood something that every CPA already knows: the profession rewards those who prepare for what is coming, not just those who master what is already here.

Another student put it even more directly: “It is no secret that the industry will be integrating AI into everyday tasks. Simply having the background knowledge and applied skills of diverse AI models will help give us as staff accountants an edge.” These are not students who are afraid of AI displacing them. They are students who have decided to get ahead of it.

What strikes me most about this generation of accounting students is their clarity about the difference between using AI as a shortcut and using it as a genuine thinking partner. “A lot of people, especially young people, are using AI to replace their own thinking,” one student observed. “We are being trained to use AI models to enhance our thinking, and further our understanding of topics with the help of AI.” That distinction between replacement and enhancement is exactly why I built BNSS 486: Mindset for the AI Age.

It is also why what is happening in university classrooms right now matters enormously to the future of our profession.

Why a course like this? Why now?

I have been an accountant long enough to remember when spreadsheets were considered disruptive technology. Every major shift in our profession, from tax software to cloud accounting to data analytics, has come with the same question: will this replace us? The honest answer has always been: not entirely, but it may replace those who are not prepared.

Generative AI differs from earlier disruptive technologies in degree, but not in kind. And right now, the students sitting in our accounting classrooms are entering a profession where tools like ChatGPT, Claude, Microsoft Copilot, Google Gemini, and purpose-built platforms like CaseWare AI and Vic.ai are already embedded in audit workflows, tax research, financial analysis, and client communication. The question is not whether they will use these tools. They will. The question is whether they will use them wisely.

That is what drove me to design this course. It is not so much about learning the technology as it is about mindset, judgment, and professional responsibility in an AI-rich world.

What “mindset” actually means

When I tell people I am teaching a course called “Mindset for the AI Age,” the first question I usually get is: “So you’re teaching them how to use ChatGPT?”

Not exactly. We do use it, along with Claude, Copilot, Gemini, and specialized tools like NotebookLM, Base 44, HeyGen, Nano Banana, Suno, and ElevenLabs. But the course is less about the tools themselves and more about how to think alongside them, and how to tell a better story with them.

The course draws on three required books: Mollick’s (2024) Co-Intelligence, Wood’s (2025) Rewiring Your Mind for AI, and McCauley’s (2024) How to Think With AI. Together, they frame the concept of co-intelligence: an intentional, iterative partnership between human and artificial intelligence.

I designed the course around six roles that an AI-ready professional needs to inhabit: Learner, Problem Solver, Researcher, Connector, Storyteller, and Synthesizer (see https://www.ascd.org/blogs/profile-of-an-ai-ready-graduate). Each one represents a distinct competency that matters in accounting practice, and each one is something AI can augment but not replace.

For CPAs, this framework translates directly. When you are auditing a client’s revenue recognition and using AI to scan contracts for anomalies, you are not abdicating your judgment. You are acting as a Researcher, using AI to extend your reach while you supply the professional skepticism the AI cannot. When you are helping a small business owner understand their cash flow forecast, and you iteratively use AI to generate a plain-language summary for a client who finds financial statements intimidating, you are being a Storyteller. These are skills our students need, but skills few of us were explicitly taught.

Where ethics and professional judgment live in the curriculum

This is the part I want every CPA reading this article to pay close attention to, because it’s where the course gets uncomfortable in a good way.

Because generative AI predicts outputs probabilistically from massive data sets originally created by humans, it is designed to produce an answer, even an incorrect or problematic one. Today’s AI systems hallucinate: they produce outputs that are fluent, confident, and completely wrong. They can reflect biases baked into their training data in ways that are subtle and hard to detect. They raise serious questions about client data privacy, intellectual property, and the provenance of information.
In a profession built on trust, accuracy, and independence, these are not minor footnotes. They are existential concerns.

So the course bakes ethics into every major assignment. In the “Become an Expert” project, where students conduct deep AI-assisted research on a topic of their choosing and produce a 5-page paper with 20 peer-reviewed sources, the grading criteria explicitly reward students who use AI as a question-asker and thinking partner, not as an answer machine. Students have to verify and interrogate every AI output and demonstrate their iterative partnership with AI.

In the two “Expert Interview” assignments, one with a subject-matter expert and one with a practitioner, students must compare what AI told them to what a human expert actually says. The explicit learning goal is to discover the gaps: where AI was helpful, where it was misleading, and where human judgment was irreplaceable.

What I have watched students internalize through this process is something I think practicing CPAs need to hear: the greatest risk of AI is not that it will take your job. It is that it will make mediocre work look polished, and professionals will not catch it.

That is an issue for our profession, and it is a curriculum issue.

How the course is structured

The course is built for junior and senior business students — exactly the students who are twelve to eighteen months away from graduation. Most of them have been using AI tools informally for a couple of years. Very few have ever been taught to use them intentionally.

The course begins by building shared language: what are large language models, how does probabilistic reasoning work, and what does it mean when an AI “hallucinates”? This practical orientation helps students engage critically with these tools rather than just consume outputs. There is even an intentionally low-stakes group quiz called “Wait…So That’s How It Works?!”, designed to resolve misunderstandings in a safe environment.

Students write daily reflection blog posts throughout the semester. These brief, honest entries capture their evolving thinking. Over the semester, I watch a student’s relationship with AI shift from novelty to skepticism to something more nuanced: a kind of informed, critical fluency. That arc is extremely satisfying to read. This metacognitive component at the end of each class treats reflection as an essential part of the human learning process, building students’ self-awareness and critical judgment.

The course culminates in a final presentation in which students trace their full research journey from initial inquiry to expert-tested solution, incorporating audio, video, and infographic components co-created with generative AI. The final grading rubric awards points specifically for “intentional, transparent, and value-added collaboration with GenAI rather than surface-level use.” That distinction between using AI as a crutch and using it as a genuine thinking partner is the through-line of the entire course.

What this means for the profession

I want to close with something direct: some of the students coming into your firms in the next few years will have taken courses like mine. Some will not.

That gap matters, because there is a meaningful difference between a new associate who has been trained to treat AI outputs as a starting point for critical analysis and one who treats them as a finished product. One has been taught that AI cannot replace professional judgment, and that the CPA’s job is precisely to supply what AI cannot: context, accountability, ethical reasoning, and the ability to look a client in the eye and say, “I stand behind this.” The other has been handed a powerful tool without the training to use it responsibly.

Our profession has always been self-regulating in the best sense of that phrase. We hold ourselves accountable to standards that protect the public, not just our clients. As generative AI reshapes what accounting work looks like, that responsibility doesn’t diminish. It intensifies.

I would encourage every CPA in Oregon to ask two questions of the firms, schools, and professional development programs they are connected to: What are we doing to build AI literacy that goes beyond tool adoption? And who in our organization has been given explicit training in how to catch AI-generated errors before they become our errors?

The answers to those questions will tell you a lot about how ready your organization is for what is already here.

Back to those students

My students are telling us something important, and I think the profession should listen. They understand that adaptability is now a foundational skill that sits alongside the technical accounting competencies they have always been taught. As one student put it, employers “may have qualifications they are looking for outside of what they traditionally teach you in college” and proficiency with AI is quickly becoming one of them. Knowing how AI works, this student noted, will help you grow more efficient and may help cut the monotonous parts of accounting out, freeing you to work on challenges requiring more human intuition.

That is not just student optimism. It is a clear-eyed read of where the profession is heading. What gives me confidence is that these students are not looking to hand their thinking over to a machine. Instead, they are looking to sharpen it. One student described the course’s purpose with more precision than I could have: we are being trained “to use AI models to enhance our thinking, and further our understanding of topics with the help of AI.” Not to replace judgment. To strengthen it.

The students entering your firms in the next few years are already thinking about this. The question for our profession is whether we meet them there with the culture, training, and standards that let that potential flourish, or whether we leave them to figure it out on their own.

This next generation is already reaching for co-intelligence. And I believe, with the right curriculum and the right culture in our firms, we can get there together.

Chris Dahlvig is a faculty member at Linfield University, where he teaches BNSS 486: Mindset for the AI Age. He holds a CPA license and an MBA, and his work focuses on preparing business students to lead with judgment in an AI-enhanced profession. Chris is also a current member of the Board of Directors of the OSCPA. He can be reached at cdahlvig@linfield.edu.