Are We Training Firefighters or AI Dependents? A Fire Chief’s Warning on the Risks of Overrelying on ChatGPT

Gregory McComb raises a red flag: If firefighters and officers are allowed to offload their thinking onto an AI platform, the fire service risks a degradation of competence.
Oct. 3, 2025

Key Takeaways

  • If fire chiefs aren't careful about how their members utilize AI platforms, the cognitive skills that make firefighters effective could erode.
  • New firefighters who use AI platforms to write narratives, interpret policies and study for exams risk minimizing, even eliminating, their critical-thinking skills.
  • Fire chiefs must set clear expectations on how and when AI platforms should be used by members.

Artificial intelligence isn’t coming to the fire service; it’s already here. From automated report writing and preplanning to grant application support and drafting standard operating procedures, AI platforms such as ChatGPT are rapidly becoming a fixture in departments across the country. Used wisely, they’re powerful time-savers. However, as a fire chief, I feel compelled to raise a red flag: If we aren’t careful, these tools could slowly erode the very cognitive skills that make our personnel effective in the first place.

We’re at a crossroads. One road leads to enhanced efficiency. The other, if we aren’t intentional, leads to a workforce that leans so heavily on AI that members stop thinking for themselves. In the fire service, the moment that someone stops thinking is the moment that someone gets hurt or worse.

Quiet shift, big consequences

My concern crystallized after I read an article about someone who received an email that the individual didn’t understand. Instead of asking for clarification or trying to interpret it, the person uploaded the email to ChatGPT, and the platform instantly translated it into something that the individual could grasp.

That might sound like a success story, but let’s think critically: Why didn’t the person understand the email? More importantly, why didn’t the individual try to understand it? Instead of learning through effort, the person bypassed the struggle entirely. This is the slippery slope: overreliance on AI not just for convenience but for comprehension.

In the fire service, we don’t have the luxury of outsourcing our thinking. From the fireground to the front office, every decision that we make carries risk. Every delay in judgment has consequences. If firefighters and officers begin habitually turning to AI for interpretation, explanation or even operational advice, we are no longer developing decision-makers; we’re developing mouthpieces for machines.

Illusion of intelligence

ChatGPT and other AI platforms are incredibly convincing. Their answers are clean, well-written and confident, even when they’re wrong. That’s the problem. They present information with such authority that users often assume that the answers are correct, bypassing their own critical thinking. This is particularly dangerous for younger personnel who might lack the experience to spot when something “just doesn’t seem right.”

Imagine new firefighters using ChatGPT to write all of their narratives, to interpret policies or to study for an exam. The members might appear competent on paper, but without engaging the material themselves, their depth of understanding is shallow at best. We risk creating firefighters who know how to ask the right questions but not how to think through the answers.

Fireground dependency is a nonstarter

AI platforms are best used in low-stakes, administrative environments. The fireground isn’t one of them. Yet the mindset that “ChatGPT will just tell me what to do” can bleed over from the office to the incident scene.

I dread the thought of a future company officer stalling on a key decision because that individual grew accustomed to AI doing the hard thinking. There’s no algorithm for a smoke condition that doesn’t match a size-up. There’s no substitute for gut instinct that’s honed by experience when a flashover is seconds away. There’s no AI that can make a life-or-death decision on your behalf with accountability.

Fire chiefs’ No. 1 job is readiness. That includes mental readiness and the ability to assess, adapt and act under pressure. AI can’t teach that. Only hard training, mentorship and lived experience can.

Death of institutional knowledge

One of the greatest dangers of overrelying on AI in the fire service is the quiet erosion of institutional knowledge. Departments pass wisdom from generation to generation, through war stories, tailboard lessons and mentorship. That transfer is personal, not digital.

If younger members bypass their officers and mentors and instead feed their questions into an AI prompt window, we lose something essential: culture, context and the nuance that AI simply can’t replicate. The fire service isn’t built on data alone; it’s built on people who’ve been there and can guide others with the scars to prove it.

Leadership must set the tone

If fire chiefs lean too hard on AI platforms to write policies, evaluations or communications without understanding or reviewing them, they set a precedent: speed over substance, convenience over cognition. That will trickle down.

Fire chiefs must set clear expectations on how and when these tools should be used. Make it known that although AI can assist, it should never replace critical thinking. Build training scenarios that prioritize decision-making under stress. Encourage officers to challenge their crews, not just to give them answers. Promote a culture where curiosity, problem-solving and hands-on learning are still king.

At my department, we have embraced AI in smart, controlled ways: to draft memos, summarize long reports and even clean up narratives for consistency. However, we make it clear that these are tools to support thought, not substitutes for it. Our crews are expected to know the material behind the words.

Final word of caution

ChatGPT and other AI platforms are here to stay. They aren’t inherently bad; quite the opposite. When used responsibly, they free up time, reduce errors and enhance communication. However, if we allow members to offload their thinking onto AI, we will see a slow but steady degradation of competence in firehouses, and in this job, competence is life.

It’s on fire chiefs to guard against that outcome, not by banning technology but by leading its integration with clarity and purpose. Use these tools to elevate departments, not erode them.

In the end, AI is dangerous only when it replaces real intelligence. Make sure that never happens.

About the Author

Gregory McComb

Gregory A. McComb is the fire chief of Oshtemo Township Fire Department in Kalamazoo County, MI. With more than three decades in the fire service, he leads a growing combination department and serves on multiple regional boards, including the Kalamazoo County Fire Chiefs Association.
