
JoCo government has started working with AI. New policy lays out how staffers can use it responsibly

Emergency dispatchers, the district attorney's office and financial planners are among the groups testing AI tools in their work

Cheery chatbots offering help with property valuation appeals or tax bills won’t be popping up on the Johnson County website any time soon.

But artificial intelligence is beginning to seep into the county’s administration in various other ways — enough so that commissioners and county technology experts recently began crafting a policy for how it will be used and overseen.

So far none of those uses is public-facing, says Bill Nixon, director of the county Department of Technology and Innovation. Some county employees do use ChatGPT for their own information-gathering purposes, but no bots interact with the general public on the website.

Nixon doesn’t rule that out in the long term. But for now, Nixon said, AI is for internal efficiencies only because of concerns about accuracy.

That said, the county has begun to use the relatively new technology in several ways:

  • An AI program contributes to training for county dispatchers, simulating 911 calls so trainees can practice asking the right questions in real-life situations, he said. AI also has been used at the dispatch center to listen for and flag “trigger words” like “gun” during calls.
  • The District Attorney’s office uses a closed system to help prosecutors keep track of emails and responses from witnesses, for example. “This is saving them some time,” he said, because doing it manually is time-intensive.
  • This year county administrators began using ResourceX, a budget-analyzing program for governments. The use is intended to help officials find ways to streamline the budget at a time when resources have been tight.
Clinicians at the Johnson County Mental Health Center would use Bells AI, a Netsmart product, to help with billing paperwork and progress reports. Photo illustration credit Leah Wankum.

Mental health department to use AI

One of the newer uses for AI in the county will be at the Mental Health Department. The county has purchased but not yet implemented a program that will help clinicians there with progress reports and billing paperwork.

Bells AI, a Netsmart product, is a medical documentation program that checks that clinicians have included the required information and appropriate coding for billing. Getting that information wrong on a bill to Medicaid, for instance, could result in Medicaid officials clawing back the payment, said Tanner Fortney, assistant director of the Mental Health Center.

The commission approved that purchase in September for $91,000 plus a one-time set-up charge of $18,600. During that discussion, Fortney told commissioners that the program has the potential to save money, but the bigger advantage will be in the time it may save for interaction with patients.

Based on the results from other agencies that have used it, Fortney said, the time writing notes could drop from 18 minutes to 11.5 minutes per note — time that could be better spent serving clients.

“That’s very, very important given the fact that we have continued high turnover rates and very high caseloads as a result,” Fortney said.

Implementing AI in county operations comes with concerns, though. Commissioner Becky Fast, a licensed mental health provider, said there’s a potential for error, even when a program is only looking over notes.

Fast referenced an incident she’d heard about in which a clinician wrote that a client had “sniffled,” but the AI program changed that to “was distraught,” completely changing the character of the interaction and possibly the diagnosis.

Informed consent and transparency are also important for the clients, she said. One of Fast’s big concerns is the use of therapy chatbots that may or may not have a licensed human being overseeing them. The county is not using those programs, but Fast said she’d like to make sure the county tells people when artificial intelligence is involved in their experiences with government services.

What’s in the policy?

Transparency is a big part of the new AI policy, along with ethics, data protection, human oversight and workforce support, Nixon said.

“A lot of AI tools, you can think of it like a spell checker or grammar checker on steroids. It’s rephrasing and rewording things,” Nixon said. “That’s why we want to implement this policy, to make sure there is human insight so if AI does rewrite notes, a human being still needs to look at that to make sure the intent of the note didn’t change. It’s not doing diagnosis, but if it rewords a sentence to say something incorrect, it’s still going to require review.”

Currently, the Mental Health Department has a billing code specialist come in and do random checks for errors. That type of human oversight would continue, Fortney said.

The policy lays out general principles for oversight and transparency, but specifics will come later, after more discussion and review, Nixon said.

A big concern — and one of the reasons county officials have been hesitant to implement chatbots — is the tendency for AI to give wrong answers, Nixon said.

Those are called hallucinations. Nixon said humans sometimes do the same thing. “You ask a human being a question they might not know the answer to and they might make something up. AI does the same thing and that’s where we have to use caution,” he said.

Stories about incorrect answers from other governments that have used chatbots have prompted some concerns, he said. So officials will have to find a way to keep the chatbot from going off the rails, as X’s Grok AI feature did one day last week when it answered every question put to it with a screed about “white genocide” in South Africa.

That problem was reportedly caused by instructions inserted by a “rogue employee.” Nixon suggested one way to deal with hallucinations might be to include code that makes the program biased toward not providing an answer rather than making up an incorrect one.

“That’s some of the internal discussions we’ve been having. You cannot trust the output from AI. You have to independently validate it,” said Nixon.

Johnson County Mental Health staff is planning to use an AI tool to help clinicians with progress reports and billing paperwork. Photo illustration credit Leah Wankum.

Vetting the vendors

Officials will also have to be careful about data safety, he said.

With all of the sensitive personal information in the mental health department, administrators will have to be especially mindful of privacy concerns. “That’s why we have a governance process around that,” he said.

Data about health information is in a closed loop that stays within the health department environment and is not shared with other areas, he said. There are also security controls in place to ensure the data can’t be breached or stolen. In fact, Nixon said anything involving health information, financial information or criminal justice will require more of a closed system that can be controlled, and that will be written into the contract with vendors.

Along with that is vetting the companies themselves. The county is using one program, ResourceX, that was recently acquired by Tyler Technologies. That larger company also provides other software to the public sector, including the Odyssey system used by the Kansas Judicial Branch and now by Johnson County’s 10th District Court.

The Kansas Judicial Branch suffered a ransomware attack that crippled online court operations for months in 2023. The Odyssey system was also targeted in an attack in California the previous year.

Nixon said he’s aware of the security breaches that Tyler, like other tech vendors, has had. But since most of the data ResourceX analyzes is publicly available, he said, that isn’t a significant concern.

However, vendors that are being considered for more sensitive personal data will have to meet a higher bar than those only looking at publicly available material, he said.

AI and the workforce

Special concerns also will be in place for AI as it relates to employees, he said.

The use of AI to eliminate jobs was one concern brought up during the discussion about AI in the mental health department, but commissioners were assured that the preference was to free up clinicians’ time for work that makes better use of their skills.

The policy addresses it by saying, “AI should enhance, not replace human decision-making.” Nixon said AI may change or even eliminate some aspects of a job, just as it has done in the past for typists. However, the county’s policy is to address that by having department and office heads work on retraining and redeploying those employees to other positions.

The county also would not likely use AI tools to review resumes or evaluate employees because of concerns about bias, he said. “To be clear, we’re not doing any of that, but that is part of the ethical discussions we are going to have.”

About the author

Roxie Hammill

Roxie Hammill is a freelance journalist who reports frequently for the Post and other Kansas City area publications. You can reach her at roxieham@gmail.com.
