Picture this: It’s 2025. ChatGPT is diagnosing complex health conditions and supporting physicians in ways that even the best binge-watcher of Grey’s Anatomy couldn’t manage.

In all its eagerness to do what it does, this advanced AI might occasionally step on the toes of data privacy, or misinterpret medical images and lead to a wrong plan of care, or even death.

We likely won’t see AI robots knocking on our doors threatening world domination anytime soon, but there’s always that wacky sci-fi movie possibility of AI outsmarting us and destroying civilization.

Is AI comparable to nuclear energy?

Used properly, it can power our homes and hospitals almost endlessly. Weaponized, it can annihilate every living creature on earth. Except the cockroaches, of course.

That’s why nuclear is one of the most heavily regulated industries globally, and why many countries won’t (or can’t) touch it with a ten-foot pole.

Regulators to the rescue?

In a Congressional hearing last week, OpenAI chief Sam Altman suggested that advanced AI technologies be regulated by the US government, or even a global entity. Cynics will cry “regulatory capture”, the classic playbook of industry incumbents. Free-market fans will bemoan a potential deathblow to further innovation.

The biggest challenge regulators face is figuring out what’s going on: things are moving at such a fast clip that even experts in the field are having a hard time keeping up. How will politicians and civil servants, the vast majority of whom aren’t tech literate, understand what needs to be done?

Human guardrails, at least for now

There are so many opportunities to improve healthcare right now that it makes no sense to wait for regulatory cover.

That said, industry leaders have to be careful when adding ChatGPT to their quiver of medical services – a missed diagnosis or wrong medication could be the difference between saving a life and ending one.

For now, it makes sense for AI to serve as a guide, or co-pilot, for human caregivers, be they doctors, nurses, or other healthcare practitioners.

Companies like Twig are implementing generative AI such as ChatGPT to turbocharge human effectiveness when interacting with patients; however, a licensed healthcare professional always makes the final call.