Opinion

Dr. AI & Me

When we were teenagers in the 1980s, my sister brought home our family’s first desktop computer. I thought it was ridiculous. Nothing was going to replace my fancy electric typewriter. So I never bothered to take the computer seriously. “This is just a fad, sure to fade away,” I assumed. Boy, was I wrong. I’ve been catching up ever since. Nothing about computers and related technology comes easily to me.

When I was a medical student in the late 1990s, my classmates were buying the PalmPilot, the first personal digital assistant in the form of a handheld computer. I felt I had no choice but to get one, too. I forgot to recharge it one day and all my information was lost. I quickly retreated to “paper and pen,” my comfort zone. To this day, I find myself with lower-than-average proficiency on my smartphone and I’ve resisted buying a smartwatch.

No surprise that I was caught completely off guard when social media platforms hit the scene in the early 2000s. I was the only person I knew without a Facebook account. When my daughters embraced social media with gusto, I wasn’t able to protect them because of my technological ignorance.

I’m now in my mid-50s and artificial intelligence (AI) has “taken over” the news. I know it won’t fade away. I know, as my history suggests, that I risk being caught off guard. Once again, like the experience with my daughters, I risk being unable to protect my patients from the abyss of AI.

The rapidly evolving nature of AI worries me. Will my career be threatened? Will I be “put out to pasture”? I surprised myself when, one day, I decided to sign up for a free AI system, ChatGPT.

I am a palliative care physician. I care for people with progressive, life-limiting illnesses. My patients are often scared and “in the dark” about the reality of their situation. I spend hours helping them to understand the “big picture” of their illness, what to expect and how things will unfold. I invite them to know more about dying, how to prepare for the final chapter of life and offer them their prognosis. It requires me to have sophisticated communication skills, to be able to read the room (the non-verbal cues) and to untangle how much of their physical suffering is amplified by other sources of suffering like existential angst. This is the special skill of a palliative care clinician that I hope sets me apart from Dr. AI.

For example, I picked a common progressive lung condition called COPD (chronic obstructive pulmonary disease) to ask ChatGPT about. Pretending to be a patient, I got useful information about what COPD is, its stages and what to expect as it progresses. In addition, it offered me some comforting thoughts when I asked it to console me about my fears of dying from COPD. I attempted to garner personal advice from Dr. AI, but the advice was limited to generalizations.

All in all, I was impressed. If I were someone with COPD, I could learn a lot from Dr. AI. Dare I suggest that Dr. AI does a better job of describing the realities of COPD than many of my physician colleagues, who avoid discussing its progressive nature with their patients? Much like doctors who ease difficult conversations by offering exaggerated hope, Dr. AI had a tendency to summarize the information with optimistic statements such as “with medical advancements, proper treatment can prolong survival.”

However, I am confident that Dr. AI will not replace me. Granted, it is informative and somewhat comforting, but Dr. AI could not individualize or contextualize the information. Each and every person with COPD (or any other progressive, life-limiting illness) is as unique as a snowflake, a fingerprint or a sunset. There is an art to being a doctor grounded in the interpersonal aspects of caring and the ability to customize and provide holistic care to the individual.

That’s what I do as a palliative care physician. I humanize the illness experience one patient and family at a time. And at the end of a visit, I make sure that the person understands what we have discussed. Before leaving, I once again read the room, including the non-verbal cues. I am often thanked for my honesty and willingness to communicate gently without sweetening the reality of the situation. Their gratitude is often sealed with a hug. I use multiple human senses to heal even when I can’t cure: eye contact, listening and touch.

My conclusion about AI is that it is here, it will stay, and it will evolve like all technology. But this time, I have not “buried my head in the sand” and for now am delighted that this doctor’s career is secure and her patients are well cared for.

Authors

Samantha Winemaker

Contributor

Dr. Samantha Winemaker is an associate clinical professor at McMaster University in the Department of Family Medicine, Division of Palliative Care, and has held multiple leadership roles including McMaster Postgraduate Curriculum Lead, Hospice Medical Director, Regional Palliative Clinical Lead and Medical Lead Palliative Care Outreach Team.