Artificial Intelligence & Plastic Surgery
December 20, 2022
By Dr. Andrew Campbell-Lloyd


I love a bit of future tech. I have no doubt that AI and robotics have huge roles to play in healthcare. Unfortunately for those of my colleagues in specialties that rely largely on pattern recognition and image assessment (dermatology, radiology and pathology, for example), I suspect that AI will surpass the capabilities of humans very rapidly (indeed, it already has; it's just that AI has an even worse bedside manner than the specialists).

For those of us in the "technical" specialties, well, humans have a role to play for a good while yet. We don't have robots that can perform surgery without a human driver, and that is unlikely to happen anytime soon given the combination of image processing, mechanical feedback and rapid hand-eye coordination required in every operation. But eventually, I suspect I'll be replaced by some neat bit of kit with a bloody hard drive.

Anyway, the future we are looking at right now is exciting. AI is a really fascinating field, and whilst I don't pretend to understand the intricacies, I suspect we'll be spending a lot more time talking to our computers over the next few years than typing into them or clicking with a mouse. And given that I am old enough to remember when computers didn't even have a mouse, I for one am looking forward to that evolution.

Speaking of AI, no doubt people have heard recently about ChatGPT, the really amazing new natural-language chat bot from OpenAI. The reason it has hit the news in a major way recently is that you can type in a question and, in the space of a few seconds, it will spit out an essay, an exam answer, a newspaper article, a bit of website code, or the answer to a scientific query. It has raised concerns about its use by students, journalists and others, given its remarkable ability to generate responses that are factually and idiomatically appropriate. How long before this kind of technology becomes so insidious as to be undetectable?

My curiosity piqued, I decided to play around with this thing to see what it can do. The model is imperfect and still learning, but I wanted to see what the AI could tell me about my own field. So I asked it a few questions like "what are the options for breast reconstruction after mastectomy?" and "how does fat transfer to the breast work?". These are specialist areas and technical questions to boot, and it was really interesting to see the generated responses.
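(A small aside for the tech-curious: I simply typed my questions into the chat window, but the same sort of query can also be sent to OpenAI's models programmatically. The snippet below is a rough sketch using their Python client; the model name and setup are my own illustrative assumptions, not a description of how ChatGPT itself works.)

```python
# Rough sketch only: assumes the "openai" Python package is installed
# and an OPENAI_API_KEY environment variable is set.
from openai import OpenAI

client = OpenAI()  # picks up the API key from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model name, purely for illustration
    messages=[
        {
            "role": "user",
            "content": "What are the options for breast reconstruction after mastectomy?",
        }
    ],
)

# Print the model's answer, much as it would appear in the chat window
print(response.choices[0].message.content)
```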

The responses were basically textbook. Boring, unoriginal, but factually correct and presented in much the same way as countless authors have written in various textbooks, review papers, and indeed in the plastic surgery fellowship exams. So far, so interesting. It showed that ChatGPT could generate accurate responses to technical questions pertinent to plastic surgery. That is pretty cool: it means the AI can drag up the knowledge it needs and contextualise it to answer clinical queries.

This is really significant, because if you, as a patient, wanted a bit more information about whatever procedure you were researching, chances are you could generate a useful response to your question. Chances are the response would be better than what a lot of surgeons would provide!

But I decided to keep playing around and so I asked the AI another question as a bit of a joke really: "can you write a promotional piece for a plastic surgeon?".

I wasn't sure what to expect, but what came out has really made me think, for a number of reasons. And it also made me a bit sad about the nature of my profession.

One of the things that has always bothered me about the online presence (websites, social media) of plastic surgeons is its inauthenticity and homogeneity. There are very few websites that actually sound like the surgeon had anything to do with writing the content, and they all kind of sound the same. Now, that is fine, because it allows someone like me, who creates all of my own content, to ensure that our online presence sounds authentic and reflects my personality, and I am pretty sure that the stuff we have online doesn't sound like any other surgeon about the place. Which I suspect helps us stand out a little.

But now we have this AI chat bot, and the content it is generating is... fascinating. Let me show you two responses it provided to the question I asked above.

Now, I'm not going to be a jerk about this, but without trying too hard I reckon I could find, in about five minutes, dozens of websites from surgeons in Australia with content that reads almost identically to what you can see above. Boring, cookie-cutter, unimaginative, no personality, and endlessly repeated on different surgeons' websites. And I suspect that any patient who has spent even a short time considering surgery, and who has tried to do some research by reading surgeons' websites, will recognise the patterns of speech, the not-so-subtle hyperbole and the cliché-ridden content.

The fascinating thing about this is not what it tells us about artificial intelligence, natural language processing or machine learning algorithms, although without a doubt these are topics well worth digging into. The fascinating thing is what this AI chat bot tells us about plastic surgeons.

This is a deeply revealing use of nascent technology. The crux of it is this: if a plastic surgeon's website or social media reads as though an AI could have written it, the implication is that the content of that online presence is neither original nor authentic to the surgeon's personality.

This AI has been trained using reinforcement learning, with human feedback teaching it how to respond. It doesn't think for itself (YET!). It doesn't necessarily have the ability to create original content (although there are some examples of it kind of doing just that, which is wild). It is basically using its knowledge of learned responses to generate another response. Humans, on the other hand, have the ability to create truly original content that reflects the creator's personality. Sadly, very few surgeons have an online presence that does reflect their personality, primarily because most surgeons contract out the creation of their website and social media content to third parties. Those third parties (marketing agencies) have a tendency to use certain keywords, catchphrases and patterns of speech that lead to the homogenisation of website copy.

So perhaps a new metric that potential patients need to apply when appraising online content is this: if a website or social media post reads like my little friend ChatGPT could have written it, then how likely do you think it is that the surgeon actually created that content themselves? Pretty bloody unlikely is my guess. And if surgeons are farming out content creation to marketing agencies and other third parties, then how authentic, and more importantly how trustworthy, can that content be?

The future is bright. Skynet anyone?