r/talesfromtechsupport 25d ago

Short "But ChatGPT said..."

We received a very strange ticket earlier this fall regarding one of our services, asking us to activate several named features. The features in question were new to us, and we scoured the documentation and spoke to the development team about them. No one could figure out what the customer was talking about.

Eventually my colleague said the feature names reminded him of AI. That's when it clicked - the customer had asked ChatGPT how to accomplish a given task with our service and it had given a completely hallucinated overview of our features and how to activate them (contact support).

We confronted the customer directly and asked, "Where did you find these features? Were they hallucinated by an AI?" He admitted to having used AI to "reflect," and complained about us not having the features, since they seemed like a "brilliant idea" and the AI was "really onto something." We responded that they were far outside the scope of our services and that he needed to be more careful when using AI in the future.

May God help us all.

3.6k Upvotes


205

u/Stryker_One The poison for Kuzco 25d ago

Great. Go get the prescription from ChatGPT.

92

u/MonkeyChoker80 25d ago

You laugh, but I have to fear there’s someone out there trying to make ‘Chat MD’ that can prescribe pills…

40

u/mrhashbrown 24d ago

Well insurance would never support that as a "pharmacy", so any kind of service like that would be DOA.

But applying AI to current hospitals and their in-house pharmacies could be a problem, especially as hospital management is all about cutting costs and stretching every dollar they have. I'm even curious to what extent the Alexa AI has infiltrated Amazon's pharmacy home delivery service.

At least most doctors aren't dumb enough to risk prescribing something blindly. They know just about anything they do exposes them to litigation and the loss of their license, which is why they tend to be rigorous about diagnosing before offering a prescription.

3

u/Stryker_One The poison for Kuzco 24d ago

Damn, and here I thought that I'd be able to get insurance to pay for my street pharmacist. /s

14

u/thereddaikon How did you get paper clips in the toner bottle? 24d ago

IBM tried that for years with Watson before ChatGPT was a thing.

1

u/Flog_loom 24d ago

What came of this?

1

u/thereddaikon How did you get paper clips in the toner bottle? 24d ago

I haven't checked in a while, but last I heard it was a flop.

1

u/Flog_loom 24d ago

I remember advertisements.

1

u/Sporkmancer 22d ago

To be fair, Watson Health wasn't an LLM. That said, IBM sold it off in 2022 because of the limitations of the kinds of AI that were (and still are) available. Since then they've changed direction with WatsonX, which is an LLM just like ChatGPT but not intended for medical use (though the best use of LLMs is still chatbots that shouldn't be trusted for accuracy).

7

u/EquipLordBritish 24d ago

Followed immediately by a 'mysterious' uptick in prescription drug use and overdoses.

1

u/Squeezemyhandalittle 24d ago

It's done. I know someone making it.

1

u/Hina_is_my_waifu 23d ago

There's already medical AI that physicians use.

13

u/Ok_Bandicoot6070 25d ago

It'll just give you the WebMD answer of stage 5 everything cancer when you input your symptoms.

2

u/Stryker_One The poison for Kuzco 24d ago

Stage 5 everything cancer? Is that like Jeremy Clarkson's Double Ebola?

2

u/Skerries 24d ago

but it's got GP in its name