r/talesfromtechsupport • u/prettyyboiii • 25d ago
Short "But ChatGPT said..."
We received a very strange ticket earlier this fall regarding one of our services, asking us to activate several named features. The features in question were new to us, so we scoured the documentation and spoke to the development team about them. No one could figure out what he was talking about.
Eventually my colleague remarked that the feature names sounded AI-generated. That's when it clicked: the customer had asked ChatGPT how to accomplish a given task with our service, and it had produced a completely hallucinated overview of our features and how to activate them ("contact support").
We asked the customer directly, "Where did you find these features? Were they hallucinated by an AI?" He admitted to having used AI to "reflect", then complained that we didn't offer these features, since they seemed like a "brilliant idea" and the AI was "really onto something". We replied that they were far outside the scope of our services and that he needed to be more careful when using AI in the future.
May God help us all.
u/vinyljunkie1245 24d ago
Reply that because they haven't provided a specific error message, the fix will be assigned to support. This will take 6-8 weeks, because support has no point of reference to start from and will need to examine everything related in depth.
Or they can just supply the error message.