
ChatGPT doesn’t just misidentify the guy - it entirely fabricates a crime and attaches it to a suspect.

I tried to get ChatGPT to summarize a music video today. (Sugar’s “If I Can’t Change Your Mind”)

I expected it might give a bland summary or something, but that’s not what happened.

It invented entire scenarios that weren’t in the video at all, and invented lyrics not in the song.

That’s pretty harmless, but I can easily see ChatGPT inventing some awful story about a person and that being carried over to a publication and gaining a life of its own.

Basically it’s a souped up urban legend generator, but it’s being offered as a tool to provide search results and content. It’s not just an unreliable narrator - it’s an unreliable narrator being offered as an expert witness.



You're starting off with a distinction without difference.

You could throw darts at a spinning wheel with real names and imagined crimes.

The point is that it doesn't matter what the seed for the false statement is, it's the act of spreading it that's problematic.

You're also muddling a point that I can agree with: Treating ChatGPT as an infallible expert is wrong.

But that applies to so many other things. Even expert witnesses are not infallible.

So I disagree with characterizing hallucinations as the problem, it's the application that's problematic.

Blindly copying and pasting factual content from ChatGPT is a bad idea, just like blindly taking a single source of information as gospel is a bad idea.

Humans can be just as confidently wrong as LLMs, and a simple adage applies to both: trust but verify.


> Humans can be just as confidently wrong as LLMs, and a simple adage applies to both: trust but verify.

Trust people who have earned trust (either through qualifications or reputation) and treat everyone else as good faith actors who are quite possibly wrong.

ChatGPT should be treated as a person you just met at a bus stop who is well dressed and well spoken but has just told you that you are both waiting for the elevator to arrive at the bus stop.


That's a fast track to getting your point of view ignored. Pessimism is fine, but that level of dismissiveness isn't really warranted, especially since the conversation forming in public is not just about some specific model you happen to have strong feelings about, but the general concept of LLMs and factuality.

I wouldn't expect a random doctor approached at a bus stop to accurately answer a question about medicine any more than I would ChatGPT, by the way. Trusting people based on their qualifications and reputations isn't really a thing.

If a doctor tells you to take medication X, there's a reason you take that to a pharmacist rather than to a store clerk with a key to a safe: verifying is always a great idea, regardless of reputation.


I'm not sure how the critique relates to my post. Of course you wouldn't trust an architect with medical advice or a doctor with structural materials for bridge building; that was implied.


> ChatGPT should be treated as a person you just met at a bus stop who is well dressed and well spoken but has just told you that you are both waiting for the elevator to arrive at the bus stop.

Ahahahahaha. Wow. This is brilliant mate. I'm going to start using it.


Every time I see people on HN say how they love to use ChatGPT to generate ideas, I think of this. It seems like a lot more work to come up with prompts and then vet the output to see if it's even sensible than it does to come up with some sensible search terms to query real data that actually contains what you're looking for.


Using LLMs for factual concepts is by far the most boring application of them.

Oftentimes people use LLMs to generate ideas that don't have a factual basis.

ChatGPT will happily invent gameplay mechanics that are fun. It will generate prompts for convincing concept art for something you haven't built.

If what you're looking for can be answered by Google, sure, the business people at Microsoft would rather you use a portal that never lets you leave their site... but that's not interesting.



