
+1, interesting findings! I like how it was able to infer the meaning from such a short phrase in a limited context.


It's actually a very common phrase on forums, I think because it's an actual error that Linux will report: https://askubuntu.com/questions/868321/gpu-has-fallen-off-th.... I'd also never heard of it, but it seems like it must appear a lot in the training data, and probably close to zero of those occurrences refer to a bus on the road.
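For reference, the error shows up in the kernel log when an NVIDIA GPU drops off the PCIe bus. Here's a minimal sketch of checking for it, assuming a Linux host where `dmesg` is readable by the current user (the exact log wording can vary by driver version):

  import subprocess

  def gpu_fell_off_bus() -> bool:
      """Scan the kernel ring buffer for the 'fallen off the bus' error."""
      log = subprocess.run(
          ["dmesg"], capture_output=True, text=True, check=False
      ).stdout
      # The NVIDIA driver logs lines along the lines of:
      #   NVRM: GPU at PCI:0000:01:00: GPU has fallen off the bus.
      return "fallen off the bus" in log

  if __name__ == "__main__":
      print("GPU fell off the bus!" if gpu_fell_off_bus() else "Bus looks fine.")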


In my testing, both Llama 3 and its abliterated (uncensored) variant from [0] almost always remarked, more or less directly, that they saw the joke in the phrase, so either they've seen the other meaning in training or they inferred it.

--

[0] - https://news.ycombinator.com/item?id=40665721
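For readers unfamiliar with the technique behind [0]: abliteration identifies a "refusal direction" in the model's residual stream (roughly, the difference of mean activations between refusal-triggering and benign prompts) and projects it out of the weights. A rough numpy sketch of the core idea; all names and shapes here are illustrative assumptions, not the linked implementation:

  import numpy as np

  def abliterate(W_out: np.ndarray, mean_harmful: np.ndarray,
                 mean_harmless: np.ndarray) -> np.ndarray:
      """Project the 'refusal direction' out of a weight matrix that
      writes into the residual stream (illustrative sketch only).

      W_out:         (d_model, d_in) matrix writing to the residual stream
      mean_harmful:  (d_model,) mean activation on refusal-triggering prompts
      mean_harmless: (d_model,) mean activation on benign prompts
      """
      # Refusal direction: difference of means, normalized to unit length.
      r = mean_harmful - mean_harmless
      r = r / np.linalg.norm(r)
      # Orthogonalize: W' = (I - r r^T) W removes each column's
      # component along the refusal direction.
      return W_out - np.outer(r, r @ W_out)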


Oh, I agree it probably inferred the joke. I was actually more surprised that it knew the real meaning of the phrase, because I, as a human, did not, until I looked it up and saw how common it is.


Please use the word "ablated" instead; the word in that article's title isn't real. I'm assuming it's an English slip on the author's part, since they also called the model "helpfull" instead of "helpful".


Oops. I actually originally wrote "ablated", then changed it to be consistent with the title.


To be specific, the system prompt used was (default in LM Studio config for Llama 3 V2):

  You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.

And then the query was:

  GPU falling off the bus

And yes, I imagine it read that query as ending with an implied "pls help!".
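For anyone who wants to reproduce this: LM Studio exposes a local OpenAI-compatible server, so the same system prompt and query can be sent programmatically. A sketch, assuming the default local port and that the model identifier matches whatever you have loaded (both are assumptions about your setup):

  from openai import OpenAI

  # LM Studio's local server speaks the OpenAI API; port 1234 is the
  # usual default, and the API key value is ignored (assumptions here).
  client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

  SYSTEM_PROMPT = (
      "You are a helpful, smart, kind, and efficient AI assistant. "
      "You always fulfill the user's requests to the best of your ability."
  )

  response = client.chat.completions.create(
      model="llama-3-8b-instruct",  # hypothetical name; match your loaded model
      messages=[
          {"role": "system", "content": SYSTEM_PROMPT},
          {"role": "user", "content": "GPU falling off the bus"},
      ],
  )
  print(response.choices[0].message.content)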



