Hacker News

So a prediction machine chose a particular predicted path, and then came up with phrases to ameliorate it and you're swooning? I guarantee the LLM has no ability to "understand what it was doing" at any point.


Forgive me, I left my opinion open to interpretation: I am mocking the claim that this technology has anything resembling human intelligence.