> Rather, the problem is, once you do define it, you will quickly find that LLMs are capable of it.
That’s really not what’s happening though. People who claim LLMs can do X and Y often don’t even understand how LLMs work — and the opposite is also true. They just open a prompt, get an output, and shout Eureka. Of course not everyone is like this, but the majority are. It’s similar to what we think about thinking itself. You read these comments and everyone is an expert on the human brain and how it works. It’s fascinating.
Notice how nearly every comment is just dancing around the issue and nitpicking instead of just owning up to whether or not they think LLMs are capable of reasoning.
I wish that were the case. What I see is that most people who believe themselves to be experts have already made up their minds and are pretty sure about it.
I personally believe that not being sure about something, especially on a topic as complicated as this one, remaining open to different possibilities, and keeping a bit of skepticism, is healthy.