
Because it’s a distinction without a difference. You can say the same thing about people: many/most of our decisions are made before our consciousness is involved. Much of our “decision making” is just post hoc rationalization.

What the “LLMs don’t reason like we humans do” crowd is missing is that we humans actually don’t reason as much as we would like to believe[0].

It’s not that LLMs are perfect or rational or flawless… it’s that their gaps in these areas aren’t atypical for humans. Saying “but they don’t truly understand things like we do” betrays a lack of understanding of humans, not LLMs.

0. https://home.csulb.edu/~cwallis/382/readings/482/nisbett%20s...
