It's a test designed to cause cognitive dissonance. The LLM assumes a human has a logical reason to walk to the car wash. The prompt never says the car isn't already at the car wash (or that the user has a second car). The issue isn't that LLMs can't solve a simple logic problem; it's that they assume people aren't idiots.