Hacker News

I agree. If the LLM were truly an intelligence, it would be able to push back on this nonsense question. It would be able to ask "Why is walking even an option? Can you please explain how you imagine that would work? Do you mean hand-washing the car at home instead?" (etc, etc)

Real people can ask for clarification when things are ambiguous or confusing. Once something is clarified, they can work that into their understanding of how someone communicates about a given topic. An LLM can't.




LLMs like Claude can ask questions and even have you pick from multiple choices or provide your own answer…

Gemini's responses come very close to doing that when they make fun of the question (see other posts in the thread). If the model had been RL'ed to ask follow-up questions, it seems likely that it would meet your criterion.

And the corollary: if LLMs were truly intelligent, they would also be able to respond to such questions sarcastically.

Which Gemini does...?


