I have noticed this as well.

GPT "corrected" a bug which wasn't actually a bug, and wrote some alternative code.

After a bit of back-and-forth, I convinced GPT that the original code did not have a bug.

GPT then conceded that this was true, but claimed its correction was better anyway, for a different reason, to which I was forced to agree. Funny behaviour.
