
It's not a person, it's a machine. And it's one that will still confidently produce hallucinations that embarrassingly prove it has no notion of intelligence. That it does so less often than its sibling is entirely irrelevant.


Nope, wrong. The number of errors, and the magnitude of each, is very relevant.


Black box hallucination engines do what they do https://news.ycombinator.com/item?id=36134249



