tired_n_bored to Fuck AI · 7 days ago
Hey we solved software development, no need to learn programming anymore (Claude's source code leak)
Madrigal · 7 days ago
I’ve literally seen someone include “Don’t hallucinate” in an agent’s instructions.
rozodru@piefed.world · 7 days ago
Asking Claude to not hallucinate is like telling a person to not breathe. It’s gonna happen, and happen consistently.
FrederikNJS@piefed.zip · 7 days ago
I think the important bit to understand here is that LLMs are never not hallucinating. They just sometimes happen to hallucinate something correct.
James R Kirk@startrek.website · 7 days ago
This fact about how LLMs work is not at all widespread enough, IMO.