codeinabox@programming.dev to Opensource@programming.dev · English · 4 days ago
I don't want your PRs anymore (dpc.pw)
cross-posted to: [email protected]
amio@lemmy.world · 4 days ago
“I used to have to check PRs and with LLMs I implicitly trust there’s no malicious shit in them”???
Yeah ok bro
coolie4@lemmy.world · 4 days ago
I mean… I think they’re right though. LLMs aren’t intentionally malicious. They’re just incompetent.
onlinepersona@programming.dev · 4 days ago
End result is the same.
CameronDev@programming.dev · 4 days ago
While I still need to review LLM-generated code, I generally don’t have to worry about it being malicious the way an unknown contributor’s code could be.