misk@sopuli.xyz to Technology@lemmy.world · English · 11 months ago
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation (www.404media.co) · 233 comments
How so?
CarlsIII@kbin.social: The headline doesn’t mention that someone found a way for it to output its training data, which seems like the bigger story.

CrayonRosary@lemmy.world: That was yesterday’s news. The article is assuming you already knew that. This is just an update saying that attempting the “hack” is a violation of terms.

Bad article then

CrayonRosary@lemmy.world: But the article did contain that information, so I don’t know what you’re talking about.

[deleted by creator]

CrayonRosary@lemmy.world: But your last comment literally said, “Bad article then”.