nifty@lemmy.world to Technology@lemmy.world · English · 5 months ago
Google AI making up recalls that didn’t happen (lemmy.world)
Shardikprime@lemmy.world · 5 months ago
I mean, LLMs are not meant to give exact information. Do people ever read up on the stuff they use?
mint_tamas@lemmy.world · 5 months ago
Theoretically, what would be the utility of AI summaries in Google Search, if not getting exact information?
Malfeasant@lemmy.world · 5 months ago
Steering your eyes toward ads, of course, what a silly question.
Patch@feddit.uk · 5 months ago
This feels like something you should go tell Google about rather than the rest of us. They’re the ones who have embedded LLM-generated answers to random search queries.