nifty@lemmy.world to Technology@lemmy.world · English · 1 month ago
Google AI making up recalls that didn’t happen (lemmy.world)
Shardikprime@lemmy.world · 1 month ago
I mean, LLMs are not meant to give exact information. Do people ever read up on the stuff they use?
Patch@feddit.uk · 1 month ago
This feels like something you should go tell Google about rather than the rest of us. They’re the ones who have embedded LLM-generated answers in random search queries.
mint_tamas@lemmy.world · 1 month ago
Theoretically, what would be the utility of AI summaries in Google Search, if not getting exact information?
Malfeasant@lemmy.world · 1 month ago
Steering your eyes toward ads, of course; what a silly question.