Experimental browser for the Atmosphere
{ "uri": "at://did:plc:fgr4mvuihvox4x2is7qmsrc3/app.bsky.feed.like/3lovowi4x7f2p", "cid": "bafyreigmegps2zsvxqlupva34nrgu5zypjntfaa36aeao37oh4magrrgw4", "value": { "$type": "app.bsky.feed.like", "subject": { "cid": "bafyreiekkfg5cv4rxo4vlweytk54o4jsqolkxvl4lgc4nttofvznfqgq7u", "uri": "at://did:plc:ybkylffhwhn2an2ic2lxh76k/app.bsky.feed.post/3louunglyts2u" }, "createdAt": "2025-05-11T14:43:26.904Z" } }
Again, @ft.com reporters or whoever else needs to hear this: "Hallucinations" are not the result of "flaws," they are literally inherent in & inextricable from what LLM systems do & are. Whether an "AI" tells you something that matches reality or something that doesn't, *it is working as designed*
May 11, 2025, 6:53 AM