Experimental browser for the Atmosphere
{ "uri": "at://did:plc:2osm6ol3lmsq3lbrhiyjfpu7/app.bsky.feed.like/3lov2dcaisc2c", "cid": "bafyreieoya4li6e6ugbz35jcw5x5aykzr4xrvttokfjshclcptnkn62jqu", "value": { "$type": "app.bsky.feed.like", "subject": { "cid": "bafyreiekkfg5cv4rxo4vlweytk54o4jsqolkxvl4lgc4nttofvznfqgq7u", "uri": "at://did:plc:ybkylffhwhn2an2ic2lxh76k/app.bsky.feed.post/3louunglyts2u" }, "createdAt": "2025-05-11T08:34:48.122Z" } }
Again, @ft.com reporters or whoever else needs to hear this: "Hallucinations" are not the result of "flaws," they are literally inherent in & inextricable from what LLM systems do & are. Whether an "AI" tells you something that matches reality or something that doesn't, *it is working as designed*
May 11, 2025, 6:53 AM