Experimental browser for the Atmosphere
In what sense are "powerful" AI systems that more often than not are simply WRONG "systems" at all? These are hallucination machines. Why are their creators even releasing them? What are people doing with them? www.nytimes.com/2025/05/05/t...
May 5, 2025, 12:18 PM
{
  "uri": "at://did:plc:3pvthwleviachr4hm7qw7ini/app.bsky.feed.post/3logdzdz5hc2s",
  "cid": "bafyreia6hpj2j2ax4ytxhqpeky36jf7a5kawilyfyb3nll3er24kgntcxa",
  "value": {
    "text": "In what sense are \"powerful\" AI systems that more often than not are simply WRONG \"systems\" at all? These are hallucination machines. Why are their creators even releasing them? What are people doing with them? www.nytimes.com/2025/05/05/t...",
    "$type": "app.bsky.feed.post",
    "embed": {
      "$type": "app.bsky.embed.external",
      "external": {
        "uri": "https://www.nytimes.com/2025/05/05/technology/ai-hallucinations-chatgpt-google.html",
        "thumb": {
          "$type": "blob",
          "ref": { "$link": "bafkreia7z4dv2t7dpqhskrv7wiojin2qgziobbory2jpn3ykamahgzzmsa" },
          "mimeType": "image/jpeg",
          "size": 293989
        },
        "title": "A.I. Hallucinations Are Getting Worse, Even as New Systems Become More Powerful",
        "description": "A new wave of “reasoning” systems from companies like OpenAI is producing incorrect information more often. Even the companies don’t know why."
      }
    },
    "langs": [ "en" ],
    "facets": [
      {
        "index": { "byteEnd": 242, "byteStart": 211 },
        "features": [
          {
            "uri": "https://www.nytimes.com/2025/05/05/technology/ai-hallucinations-chatgpt-google.html",
            "$type": "app.bsky.richtext.facet#link"
          }
        ]
      }
    ],
    "createdAt": "2025-05-05T12:18:16.089Z"
  }
}
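A note on the `facets` entry in the record above: in AT Protocol rich text, `byteStart` and `byteEnd` are offsets into the UTF-8 byte encoding of `text`, not character positions, so the span must be sliced on bytes before decoding. A minimal sketch (the variable names are illustrative, not part of any API) that recovers the link's display text from this record's offsets:

```python
# Sketch: recover the anchor text of a rich-text facet from the
# app.bsky.feed.post record above. Facet indices are UTF-8 BYTE
# offsets into "text", an AT Protocol convention.
text = (
    'In what sense are "powerful" AI systems that more often than not '
    'are simply WRONG "systems" at all? '
    "These are hallucination machines. "
    "Why are their creators even releasing them? "
    "What are people doing with them? www.nytimes.com/2025/05/05/t..."
)
facet_index = {"byteStart": 211, "byteEnd": 242}  # from the record above

raw = text.encode("utf-8")  # slice bytes, not characters
anchor = raw[facet_index["byteStart"]:facet_index["byteEnd"]].decode("utf-8")
print(anchor)  # the truncated display text the link facet covers
```

For this all-ASCII post, byte and character offsets coincide, but any non-ASCII character before the link (an emoji, a curly quote) would shift the byte offsets past the character offsets, which is why the slice is taken on the encoded bytes.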