Exactly, if "hallucinate" generally means perceiving something that isn't there, then what AI is doing is almost like the opposite: it's finding something that IS there and wrongly retrieving it because it has no ability to interpret what it finds. But it inherently can't find what isn't there.
May 2, 2025, 7:10 PM
{ "uri": "at://did:plc:35e55fr5d4ornxdcvjbzvkmx/app.bsky.feed.post/3lo7jntp5q222", "cid": "bafyreievnxj3egce6ols47g4ctgykx7v5j3woaqfnvk3tf2vylzcmhohc4", "value": { "text": "Exactly, if \"hallucinate\" generally means perceiving something that isn't there, then what AI is doing is almost like the opposite: it's finding something that IS there and wrongly retrieving it because it has no ability to interpret what it finds. But it inherently can't find what isn't there.", "$type": "app.bsky.feed.post", "langs": [ "en" ], "reply": { "root": { "cid": "bafyreif2gmfecgnkam5agcitye7ypyz6kaf34jsptyu5k3bpgahvlqddry", "uri": "at://did:plc:s5fi6htc7xhnshxxn37dg65c/app.bsky.feed.post/3lo7bmt3nq22f" }, "parent": { "cid": "bafyreifgnnd2ggvu3q6hgesisdq525ybmw247uwr5op2jt6zozswx2y7fq", "uri": "at://did:plc:s5fi6htc7xhnshxxn37dg65c/app.bsky.feed.post/3lo7bpvz6m22f" } }, "createdAt": "2025-05-02T19:10:34.168Z" } }