ChatGPT Deep Research does indeed hallucinate when given a false premise (or when asked to do research on the ungrounded facts ChatGPT hallucinates). This is what I call "induced hallucination":
Apr 14, 2025, 4:39 AM
{
  "uri": "at://did:plc:baglxmt2hfmzfaddt7a2xj27/app.bsky.feed.post/3lmqqmdht2i2l",
  "cid": "bafyreic52vgqh54c4kqkyvhgqjqekorurpxlw5kjjmg7dmbenztiqplr44",
  "value": {
    "tags": [],
    "text": "ChatGPT Deep Research does indeed hallucinate when given a false premise (or when asked to do research on the ungrounded facts ChatGPT hallucinates). This is what I call \"induced hallucination\":",
    "$type": "app.bsky.feed.post",
    "embed": {
      "$type": "app.bsky.embed.images",
      "images": [
        {
          "alt": "",
          "image": {
            "$type": "blob",
            "ref": { "$link": "bafkreihdhtvhhou4qsw6jcmhprd3duny2e2ujcqbggvx7ycnuptuerpalm" },
            "mimeType": "image/png",
            "size": 61939
          },
          "aspectRatio": { "width": 1221, "height": 476 }
        },
        {
          "alt": "",
          "image": {
            "$type": "blob",
            "ref": { "$link": "bafkreideh4ijld5t5nuphmfwqja3wumrprbqcrgvokldl7fvap3azrmjpu" },
            "mimeType": "image/png",
            "size": 138624
          },
          "aspectRatio": { "width": 1258, "height": 728 }
        }
      ]
    },
    "langs": [ "en" ],
    "facets": [],
    "createdAt": "2025-04-14T04:39:52.587Z"
  }
}
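For reference, a record like the one above can be unpacked programmatically. A minimal sketch in plain Python (no atproto SDK assumed), which splits the `at://` URI into repo DID, collection NSID, and record key, and collects the blob CIDs of the embedded images; the record dict below is abridged from the JSON above to just the fields used:

```python
# Abridged copy of the app.bsky.feed.post record shown above.
record = {
    "uri": "at://did:plc:baglxmt2hfmzfaddt7a2xj27/app.bsky.feed.post/3lmqqmdht2i2l",
    "value": {
        "$type": "app.bsky.feed.post",
        "embed": {
            "$type": "app.bsky.embed.images",
            "images": [
                {"image": {"ref": {"$link": "bafkreihdhtvhhou4qsw6jcmhprd3duny2e2ujcqbggvx7ycnuptuerpalm"}}},
                {"image": {"ref": {"$link": "bafkreideh4ijld5t5nuphmfwqja3wumrprbqcrgvokldl7fvap3azrmjpu"}}},
            ],
        },
    },
}


def parse_at_uri(uri: str) -> tuple[str, str, str]:
    """Split an at:// URI into (repo DID, collection NSID, record key)."""
    did, collection, rkey = uri.removeprefix("at://").split("/")
    return did, collection, rkey


did, collection, rkey = parse_at_uri(record["uri"])

# Blob CIDs of the two embedded screenshots; the full-size blobs are
# served by the hosting PDS via com.atproto.sync.getBlob.
cids = [img["image"]["ref"]["$link"] for img in record["value"]["embed"]["images"]]
```

With these pieces, the canonical record itself can be fetched from the network via `com.atproto.repo.getRecord` using the same `repo`, `collection`, and `rkey` parameters.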