We find that the presence of #explanations increases reliance on both correct and incorrect LLM responses. This isn't surprising, given prior HCI/Psychology work on other types of explanations, but it raises the question of whether LLMs should provide explanations by default 🤔 4/7
Feb 28, 2025, 3:21 PM
{ "uri": "at://did:plc:cnu74q3fooq4q2ty5knusr72/app.bsky.feed.post/3ljapiymtec2v", "cid": "bafyreievllxjuaqmwrmqbxayhunvy3txbtof2vcrjorbxduwjjozt3ufru", "value": { "text": "We find that the presence of #explanations increases reliance on both correct and incorrect LLM responses. \n\nThis isn't surprising, given prior HCI/Psychology work on other types of explanations, but raises the question whether LLMs should provide explanations by default 🤔\n\n4/7", "$type": "app.bsky.feed.post", "langs": [ "en" ], "reply": { "root": { "cid": "bafyreiexptcssgwkbl6feoptytacsp63rhcpiy6elrjgnce6nwwahemoti", "uri": "at://did:plc:cnu74q3fooq4q2ty5knusr72/app.bsky.feed.post/3ljapisfw4s2v" }, "parent": { "cid": "bafyreidxp65b4hd4ka3supod7d6hwh74ca5jdgtjiupwp6gwhxusxhyfkm", "uri": "at://did:plc:cnu74q3fooq4q2ty5knusr72/app.bsky.feed.post/3ljapiwly622v" } }, "facets": [ { "index": { "byteEnd": 42, "byteStart": 29 }, "features": [ { "tag": "explanations", "$type": "app.bsky.richtext.facet#tag" } ] } ], "createdAt": "2025-02-28T15:21:49.094Z" } }