LLMs can't conceal information... not only does that imply intent, but a probabilistic token ordering program doesn't "know" facts in the first place. Being able to model "this is what a statement of fact looks like" doesn't require being able to evaluate the truth value of a statement.
May 1, 2025, 7:32 AM
{ "uri": "at://did:plc:3hkhsfgkd6i2as77v37c3fle/app.bsky.feed.post/3lo3s6aclu22i", "cid": "bafyreieplzvphxdbebicntjj6kqwfub4ju64wea6riobkmmzpfwofwcjyq", "value": { "text": "LLMs can't conceal information... not only does that imply intent, but a probabilistic token ordering program doesn't \"know\" facts in the first place. Being able to model \"this is what a statement of fact looks like\" doesn't require being able to evaliate the truth value of a statement.", "$type": "app.bsky.feed.post", "langs": [ "en" ], "reply": { "root": { "cid": "bafyreigawlvxccdvadvzk3fybsrfw3llzfomcqpfu7oeljmsyw7ql3tyrq", "uri": "at://did:plc:cak4klqoj3bqgk5rj6b4f5do/app.bsky.feed.post/3lo3iijwais2b" }, "parent": { "cid": "bafyreihva5xziwlpf2brtundki64nfghueyl35fgek4b3vaa3ocxpibcau", "uri": "at://did:plc:mh776dqe35ssi2lbrk4sdvu3/app.bsky.feed.post/3lo3jlx4cak2j" } }, "createdAt": "2025-05-01T07:32:15.235Z" } }