I think people not understanding the technology is kinda the problem with LLMs, because terms like hallucination just add to the misunderstanding and play into the marketing of LLMs as somehow AI instead of just, like, language models that produce nonsense that sounds like language.
May 15, 2025, 7:37 AM
{
  "uri": "at://did:plc:gfa23j7m2bkw4xr5u65aavoj/app.bsky.feed.post/3lp6yxpybwk2g",
  "cid": "bafyreibdvcz5uo23z4pvqh2hgs7eooqjbyazsyfexeoyg3mqd7zpgslntm",
  "value": {
    "text": "I think people not understanding the technology is kinda the problem with LLMs, because terms like hallucination just add to the misunderstanding and play into the marketing of LLMs as somehow AI instead of just, like, language models that produce nonsense that sounds like language.",
    "$type": "app.bsky.feed.post",
    "langs": ["en"],
    "reply": {
      "root": {
        "cid": "bafyreieiajwnmfaf6f3j735fytvhbeejdgfyii7bejljd4vm6v3rfhz2vy",
        "uri": "at://did:plc:dpsslfnv3cr4c3afuwv33wwy/app.bsky.feed.post/3lp6xxntz6s2a"
      },
      "parent": {
        "cid": "bafyreigpace3ytyv6mcbwua43yyxd4ymd2c3cvmos7iyg2jpz2tdqafkoe",
        "uri": "at://did:plc:dpsslfnv3cr4c3afuwv33wwy/app.bsky.feed.post/3lp6yqrwpys2a"
      }
    },
    "createdAt": "2025-05-15T07:37:03.834Z"
  }
}
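The `uri` fields in the record follow the AT Protocol form `at://<repo DID>/<collection NSID>/<record key>`. A minimal sketch of splitting such a URI into its parts (`parse_at_uri` is a hypothetical helper, not part of any SDK; real clients should use an AT Protocol library):

```python
def parse_at_uri(uri: str) -> dict:
    """Split an at:// URI into repo DID, collection NSID, and record key."""
    if not uri.startswith("at://"):
        raise ValueError("not an at:// URI")
    # Everything after the scheme is repo/collection/rkey, slash-separated.
    repo, collection, rkey = uri[len("at://"):].split("/", 2)
    return {"repo": repo, "collection": collection, "rkey": rkey}

parts = parse_at_uri(
    "at://did:plc:gfa23j7m2bkw4xr5u65aavoj/app.bsky.feed.post/3lp6yxpybwk2g"
)
# parts["collection"] → "app.bsky.feed.post", parts["rkey"] → "3lp6yxpybwk2g"
```

These three components are what a record-fetching endpoint such as `com.atproto.repo.getRecord` takes as its `repo`, `collection`, and `rkey` parameters.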