Experimental browser for the Atmosphere
{ "uri": "at://did:plc:zu7x2erjllke75nfzs6xxhiz/app.bsky.feed.like/3lok7jzvmxa2y", "cid": "bafyreihx36jvppifiere5jjyddwo7xgdanimzdt3pioekjtd7exjkjxqvy", "value": { "$type": "app.bsky.feed.like", "subject": { "cid": "bafyreigzhppna7vgi4cvahfolhm7mpxgm2gzvoftiv4dmhylag7tbxbqeu", "uri": "at://did:plc:o7xt7svg2xtjbb4e2xqahqqc/app.bsky.feed.post/3lojyoq3wqk2u" }, "createdAt": "2025-05-07T01:08:45.518Z" } }
What chatbots do - and I'm not saying this as a hater; this is not a bug that can be fixed, it's the actual underlying technology - is generate the likeliest text to follow the prompt they were given. That's it. The prompt isn't *instructions.* It's just undifferentiated input data.
May 6, 2025, 11:06 PM
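The post's point can be made concrete in code: generation is repeated next-token prediction over one flat token sequence, so "instructions" and "data" arrive identically. A minimal sketch, assuming the Hugging Face transformers library and the small gpt2 checkpoint (both illustrative choices, not anything the post names):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative model choice; any causal LM behaves the same way here.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Whatever we call "instructions" is just more text in the same
# undifferentiated stream of input token IDs.
prompt = "You must answer honestly. The capital of France is"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

# Generation = repeatedly picking a likely next token and appending it
# (greedy here; sampling changes the choice rule, not the mechanism).
output_ids = model.generate(input_ids, max_new_tokens=10, do_sample=False)
print(tokenizer.decode(output_ids[0]))
```

Nothing in this loop parses or obeys the prompt; the model only extends the sequence with statistically likely continuations, which is the claim the post is making.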