Experimental browser for the Atmosphere
{ "uri": "at://did:plc:t5t2aacwcqk5lkhly6ebinzl/app.bsky.feed.like/3loobdry3ct2e", "cid": "bafyreihuez4gpcmuvt2jhaq2jre26pb3h4os5frlbyn3ekxjbwczsbh554", "value": { "$type": "app.bsky.feed.like", "subject": { "cid": "bafyreid5xxpiyvqcf552ukraynec5o75fvvxq6bh2jmwqzouvt7d7b76wq", "uri": "at://did:plc:vbufq3xwt3233giwk4ulgvpr/app.bsky.feed.post/3lonqtxh43s2a" }, "createdAt": "2025-05-08T15:51:42.821Z" } }
LLMs also don’t really have an ideology; they produce a statistical range of words based on what they were trained on and what you ask them, which makes predicting what they’ll encourage insanely hard.
May 8, 2025, 10:56 AM