Experimental browser for the Atmosphere
{ "uri": "at://did:plc:6f3ycjiusbekybdsptant5yu/app.bsky.feed.like/3lewzi2xpgw2c", "cid": "bafyreierifnek63u7mjr5csio6fln33ehrhqaepg4twf7rwmq7uyr66iai", "value": { "$type": "app.bsky.feed.like", "subject": { "cid": "bafyreihdvqesqcelc5oed7szlxupeiin4572mjxlmqxumawwjyn47bvowq", "uri": "at://did:plc:ttucwdyvfigxclycfsp67fcu/app.bsky.feed.post/3lewyfloyts24" }, "createdAt": "2025-01-04T21:12:56.699Z" } }
LLMs are actually stateless: the conversation history is passed back in as input on every call. Chat is only a UI convention for building that context iteratively.
Jan 4, 2025, 8:53 PM
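A minimal sketch of the point above, assuming a generic chat-completion-style API: the hypothetical complete() function below stands in for any stateless endpoint, and the role/content message shape is an illustrative convention, not a specific library's interface. The "chat" is just a list that the client grows and re-sends in full each turn.

# The model keeps no state between calls; `history` is its only "memory".

def complete(messages: list[dict]) -> str:
    """Stand-in for a call to a stateless chat-completion endpoint.
    A real client would send `messages` over the network and return the reply text."""
    return f"(reply based on {len(messages)} prior messages)"

history: list[dict] = [{"role": "system", "content": "You are a helpful assistant."}]

def send(user_text: str) -> str:
    history.append({"role": "user", "content": user_text})
    reply = complete(history)  # the entire accumulated history is sent every turn
    history.append({"role": "assistant", "content": reply})
    return reply

print(send("What is the AT Protocol?"))
print(send("And what is a 'like' record in it?"))  # context from turn 1 rides along automatically

Each call to send() re-transmits everything said so far, which is all the "chat" abstraction really is.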