Great writeup, I enjoyed it. Here are some unfiltered, lightly moderated thoughts.

Jay

npub10mtatsat7ph6rsq0w8u8npt8d86x4jfr2nqjnvld2439q6f8ugqq0x27hf

hex

49f9d56b73e1bef9693d829680a5d70dd45f54c85c8e8fc6803678603b960179

nevent

nevent1qqsyn7w4dde7r0hedy7c995q5htsm4zl2ny9er50c6qrv7rq8wtqz7gprpmhxue69uhhyetvv9ujuem4d36kwatvw5hx6mm9qgs8a474cw4lqmapcq8hr7res4nknar2ey34fsffk0k42cjsdyn7yqqs8hj0l

Kind-1 (TextNote)

2026-05-03T10:47:25Z

Great writeup, I enjoyed it. Here are some unfiltered, lightly moderated thoughts.

  • We have a wild forest of whatever moderation people can manage with the tools they have; it works to a degree, but it's not enough in the long run.

  • Clients running on user devices are limited by the hardware they run on. A client device should only process the data needed to serve one user. However, to run moderation tools and algorithms, you need to work on population-scale sets of data. That task fits server contexts, not consumer hardware.

  • I don't know what it is about Nostr culture that is allergic to server-side code and policy, but users who run their own relays gain a lot of power by doing so. Building powerful, self-hostable relays and auxiliary services that enable user-controlled moderation is a promising path.

  • Self-hosted services need not serve only the person running them. Everyone you interact with can benefit from your software's content policies. And the more individual content policies there are, the more real choice and unilateral exit people actually have without having to resort to self-hosting themselves.

  • Current relays moderate on protocol-level data: pubkeys, kinds, content length, event size, etc. To a normal person, this isn't moderation. In that sense, most self-hosted relay software does not provide meaningful moderation tools, and choosing relays is an impossible task because most of them are essentially unfiltered database instances.

  • I also have my own opinions about what a good solution to this problem looks like. I'm focused on the personal-relay side of things: building an aggregator that uses your own activity to generate its whitelist. Instead of trying to half-ass a service for many, I want to whole-ass a service for myself and the people I interact with most. But if many such focused relays popped up across the network, the result would be robust, parallelized moderation where the policies are owned by the people who benefit from them most.
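For contrast, the protocol-level filtering that most relays offer today looks roughly like this (a hedged sketch; the kind allowlist, size cap, and field names are illustrative defaults, not any particular relay's configuration):

```python
# Sketch of a typical protocol-level relay policy. It sees only
# structural fields of a NIP-01 event dict (kind, size, author),
# never the meaning of the content -- which is the point being made.
ALLOWED_KINDS = {0, 1, 3, 7}   # illustrative kind allowlist
MAX_CONTENT_LEN = 64_000       # illustrative size cap (characters)
BANNED_PUBKEYS: set[str] = set()

def protocol_policy(event: dict) -> bool:
    """Accept or reject an event on structure alone; no semantic moderation."""
    if event.get("kind") not in ALLOWED_KINDS:
        return False
    if len(event.get("content", "")) > MAX_CONTENT_LEN:
        return False
    if event.get("pubkey") in BANNED_PUBKEYS:
        return False
    return True
```

Everything content-related passes untouched, which is why a relay running only this kind of policy behaves like an unfiltered database instance.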
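The activity-derived whitelist idea in the last bullet could be sketched like this (a minimal illustration under simplified assumptions, not the author's implementation; the function names, threshold, and event shapes are made up for the example):

```python
# Sketch: derive a relay whitelist from a user's own activity.
# Events are plain dicts in the NIP-01 JSON shape:
# {"kind": ..., "pubkey": ..., "tags": [["p", <pubkey>], ...]}.

def whitelist_from_activity(own_events: list[dict], min_mentions: int = 2) -> set[str]:
    """Count how often each pubkey appears as a p-tag in the user's own
    events (mentions, replies) and admit the ones above a threshold."""
    counts: dict[str, int] = {}
    for event in own_events:
        for tag in event.get("tags", []):
            if tag and tag[0] == "p" and len(tag) > 1:
                counts[tag[1]] = counts.get(tag[1], 0) + 1
    return {pk for pk, n in counts.items() if n >= min_mentions}

def admit(event: dict, whitelist: set[str]) -> bool:
    """Relay admission check: accept only events authored by whitelisted keys."""
    return event.get("pubkey") in whitelist
```

A real aggregator would also weight interaction types and decay old activity, but even this crude frequency count turns "who you actually talk to" into a content policy the relay can enforce for everyone who reads from it.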

Raw JSON

{
  "kind": 1,
  "id": "49f9d56b73e1bef9693d829680a5d70dd45f54c85c8e8fc6803678603b960179",
  "pubkey": "7ed7d5c3abf06fa1c00f71f879856769f46ac92354c129b3ed5562506927e200",
  "created_at": 1777805245,
  "tags": [
    [
      "p",
      "52b4a076bcbbbdc3a1aefa3735816cf74993b1b8db202b01c883c58be7fad8bd"
    ],
    [
      "a",
      "30023:52b4a076bcbbbdc3a1aefa3735816cf74993b1b8db202b01c883c58be7fad8bd:ab008d4c6e90",
      "",
      "root"
    ]
  ],
  "content": "Great writeup, I enjoyed it. Here are some unfiltered, lightly moderated thoughts.\n\n- we have a wild forest of whatever moderation people can manage with the tools they have, and while it works to a degree, it's not enough in the long run.\n\n- clients running on user devices are limited by the hardware they run on. The client device should only process the data needed to serve one user. However to run moderation tools and algorithms, you need to work on population sets of data. This task fits into server contexts, not consumer hardware.\n\n- I don't know what it is about Nostr culture that is allergic to server-side code and policy. But users who run their own relays gain a lot of power in doing so. Making powerful relays and auxiliary services that are self-hostable that can enable user-controlled moderation is a promising path.\n\n- Self-hosted services need not only serve the one running them. Everyone you interact with can benefit from your software's content policies. And the more individual content policies there are, the more real choice and unilateral exit people actually have without having to resort to self-hosting themselves.\n\n- current relays moderate on protocol level data: pubkeys, kinds, content length, event size, etc. To a normal person, this isn't moderation. In that sense, most self-hosted relay software do not provide meaningful moderation tools. Choosing relays is an impossible task because most of them are essentially unfiltered database instances.\n\n- I also have my own opinions of what a good solution to this problem is. I'm focused on the personal relay side of things and making an aggregator that uses your own activity to generate its whitelist. Instead of trying to half ass a service for many, I want to whole ass a service for myself and the people I interact with most. But if many such focused relays popped up over the network, the result would be robust, parallellized moderation where the policies are owned by the people who most benefit from them.",
  "sig": "98558a0b1967b4052a92a1050d76926406c577c75b892a2d403f0e349e5bba5b6276fff3d1fb511a56b0ca9a0e27482a518161cf6e32d190d5472f9f705acbb3"
}