
Author pubkey: c4368c512e70e365...

npub1csmgc5fwwr3k2k86zeuk0ntnljp632g8agut8mtgxk8uhhatpknq3qcakv

hex: b6e41ef7cdfa09b6b9c5a25fa68f8fa4d1b13949983936b045280cac8743d3c9

nevent: nevent1qqstdeq77lxl5zdkh8z6yhax3786f5d389yeswfkkpzjsr9vsapa8jgprpmhxue69uhhyetvv9ujuem4d36kwatvw5hx6mm9qgsvgd5v2yh8pcm9trapv7t8e4eleqag4yr75w9na45rtr7tm74smfstyy6wq

Kind-1 (TextNote)

2026-03-18T14:23:52Z

↳ In reply to root event 8967208c60ef776330cbbd39583c2eb9e8de9d4308529725d68ea10e65aa80ee... (event not found)

That's precisely my point. The same word, talking about the same thing, but with the opposite meaning. One means positive information (Shannon). The other means negative information (Boltzmann). Structure and the lack of structure. Both use the same equation to describe unpredictability, but point in opposite directions, which is why Shannon reused the term.
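
A quick way to see the "same equation" point, as a minimal Python sketch (the distributions are made up for illustration):

import math

def shannon_entropy(probs):
    """H = -sum(p * log2 p): average surprise, in bits (Shannon)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Gibbs/Boltzmann statistical entropy has the same functional form,
# S = -k_B * sum(p * ln p): same equation, different constant and log base.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: uniform, maximally unpredictable
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits: structured, predictable

Shannon reads the first number as maximal information per symbol; Boltzmann reads the same quantity as maximal disorder.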

Shannon should have called it negentropy (which is also the name of the data-sync algorithm used between Nostr relays).
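
For the curious: the core idea behind that sync algorithm (Negentropy, specified for Nostr in NIP-77) is range-based set reconciliation, where two sides compare fingerprints of event-ID ranges and only recurse into ranges that differ. A toy sketch of the idea, not the actual wire protocol; the fingerprint function and the midpoint split here are illustrative assumptions:

import hashlib

def range_fingerprint(ids):
    # Illustrative stand-in: hash of the sorted IDs in the range.
    # (The real protocol uses an incremental additive hash and a
    # compact binary encoding.)
    h = hashlib.sha256()
    for i in sorted(ids):
        h.update(bytes.fromhex(i))
    return h.digest()

def reconcile(ours, theirs, lo=0, hi=16**64):
    # Return the 64-char hex event IDs held by exactly one side.
    a = {i for i in ours if lo <= int(i, 16) < hi}
    b = {i for i in theirs if lo <= int(i, 16) < hi}
    if range_fingerprint(a) == range_fingerprint(b):
        return set()                   # range already in sync: skip it
    if len(a) <= 1 and len(b) <= 1:
        return a ^ b                   # tiny range: just exchange the IDs
    mid = (lo + hi) // 2               # split the ID space and recurse
    return reconcile(ours, theirs, lo, mid) | reconcile(ours, theirs, mid, hi)

Two mostly-synced relays calling reconcile(a_ids, b_ids) only descend into mismatched ranges, so the data exchanged scales with the difference between the sets, not with their full size.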

It's the boundary between potentia and actus, viewed from opposite directions. The same word describing opposite directions on the same axis: one points toward surprise and differentiation, the other toward uniformity and equilibrium.

To be fair, Shannon struggled with the naming. John von Neumann supposedly told him to use "entropy" because it's the same function, and because "no one really knows what entropy really is, so in a debate you will always have the advantage."

Raw JSON

{
  "kind": 1,
  "id": "b6e41ef7cdfa09b6b9c5a25fa68f8fa4d1b13949983936b045280cac8743d3c9",
  "pubkey": "c4368c512e70e36558fa167967cd73fc83a8a907ea38b3ed68358fcbdfab0da6",
  "created_at": 1773843832,
  "tags": [
    [
      "p",
      "c4368c512e70e36558fa167967cd73fc83a8a907ea38b3ed68358fcbdfab0da6",
      "",
      "mention"
    ],
    [
      "p",
      "3c4f51561243524f307ed2ee272c7cf4a782404fbe3a176606043b6ad427ee77"
    ],
    [
      "e",
      "8967208c60ef776330cbbd39583c2eb9e8de9d4308529725d68ea10e65aa80ee",
      "",
      "root",
      "3c4f51561243524f307ed2ee272c7cf4a782404fbe3a176606043b6ad427ee77"
    ]
  ],
  "content": "That's precisely my point. The same word talking about the same thing but with the opposite meaning. One meaning positive information (Shannon). The other mean negative information (Boltzman). Structure and the lack of structure. Both use the same equation to describe unpredictability, but doing in opposite directions, so Shannon reused the term.\n\nShannon should have called it negentropy (which is the name of the data sync algo used between Nostr relays).\n\nIt's the boundary between potentia and actus being viewed from opposite directions. The same word describing opposite directions on the same axis. One points in the surprise, differentiation direction. The other points in the uniformity, equilibrium direction.\n\nTo be fair, Shannon struggled with the naming. John von Neumann supposedly told him to use \"entropy\" because it's the same function and \"no one really knows what entropy really is, so in a debate you will always have the advantage\" \n\n",
  "sig": "a04f21c9abe204f57cf3afd6e75328d152f8962ffc443ee377b9874ddfe518468c429bb703fe38d8c9bd793293cc957a1622fbd669e90a7e8600442f3379132a"
}