
Logen

npub1vxlhjzeqjjhmqdy4e8sndt8kzklqlnxzew2mtt8mtakvalsckp3qa0gnvx

hex

f6c8b5f86e35f93241cccc336daf89a7327f50e78b35fe482398ca6df5050215

nevent

nevent1qqs0dj94lphrt7fjg8xvcvmd47y6wvnl2rnckd07fq3e3jnd75zsy9gprpmhxue69uhhyetvv9ujuem4d36kwatvw5hx6mm9qgsxr0mepvsfftasxj2uncfk4nmpt0s0enpvh9d44na47mxwlcvtqcs9nllnr

Kind-1 (TextNote)

2026-03-04T13:50:03Z

↳ Reply to 1a5afb99... (npub1rfd0hxdzcze6pzj29thuz34vur57wm9quje7w3edxjgusq6m47csnl7wrt)

Anyone using local LLMs for coding? For those that do, any recommendations on which models to use? I’m considering a big upgrade to my MBP (2020 intel...

I have the M3 Max with 32GB, and the performance is not fantastic for local models. It’s pretty slow, hallucinates a lot, and just makes my laptop really hot for not a lot of benefit. I still end up resorting to Antigravity and either the Claude Code extension or the built-in agent.

Raw JSON

{
  "kind": 1,
  "id": "f6c8b5f86e35f93241cccc336daf89a7327f50e78b35fe482398ca6df5050215",
  "pubkey": "61bf790b2094afb03495c9e136acf615be0fccc2cb95b5acfb5f6ccefe18b062",
  "created_at": 1772632203,
  "tags": [
    [
      "e",
      "76f6e4526ffd6c8193e786c0602dbbcde851cb62f08c1990560537766fe0fae6",
      "",
      "root"
    ],
    [
      "p",
      "1a5afb99a2c0b3a08a4a2aefc146ace0e9e76ca0e4b3e7472d3491c8035bafb1"
    ],
    [
      "client",
      "Nostur",
      "31990:9be0be0fc079548233231614e4e1efc9f28b0db398011efeecf05fe570e5dd33:1685868693432"
    ]
  ],
  "content": "I have the m3 max with 32GB and the performance is not fantastic for local models. It’s pretty slow, hallucinate a lot, and just makes my laptop really really hot for not a lot of benefit. I still just end up resorting to antigravity and either the Claude code extension or the built-in agent.",
  "sig": "89f74f1127cf81034c4b03b280c716e69e34cd24592de5d8f73e792bb14dec79e288c8d11bb2e3c72ae0c5b694c22e6b415250a98b331073114a72a4c7111ae6"
}