this post was submitted on 26 Aug 2024
22 points (95.8% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 1 year ago
[–] [email protected] 9 points 3 months ago (1 children)

Calling it now: codepoint-level non-tokenizing input, with a remapping step that recognizes only the few thousand most popular codepoints, would outperform what OpenAI has forced themselves into using. The evidence is circumstantial but strong; for example, arithmetic isn't learned correctly because BPE tokenizers obscure Arabic digits. They can't backpedal on this without breaking parts of their API and re-pretraining a model, and they make a big deal of how expensive GPT pretraining is, so they're stuck in their local minimum.
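A minimal sketch of the remapping step the commenter describes (the function names, toy corpus, and vocabulary size are hypothetical, not anything OpenAI ships): every codepoint gets its own id, digits included, and anything outside the top-N codepoints collapses to a single unknown id.

```python
from collections import Counter

def build_codepoint_vocab(corpus, max_size=4096):
    """Keep the max_size most frequent codepoints in the corpus.
    Id 0 is reserved for unknown codepoints."""
    counts = Counter(ch for text in corpus for ch in text)
    return {ch: i + 1 for i, (ch, _) in enumerate(counts.most_common(max_size))}

def encode(text, vocab):
    # Unlike BPE, "123" always becomes exactly three ids, one per digit,
    # so digit structure is never obscured by merge rules.
    return [vocab.get(ch, 0) for ch in text]

vocab = build_codepoint_vocab(["12345 + 678 = 13023", "hello world"])
ids = encode("123", vocab)  # one id per codepoint
```

Note the contrast with BPE, where "123" might be a single token but "1234" two tokens with an arbitrary split, which is the digit-obscuring problem the comment points at.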

[–] [email protected] 6 points 2 months ago (1 children)

But then it can't SolidGoldMagikarp SolidGoldMagikarp SolidGoldMagikarp SolidGoldMagikarp

[–] [email protected] 4 points 2 months ago

The only viable use case, in my opinion, is to utilise its strong abilities in SolidGoldMagikarp to actualise our goals in the SolidGoldMagikarp sector and achieve increased margins on SolidGoldMagikarp.