I'm more and more confident that tokenization will be gone.
Humans don't think in "tokens".
Tokens are hardcoded abstractions in LLMs that lead to weird behavior: LLMs can solve PhD-level math problems, yet fail at questions like "Is 9.9 > 9.11?"
Meta is shifting from LLMs to LCMs (Large Concept Models).
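The "9.9 vs 9.11" failure is often attributed to how tokenizers split numbers. A toy sketch of the idea (this is a hypothetical splitter, not any real LLM tokenizer): if "9.11" is seen as the pieces "9", ".", "11", then comparing the fractional-part tokens as integers suggests 11 > 9, which invites the wrong answer.

```python
import re

def toy_tokenize(s: str) -> list[str]:
    # Hypothetical BPE-like splitter: digit runs and the dot become separate tokens.
    return re.findall(r"\d+|\.", s)

a, b = "9.9", "9.11"
print(toy_tokenize(a))  # ['9', '.', '9']
print(toy_tokenize(b))  # ['9', '.', '11']

# Token-level view: compare the last tokens as integers -> 11 > 9,
# which wrongly suggests 9.11 is the larger number.
frac_a = int(toy_tokenize(a)[-1])
frac_b = int(toy_tokenize(b)[-1])
print(frac_b > frac_a)  # True: the token view says "9.11 looks bigger"

# Numeric view: the actual comparison.
print(float(a) > float(b))  # True: 9.9 > 9.11
```

Real tokenizers differ in how they segment digits, but the sketch shows why a model reasoning over sub-number tokens can stumble where plain numeric comparison is trivial.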
(twitter.com)
Submitted by Yuchen Jin