NVDA is in danger!

Source: 2025-11-04 12:48:44

CALM, the new AI work from Tsinghua and Tencent engineers, directly upends today's LLMs

Robert Youssef (@rryssf_)

Holy shit... this might be the next big paradigm shift in AI.

Tencent + Tsinghua just dropped a paper called Continuous Autoregressive Language Models (CALM) and it basically kills the “next-token” paradigm every LLM is built on.

Instead of predicting one token at a time, CALM predicts continuous vectors that represent multiple tokens at once.

Meaning: the model doesn’t think “word by word”… it thinks in ideas per step.
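The core idea can be sketched in a few lines. This is my own toy illustration, not the paper's actual autoencoder: pretend an "encoder" turns each chunk of K tokens into one continuous vector, so an autoregressive model over those vectors takes len/K steps instead of one step per token.

```python
import numpy as np

K = 4                 # tokens compressed per continuous vector (paper reports ~4)
VOCAB, DIM = 100, 8   # toy sizes, chosen arbitrarily for this sketch
rng = np.random.default_rng(0)
embed = rng.normal(size=(VOCAB, DIM))    # stand-in token embedding table

def encode_chunk(token_ids):
    """Toy 'encoder': pack K token embeddings into one continuous vector."""
    return embed[token_ids].reshape(-1)  # shape (K*DIM,)

tokens = rng.integers(0, VOCAB, size=32)            # a 32-token sequence
chunks = tokens.reshape(-1, K)                      # 8 chunks of 4 tokens
latents = np.stack([encode_chunk(c) for c in chunks])

# A next-token LM would take 32 prediction steps here;
# a CALM-style model predicts one latent vector per step:
print(len(tokens), "tokens ->", len(latents), "autoregressive steps")
```

Same sequence, a quarter of the prediction steps — that is where the headline speedup comes from.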

Here’s why that’s insane:

→ 4× fewer prediction steps (each vector = ~4 tokens)
→ 44% less training compute
→ No discrete vocabulary: pure continuous reasoning
→ New metric (BrierLM) replaces perplexity entirely
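On that last bullet: BrierLM builds on the classical Brier score, the squared error between a predicted distribution and the one-hot outcome. A minimal sketch of the underlying score (my own illustration, not the paper's exact estimator):

```python
def brier_score(probs, outcome):
    # Multi-class Brier score: squared error between the predicted
    # distribution and the one-hot actual outcome. Lower is better;
    # a perfectly confident correct prediction scores 0.
    return sum((p - (1.0 if c == outcome else 0.0)) ** 2
               for c, p in enumerate(probs))

# Confident correct forecast vs. a uniform guess over 4 classes:
print(brier_score([1.0, 0.0, 0.0, 0.0], 0))      # 0.0
print(brier_score([0.25, 0.25, 0.25, 0.25], 0))  # 0.75
```

Unlike perplexity, a score like this can be estimated from model samples alone, which matters once there is no softmax giving you token likelihoods.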

They even built a new energy-based transformer that learns without softmax: no token sampling, no vocab ceiling.

It’s like going from speaking Morse code… to streaming full thoughts.
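To make "no softmax" concrete, here is a toy sketch of energy-based selection — again my own illustration, not the paper's head: instead of normalizing scores over a fixed vocabulary, an energy function rates how well a candidate continuous vector fits the context, and generation favors low energy.

```python
import numpy as np

rng = np.random.default_rng(1)
DIM = 8
W = rng.normal(size=(DIM, DIM))   # stand-in for learned energy parameters

def energy(context, z):
    # Lower energy = better fit between the context vector and candidate z.
    # No normalization over a vocabulary is ever computed.
    return float(np.sum((context - W @ z) ** 2))

context = rng.normal(size=DIM)
candidates = rng.normal(size=(16, DIM))   # proposals for the next vector
best = min(candidates, key=lambda z: energy(context, z))
```

Because nothing is normalized over a token set, the output space is continuous — that is the "no vocab ceiling" point.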

If this scales, every LLM today is obsolete.