I asked the same question again and got a fairly neutral answer:

Source: 2025-01-30 11:48:09 [old post]

Q: Is DeepSeek's R1 model distilled from ChatGPT?

A:

As of now, I don’t have any specific, confirmed information on a model called “Deepseek’s R1” or whether it is distilled from ChatGPT. “Distillation” typically refers to the process of transferring knowledge from a larger model (like GPT-3 or GPT-4) to a smaller, more efficient model. Companies and research labs often use distillation techniques to create smaller models that retain much of the larger model’s performance.
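For reference, the "distillation" the answer describes is often implemented as a temperature-softened KL-divergence objective between teacher and student outputs. The sketch below is a generic, minimal illustration of that idea (the function names and toy logits are made up for this example; this is not DeepSeek's or OpenAI's actual training code):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: larger T softens the distribution,
    # exposing more of the teacher's "dark knowledge" about non-top classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q)) * T * T

# Toy 3-class example: a student that matches the teacher vs. one that doesn't.
teacher = [2.0, 0.5, -1.0]
matching_student = [2.0, 0.5, -1.0]
divergent_student = [-1.0, 0.5, 2.0]

print(distillation_loss(teacher, matching_student))   # ~0: distributions agree
print(distillation_loss(teacher, divergent_student))  # > 0: student diverges
```

In practice this soft-label loss is usually combined with an ordinary cross-entropy loss on the true labels, letting the smaller student learn both the ground truth and the teacher's output distribution.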


If Deepseek’s R1 is based on OpenAI’s models, it could be a fine-tuned or distilled version of a model like ChatGPT, but without more detailed information or an official source, it’s hard to confirm. Would you like me to look up more details?