—————-
prenwxc commented on 2025-01-29 06:00:00
It is "distill," not "distall." "OpenAI found evidence of “distillation,” which it believes came from DeepSeek. Distillation is a process where AI firms use an already trained large AI model to train smaller models. The “student” models will match similar results to the “teacher” AI in specific tasks."
Reply to a comment by 令胡衝 -----------
DeepSeek did not "Distall" ChatGPT. Period. It could not "distall" it anyway: none of OpenAI's models or data from the past four years has ever been made available. DeepSeek can only distill its own models to fine-tune other, smaller models.
-----------------
——————-
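The excerpt quoted above describes distillation as using a large, already trained "teacher" model to train a smaller "student" model that imitates its outputs. Below is a minimal sketch of that idea, assuming PyTorch and the standard soft-label (KL-divergence) distillation loss; the tiny models, vocabulary size, temperature, and random token batch are all made-up placeholders for illustration, not a description of how any particular vendor actually trains or distills its models.

```python
# Minimal knowledge-distillation sketch (hypothetical models and data):
# a small "student" is trained to match the softened output distribution
# of a larger, frozen "teacher".
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab_size = 1000

# Stand-in "teacher": pretend this is a large, already trained model.
teacher = nn.Sequential(nn.Embedding(vocab_size, 512), nn.Linear(512, vocab_size))
teacher.eval()  # frozen; only used to produce target distributions

# Much smaller "student" that is actually being trained.
student = nn.Sequential(nn.Embedding(vocab_size, 64), nn.Linear(64, vocab_size))
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

T = 2.0                                         # temperature softens the teacher's distribution
tokens = torch.randint(0, vocab_size, (32,))    # toy batch of token ids

for step in range(100):
    with torch.no_grad():
        teacher_logits = teacher(tokens)        # teacher's predictions, no gradients
    student_logits = student(tokens)
    # KL divergence between the softened teacher and student distributions,
    # the classic soft-label distillation loss.
    loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In a real setting the teacher's logits or generated text would come from the larger model and the student would see far more data; the temperature simply controls how much of the teacher's full distribution, beyond its top answer, the student learns to reproduce.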
prenwxc commented on 2025-01-29 05:09:28
DeepSeek is itself built on top of open-source LLMs; OpenAI, Meta, and others are all contributors. DeepSeek used distillation to learn from other models, and used other models in reinforcement learning. Someone has already tried asking DeepSeek who it is ("who are you"), and its answer was that it is one hundred percent Microsoft. That shows how deep the influence of Copilot/ChatGPT runs.
Reply to a comment by 令胡衝 -----------
It collects datasets from everywhere. GPT-generated datasets are only a small part - no different f...
-----------------