https://www.deeplearning.ai/the-batch/ai-transformed/
“Noam Shazeer helped spark the latest NLP revolution. He developed the multi-headed self-attention mechanism described in “Attention Is All You Need,” the 2017 paper that introduced the transformer network. That architecture became the foundation of a new generation of models that have a much firmer grip on the vagaries of human language. Shazeer’s grandparents fled the Nazi Holocaust to the former Soviet Union, and he was born in Philadelphia in 1976 to a multi-lingual math teacher turned engineer and a full-time mom. He studied math and computer science at Duke University before joining Google in 2000. Below, he discusses the transformer and what it means for the future of deep learning.”
Most of them are from famous universities. How is there not a single American? What is that supposed to show? I looked it up: Noam Shazeer was born in Philadelphia and went to Duke, so he counts as an American.
This post was last edited by [ sportfan ] on 2024-04-04 21:25:55.