Don't believe it? Here are a few I found at random:
… ) with generative pre-training on large-scale Chinese training … GB Chinese training data, is the largest Chinese pre-trained … CPM is a Transformer-based autoregressive language model…
Cited by 58 Related articles
… , pre-trained in multiple stages with large-scale Chinese and bilingual data. We also report generative models adopted from Transformer … the effectiveness of generative pre-training. …
Cited by 47 Related articles
… , generative, domain-specific, and multimodal pre-trained … Since there are simplified and traditional Chinese tokens in … where the model consists of 12 transformer layers, with the hidden …
Cited by 13 Related articles
… with pre-trained language models. We also release a new large-scale Chinese knowledge … We argue that this is caused by the bias of the pre-trained language model (eg, common …
Cited by 16 Related articles
… that contains the largest Chinese pre-trained dialogue model … and is pre-trained on WDC-Dialogue, including 1.4B Chinese … Transformer-based encoder-decoder model on the …
Cited by 24 Related articles
… First, we adopt a decoder-only transformer architecture, which fits … Our DSGPT is pre-trained on a limited dataset, the Chinese … of pre-trained model, we also train a decoder transformer …
Cited by 8 Related articles
… Due to the randomness of the dropout mechanism in transformers, we can get two different sets of hidden features, and therefore, two different categorical distributions of dialog policy, …
Cited by 45 Related articles
… performances than general pre-trained models on biomedical … -specific generative pre-trained Transformer language model … Transformer language model backbone, and is pre-trained …
Cited by 8 Related articles
Q Zhu, J Luo - Proceedings of the Design Society, 2022 - cambridge.org
… However, current generative design algorithms focus on diagrammatic or spatial concepts … This paper explores the uses of generative pre-trained transformers (GPT) for natural …
Cited by 14 Related articles
… [15] pre-trained on ImageNet … transformer-decoder [62], ie, GPT-2 [54], to develop the contextualized video summary controller for the text-based query embedding, using the pre-trained …
All replies:
To state the facts: these algorithms are all inventions of Google and OpenAI. Which AI algorithm originated in China?
-Bob007-
03/02/2023 14:56:45