Don't believe it? Here are a few I found with a casual search:

Source: Bob007, 2023-03-02 13:53:50

CPM: A large-scale generative Chinese pre-trained language model

Z Zhang, X Han, H Zhou, P Ke, Y Gu, D Ye, Y Qin, Y Su… - AI Open, 2021 - Elsevier
… ) with generative pre-training on large-scale Chinese training … GB Chinese training data, is the largest Chinese pre-trained … CPM is a Transformer-based autoregressive language model…
Cited by 58

Cpt: A pre-trained unbalanced transformer for both chinese language understanding and generation

Y Shao, Z Geng, Y Liu, J Dai, H Yan, F Yang… - arXiv preprint arXiv …, 2021 - arxiv.org
… , pre-trained in multiple stages with large-scale Chinese and bilingual data. We also report generative models adopted from Transformer … the effectiveness of generative pre-training. …
Cited by 47

Mengzi: Towards lightweight yet ingenious pre-trained models for chinese

Z Zhang, H Zhang, K Chen, Y Guo, J Hua… - arXiv preprint arXiv …, 2021 - arxiv.org
… , generative, domain-specific, and multimodal pre-trained … Since there are simplified and traditional Chinese tokens in … where the model consists of 12 transformer layers, with the hidden …
Cited by 13

From discrimination to generation: knowledge graph completion with generative transformer

X Xie, N Zhang, Z Li, S Deng, H Chen, F Xiong… - … Proceedings of the …, 2022 - dl.acm.org
… with pre-trained language models. We also release a new large-scale Chinese knowledge … We argue that this is caused by the bias of the pre-trained language model (e.g., common …
Cited by 16

Eva: An open-domain chinese dialogue system with large-scale generative pre-training

H Zhou, P Ke, Z Zhang, Y Gu, Y Zheng, C Zheng… - arXiv preprint arXiv …, 2021 - arxiv.org
… that contains the largest Chinese pre-trained dialogue model … and is pre-trained on WDC-Dialogue, including 1.4B Chinese … Transformer-based encoder-decoder model on the …
Cited by 24

DSGPT: Domain-specific generative pre-training of transformers for text generation in e-commerce title and review summarization

X Zhang, Y Jiang, Y Shang, Z Cheng, C Zhang… - Proceedings of the 44th …, 2021 - dl.acm.org
… First, we adopt a decoder-only transformer architecture, which fits … Our DSGPT is pre-trained on a limited dataset, the Chinese … of pre-trained model, we also train a decoder transformer …
Cited by 8

Galaxy: A generative pre-trained model for task-oriented dialog with semi-supervised learning and explicit policy injection

W He, Y Dai, Y Zheng, Y Wu, Z Cao, D Liu… - Proceedings of the …, 2022 - ojs.aaai.org
… Due to the randomness of the dropout mechanism in transformers, we can get two different sets of hidden features, and therefore, two different categorical distributions of dialog policy, …
Cited by 45

BioGPT: generative pre-trained transformer for biomedical text generation and mining

R Luo, L Sun, Y Xia, T Qin, S Zhang… - Briefings in …, 2022 - academic.oup.com
… performances than general pre-trained models on biomedical … -specific generative pre-trained Transformer language model … Transformer language model backbone, and is pre-trained …
Cited by 8

Generative pre-trained transformer for design concept generation: an exploration

Q Zhu, J Luo - Proceedings of the Design Society, 2022 - cambridge.org
… However, current generative design algorithms focus on diagrammatic or spatial concepts … This paper explores the uses of generative pre-trained transformers (GPT) for natural …
Cited by 14

Gpt2mvs: Generative pre-trained transformer-2 for multi-modal video summarization

JH Huang, L Murn, M Mrak, M Worring - Proceedings of the 2021 …, 2021 - dl.acm.org
… [15] pre-trained on ImageNet … transformer-decoder [62], i.e., GPT-2 [54], to develop the contextualized video summary controller for the text-based query embedding, using the pre-trained …

All replies:

To state the facts: these algorithms were all invented by Google and OpenAI. Which AI algorithm is originally Chinese? -Bob007- 03/02/2023 14:56:45