Titanium Media App reported on September 2 that at the "Large-scale Pre-training Model" theme forum of the World Artificial Intelligence Conference, hosted by Alibaba DAMO Academy, Zhou Jingren, Vice President of DAMO Academy, unveiled Alibaba's latest "Tongyi" large model series and announced that its core models are being open-sourced to developers worldwide.

Tongyi is described as the industry's first unified AI foundation base, organized as a layered system of large and small models, intended to serve as infrastructure for moving AI from perceptual intelligence toward knowledge-driven cognitive intelligence. For example, the M6-OFA model in the Tongyi unified base is a single model that, without introducing any new structures, can handle more than ten single-modal and cross-modal tasks simultaneously, with results reported to be at an internationally leading level. In addition, core models and capabilities in the Tongyi series have already been open-sourced to global developers, including the AliceMind-PLUG large language model, the unified multimodal understanding and generation model AliceMind-mPLUG, the multimodal unified base model M6-OFA, and the S4 framework, a key technology for deploying very large models. The latest text-to-image large model will be opened for public trial in the near future.
Media coverage: Titanium Media, IT Home
Related events
- Alibaba releases the "Tongyi" large model series and open-sources its core models (2022-09-02)
- Alibaba, the Zhiyuan Research Institute, and Tsinghua University jointly release "Wenhui", the largest pre-training AI model in China (2021-01-12)
- Tencent Youtu open-sources the industry's first 3D medical imaging big-data pre-training model (2019-08-07)
- Didi open-sources Delta, a natural language understanding model training platform (2019-08-03)
- DeepMind and Waymo collaborate to improve AI accuracy and accelerate model training (2019-07-26)
This article is reproduced from: https://readhub.cn/topic/8iYXwByGR1W
This site only archives the article; copyright belongs to the original author.