Original link: https://blog.xiaoz.org/archives/18556
ChatGLM-6B is an open-source, bilingual (Chinese-English) dialogue language model based on the General Language Model (GLM) architecture, with 6.2 billion parameters. Optimized with techniques similar to those used for ChatGPT, it was trained on roughly 1T tokens of Chinese and English text, supplemented by supervised fine-tuning, feedback bootstrapping, and reinforcement learning from human feedback (RLHF). ChatGLM-6B was developed by Tsinghua University.
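For readers who want to try the model, below is a minimal sketch of loading it with Hugging Face Transformers, assuming the publicly released THUDM/chatglm-6b checkpoint, a CUDA-capable GPU, and the chat() helper the checkpoint ships with; refer to the official repository for exact dependency versions and lower-memory (quantized) options.

```python
# Minimal usage sketch for ChatGLM-6B (assumes the THUDM/chatglm-6b checkpoint and a GPU).
from transformers import AutoTokenizer, AutoModel

# trust_remote_code is required because the checkpoint bundles custom model code.
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
model = model.eval()

# Multi-turn dialogue: pass the running history back in on each call.
response, history = model.chat(tokenizer, "Hello, please introduce yourself.", history=[])
print(response)

response, history = model.chat(tokenizer, "What can you help me with?", history=history)
print(response)
```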