Yequan Wang's Academic Homepage
Large Models
Masked Structural Growth for 2x Faster Language Model Pre-training
To lower the computational cost of training large models, we focus on speeding up pre-training by progressively growing a small Transformer structure into a large one. A minimal sketch of the mask-gated growth idea follows this entry.
Yiqun Yao, Zheng Zhang, Jing Li, Yequan Wang
PDF
Cite
Code
Project
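As an illustration of how structural growth can preserve a partially trained model's function, here is a minimal PyTorch sketch of mask-gated depth growth. The names `MaskedBlock`, `grow_depth`, and `anneal_masks` are illustrative assumptions, a simplified version of the idea rather than the paper's actual implementation.

```python
import torch
import torch.nn as nn

class MaskedBlock(nn.Module):
    """Wraps a newly grown block: with mask == 0 the wrapper acts as an
    identity map, so inserting it leaves the model's function unchanged;
    the mask is then annealed toward 1 during continued training.
    (Illustrative only: a real Transformer block carries its own
    residual connections and normalization.)"""
    def __init__(self, block: nn.Module):
        super().__init__()
        self.block = block
        self.register_buffer("mask", torch.zeros(1))  # start fully masked

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.mask * self.block(x)

def grow_depth(layers: nn.ModuleList, new_block: nn.Module, index: int) -> None:
    """Insert a masked block at `index`; because the mask starts at 0,
    the network's output is unchanged at the moment of growth."""
    layers.insert(index, MaskedBlock(new_block))

def anneal_masks(model: nn.Module, step: int, ramp_steps: int) -> None:
    """Linearly ramp every growth mask from 0 to 1 over ramp_steps."""
    value = min(1.0, step / ramp_steps)
    for module in model.modules():
        if isinstance(module, MaskedBlock):
            module.mask.fill_(value)
```

The point of the gating is that growth never discards what the small model has already learned: new capacity is switched in gradually while the existing parameters keep training.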
52B to 1T: Lessons Learned via Tele-FLM Series
As scaling laws underscore the potential of increasing model sizes, the academic community has intensified its investigations into LLMs with capacities exceeding 50 billion parameters. This technical report builds on our prior work with Tele-FLM (also known as FLM-2), a publicly available 52-billion-parameter model.
Xiang Li, Yiqun Yao, Xin Jiang, Xuezhi Fang, China Telecom, Yequan Wang, Zhongjiang He, Zhongyuan Wang, Xuelong Li, Tiejun Huang
PDF
Cite
Project
Tele-FLM-1T
Tele-FLM
FLM Series of Large Models
The FLM series is a family of large models led by the Cofe-AI team, comprising FLM-2 (Tele-FLM-1T, Tele-FLM-52B, FLM-2-52B-Instruct, etc.), FLM-101B, and FreeLM. Its core technologies include model growth, loss prediction, and the FreeLM framework; a minimal sketch of the loss-prediction idea follows this entry.
PDF
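To illustrate the loss-prediction technique in its general form, here is a minimal sketch that fits a power-law scaling curve to small pilot runs and extrapolates to the target size. The functional form and all numbers below are illustrative assumptions, not figures from the FLM reports.

```python
import numpy as np
from scipy.optimize import curve_fit

def scaling_law(n_params, a, alpha, c):
    """Power-law loss curve: L(N) = a * N^(-alpha) + c."""
    return a * np.power(n_params, -alpha) + c

# Hypothetical (parameter count, final loss) pairs from small pilot runs.
n = np.array([1e8, 3e8, 1e9, 3e9])
loss = np.array([3.09, 2.92, 2.76, 2.63])

# Fit the three coefficients on the pilot data.
(a, alpha, c), _ = curve_fit(scaling_law, n, loss, p0=[10.0, 0.1, 1.5])

# Extrapolate to a 52B-parameter target before committing compute.
print(f"predicted loss at 52B params: {scaling_law(52e9, a, alpha, c):.3f}")
```

The practical value of such a fit is a sanity check before a large run: if the actual training loss of the big model drifts far from the predicted curve, the configuration is likely wrong.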
Arabic Large Language Model (ALM 1.0)
We built and open-sourced the Arabic Large Language Model (ALM 1.0).
Code
Dataset
Cite