AI Foundation Model Support Platform and Evaluation Technology
November 1, 2022
This project takes large-model development as its focus, aiming to advance China's frontier research on ultra-large-scale intelligent models and to promote AI-enabled economic and social development. It builds an internationally leading open-source technology system for foundation models and establishes an open AI innovation ecosystem centered on them.
Publications
We propose AdaInfer, a lightweight algorithm that adaptively early-exits LLM inference based on statistical features. Without modifying the model, it reduces computation by up to 43% with less than 1% performance degradation.
      Siqi Fan, 
      Xin Jiang, 
      Xuying Meng, 
      Peng Han, 
      Shuo Shang, 
      Aixin Sun, 
      Yequan Wang
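A minimal sketch of the early-exit idea behind this line of work: check a cheap per-layer statistic and stop forward computation once the prediction looks stable. The "gap" feature (top-1 minus top-2 probability) and the threshold below are illustrative assumptions, not AdaInfer's actual statistical classifier.

```python
import numpy as np

def should_exit(logits, gap_threshold=0.8):
    # Toy exit signal: softmax "gap" between the top-1 and top-2
    # probabilities. A large gap suggests the prediction has stabilized.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    top2 = np.sort(probs)[-2:]
    return (top2[1] - top2[0]) >= gap_threshold

# Hypothetical per-layer logits for one token of a 3-class toy model:
# deeper layers grow more confident, so inference can stop early.
layer_logits = [np.array([0.1, 0.2, 0.15]),   # near-uniform -> keep going
                np.array([0.2, 2.5, 0.1]),    # confident, below threshold
                np.array([0.1, 6.0, 0.2])]    # very confident -> exit
exit_layer = next((i for i, lg in enumerate(layer_logits) if should_exit(lg)),
                  len(layer_logits) - 1)
```

In a real decoder, `should_exit` would run after each Transformer block and skip the remaining layers, which is where the computation savings come from.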
We propose the Few-Shot Detector (FSD), which learns a metric space in which unseen forged images can be identified from only a few samples. It improves accuracy by 11.6% without retraining while maintaining strong generalization.
      Shiyu Wu, 
      Jing Liu, 
      Jing Li, 
      Yequan Wang
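A minimal sketch of metric-space few-shot classification, the general family FSD belongs to: embed the few support samples, average them into per-class prototypes, and label a query by its nearest prototype. The 2-D embeddings and class names below are toy assumptions; FSD's learned embedding network is not shown.

```python
import numpy as np

def nearest_prototype(query, support, labels):
    # Prototype = mean embedding of each class's few support samples;
    # the query takes the label of the closest prototype.
    classes = sorted(set(labels))
    protos = {c: np.mean([s for s, l in zip(support, labels) if l == c], axis=0)
              for c in classes}
    return min(classes, key=lambda c: np.linalg.norm(query - protos[c]))

# Toy 2-D "embeddings": two real images, two forged images.
support = [np.array([0.0, 0.0]), np.array([0.2, 0.1]),   # class "real"
           np.array([1.0, 1.0]), np.array([0.9, 1.1])]   # class "fake"
labels = ["real", "real", "fake", "fake"]
label = nearest_prototype(np.array([0.95, 0.9]), support, labels)
```

Because classification reduces to distances in the learned space, new forgery types can be handled by adding a few support samples, with no retraining.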
        As scaling laws underscore the potential of increasing model sizes, the academic community has intensified its investigations into LLMs with capacities exceeding 50 billion parameters. This technical report builds on our prior work with Tele-FLM (also known as FLM-2), a publicly available 52-billion-parameter model.
      Xiang Li, 
      Yiqun Yao, 
      Xin Jiang, 
      Xuezhi Fang, 
      China Telecom, 
      Yequan Wang, 
      Zhongjiang He, 
      Zhongyuan Wang, 
      Xuelong Li, 
      Tiejun Huang
To lower the computational cost of training large models, we focus on speeding up pre-training by progressively growing from a small Transformer structure to a large one.
      Yiqun Yao, 
      Zheng Zhang, 
      Jing Li, 
      Yequan Wang
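A minimal sketch of one simple growth scheme for widening a trained weight matrix: copy the small model's weights into the top-left block and zero-initialize the new rows and columns. This preserves the old function on inputs whose new dimensions are zero; it is an illustrative operator, not necessarily the paper's exact growth method.

```python
import numpy as np

def grow_linear(W, new_out, new_in):
    # Widen a (out, in) weight matrix: trained weights occupy the
    # top-left block, new rows/columns start at zero so the grown
    # layer initially behaves like the small one on padded inputs.
    W_big = np.zeros((new_out, new_in))
    W_big[:W.shape[0], :W.shape[1]] = W
    return W_big

W_small = np.ones((2, 2))   # stand-in for a trained small-model layer
W_big = grow_linear(W_small, 4, 3)
```

Training then continues from `W_big`, so the large model starts from the small model's knowledge instead of from scratch, which is where the pre-training speedup comes from.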