Yequan Wang (王业全)

Researcher, Team Leader

Beijing Academy of Artificial Intelligence (BAAI)


Dr. Yequan Wang is a researcher at the Beijing Academy of Artificial Intelligence (BAAI). His research focuses on large models. He leads the Recognition Team, which aims to build lower-cost yet more powerful large models and to advance large-model research, with the ultimate goal of achieving AGI.

Dr. Wang received his Ph.D. in Computer Science from Tsinghua University, supervised by Profs. Xiaoyan Zhu and Minlie Huang. From September 2017 to September 2018, he studied at Nanyang Technological University as a joint Ph.D. candidate, supervised by Associate Prof. Aixin Sun, who also serves as Assistant Chair (Academic).

Dr. Wang was recognized as a 2022 AI 2000 Most Influential Scholar Honorable Mention in Natural Language Processing.

Team Philosophy for Large Models

Both system capabilities and research capabilities are essential. Without system capabilities, it is impossible to build large models; without research capabilities, one can only follow in the footsteps of others. When the leaders in large model development choose to close their source, followers cannot make further breakthroughs.

Google Scholar Citations: 3,000+

ORCID: 0000-0001-7530-6125

  • Large Model
  • Embodied AI
  • Natural Language Processing
  • PhD in Artificial Intelligence, 2014-2019

    Tsinghua University, Beijing

  • Joint PhD Program, 2017-2018

    Nanyang Technological University, Singapore


FLM Family
The FLM series is a family of large models developed by the Cofe-AI team, including FLM-2, FLM-101B, and FreeLM. Its core technologies include model growth techniques, loss prediction, and the FreeLM framework, among others.
Arabic Language Model (ALM 1.0)
We developed and open-sourced the Arabic Language Model (ALM).
Implicit Sentiment Analysis on Complicated Web Text
National Science Foundation of China (NSFC, 62106249)

Recent Publications

(2022). CORT: A New Baseline for Comparative Opinion Classification by Dual Prompts. In Findings of EMNLP 2022.

(2022). CofeNet: Context and Former-Label Enhanced Net for Complicated Quotation Extraction. In COLING 2022.

(2022). Packet Representation Learning for Traffic Classification. In KDD 2022.

(2022). Interactive Information Extraction by Semantic Information Graph. In IJCAI 2022.

(2022). A Dual-Channel Framework for Sarcasm Recognition by Detecting Sentiment Conflict. In Findings of NAACL 2022.

(2019). Path Travel Time Estimation using Attribute-related Hybrid Trajectories Network. In CIKM 2019.


(2019). Aspect-level Sentiment Analysis using AS-Capsules. In WWW 2019.

(2018). Sentiment Analysis by Capsules. In WWW 2018.

(2016). Attention-based LSTM for Aspect-level Sentiment Classification. In EMNLP 2016.


Excellent Ph.D. Graduate, Department of Computer Science, Tsinghua University