
Deep Dive: Knowledge Atlas (HKEX: 2513) — The GLM Architect and China’s AGI Race

By: Finterra

As of February 10, 2026 | For Finterra.com


Introduction

In an unprecedented milestone for the global AI industry, Knowledge Atlas Technology Joint Stock Co., Ltd. (HKEX: 2513) — branded internationally as z.ai and domestically as Zhipu AI — became the world’s first pure-play foundation model developer to go public on January 8, 2026. With a $6.6 billion IPO valuation and a market cap exceeding $19 billion by mid-February, the company has emerged as a cornerstone of China’s “New Quality Productive Forces” initiative and a critical player in the race toward Artificial General Intelligence (AGI).

Zhipu AI’s flagship contribution is the GLM (General Language Model) series, a family of large language models distinguished by a unique blank-filling training objective and 2D positional encoding — architectural innovations that differentiate it from both GPT-style decoders (e.g., OpenAI) and encoder-decoder frameworks (e.g., T5). Its GLM-4.7 model outperforms GPT-4o and Claude 4 Sonnet on SWE-bench Verified, while the upcoming GLM-5 promises 745B parameters and deep multi-step reasoning.

This deep dive explores the company’s historical roots at Tsinghua University, its model-driven business model, its aggressive hardware sovereignty strategy in the face of U.S. sanctions, and its positioning in one of the world’s most dynamic AI ecosystems. We analyze its financial trajectory, competitive landscape, regulatory headwinds, and the investor frenzy that followed its landmark IPO — providing a comprehensive framework for understanding Zhipu AI’s present impact and future potential.


Historical Background

Founding and Academic Genesis (2019–2021)

Zhipu AI traces its lineage to the Knowledge Engineering Group (KEG) at Tsinghua University. In 2019, Professor Tang Jie and Professor Li Juanzi — leaders in natural language processing and knowledge representation — spun off a research project aimed at closing the performance gap between Chinese and English models in large-scale pre-training. Their core hypothesis: standard GPT-style causal decoding suffered from token-level bias against Chinese, a language with dense meaning-per-character and complex semantics.

The solution was the General Language Model (GLM) architecture, introduced in 2021. Unlike BERT (encoder-only) or GPT (decoder-only), GLM used an autoregressive blank-infilling objective: it masked contiguous spans of tokens and reconstructed them sequentially, using 2D positional embeddings to distinguish between the input and generation phases. This unified architecture delivered strong performance on both natural language understanding (NLU) and generation (NLG), laying the foundation for future dominance.
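To make the objective concrete, here is a minimal toy sketch of GLM-style blank infilling and the accompanying 2D position ids, written from the published description of the architecture. It is illustrative only, not Zhipu's training code; the token names and the helper function are our own.

```python
# Toy sketch of GLM-style autoregressive blank infilling (illustrative only,
# not Zhipu's training code). Spans are blanked out of the input ("Part A")
# and regenerated autoregressively ("Part B") with 2D positional ids.
import random

MASK, START, END = "[MASK]", "[S]", "[E]"

def build_glm_example(tokens, spans):
    """tokens: list of str; spans: list of (start, length) to blank out."""
    # Part A: the corrupted input, with each span collapsed to a single [MASK].
    part_a, removed, i = [], [], 0
    for start, length in sorted(spans):
        part_a.extend(tokens[i:start])
        part_a.append(MASK)
        removed.append((len(part_a) - 1, tokens[start:start + length]))
        i = start + length
    part_a.extend(tokens[i:])

    # Part B: the blanked spans, each prefixed by [S] and generated left to right.
    random.shuffle(removed)  # GLM permutes the span order
    part_b, pos1, pos2 = [], [], []
    for mask_pos, span in removed:
        piece = [START] + span + [END]
        part_b.extend(piece)
        pos1.extend([mask_pos] * len(piece))   # dim 1: index of the [MASK] in Part A
        pos2.extend(range(1, len(piece) + 1))  # dim 2: position inside the span

    # Part A tokens use their own index for dim 1 and 0 for dim 2.
    position_ids = [list(range(len(part_a))) + pos1,
                    [0] * len(part_a) + pos2]
    return part_a + part_b, position_ids

seq, pos = build_glm_example(
    "the GLM objective blanks out spans of text".split(), spans=[(3, 2)])
print(seq)  # Part A with one [MASK], followed by [S] blanks out [E]
print(pos)  # the 2D positional ids that separate input from generation
```

In the full model, Part A attends bidirectionally while Part B attends causally, which is what lets a single architecture handle both understanding and generation.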

The first open-source milestone came in August 2022: GLM-130B, a bilingual (Chinese/English) 130B-parameter model trained on 400B tokens. Released under an MIT-style license (albeit with early usage restrictions), GLM-130B became a popular choice for Chinese researchers and developers seeking an alternative to GPT-3.

Commercialization and the Rise of the AI Tigers (2023–2024)

In 2023, Zhipu AI launched ChatGLM-6B, a compact, GPU-friendly variant optimized for consumer hardware. Its Apache 2.0 license and 6GB VRAM requirement democratized large-model development across China, catalyzing an ecosystem of startups, governments, and enterprises building on top of its APIs and frameworks.
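As a concrete illustration of that GPU-friendly claim, here is a minimal local-inference sketch in the style of the open-source ChatGLM-6B examples; the checkpoint id and the chat() helper follow that release and may differ across versions.

```python
# Minimal local-inference sketch following the open-source ChatGLM-6B examples;
# the checkpoint id and the .chat() convenience method come from that release
# and may vary between versions.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
model = model.eval()

# Single-turn chat; pass the running history back in for multi-turn use.
response, history = model.chat(tokenizer, "Summarize the GLM architecture in one sentence.", history=[])
print(response)
```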

The financial and strategic inflection point arrived in mid-2023: Zhipu raised RMB 2.5 billion (US$342M) in Series B funding led by Meituan, Alibaba, and Tencent, three of China’s largest internet platforms. This round cemented Zhipu’s status as the oldest of China’s “Six AI Tigers,” positioning it to compete directly with Baidu (ERNIE) and Alibaba (Qwen) in the enterprise B2B market.

The Sovereign AI Pivot and IPO (2025–2026)

The U.S. Department of Commerce’s January 2025 addition of Zhipu AI to the Entity List marked a turning point. Cut off from NVIDIA H100/H200 chips, the company accelerated its “sovereign AI” strategy — retraining flagship models like GLM-Image and GLM-4.6 entirely on Chinese hardware (Huawei Ascend 910C, Cambricon MLU, Moore Threads MTT S800).

This operational pivot paid off: by December 2025, Zhipu had filed for an IPO on the Hong Kong Stock Exchange. On January 8, 2026, it listed at an IPO price of HK$116.20, raising $558 million in the largest AI foundation model IPO to date. Post-IPO, the stock surged roughly 173% in one month, reaching HK$317.80 by February 10, driven by a combination of retail enthusiasm, cornerstone investor backing, and a JPMorgan “Overweight” rating with a HK$400 price target.


Business Model

Zhipu AI operates a Model-as-a-Service (MaaS) business model, targeting enterprise and developer markets with a tiered monetization strategy.

Revenue Streams

  • Enterprise B2B (≈95% of 2024 revenue):
    • On-prem/Privatized Cloud: High-margin deployments for state-owned enterprises (SOEs), government agencies, and financial institutions. 2024 revenue: RMB 263.7M (84.5% of total).
    • API & SDK Licenses: Per-call or annual enterprise API access; 30-fold YoY growth in 2024.
  • Consumer B2C (≈5% of 2024 revenue):
    • Zhipu Qingyan App: Free chatbot with optional premium features.
    • GLM Coding Plan: $3/month subscription for developers; 150,000+ users by Q1 2026 (see the run-rate check after this list).
    • Developer Tools: MIT-licensed model weights, AutoGLM agent framework.
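For scale, a quick run-rate check on the consumer subscription figures cited above, using only the subscriber count and price; churn, discounts, and FX are ignored.

```python
# Rough annualized run-rate for the GLM Coding Plan using only the figures
# cited above; actual revenue will differ with churn, discounts, and FX.
subscribers = 150_000
price_usd_per_month = 3
annualized_usd = subscribers * price_usd_per_month * 12
print(f"~${annualized_usd / 1e6:.1f}M annualized")  # ~$5.4M
```

At roughly $5M a year, the coding plan functions as a developer-adoption lever rather than a material revenue line, consistent with B2C's ≈5% share of revenue.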

Pricing and Unit Economics

  • Gross Margins (2024): 56% overall — 80%+ for on-premise deployments versus roughly 0–5% for the public API, where compute is effectively subsidized to drive adoption.
  • Burn Efficiency: About 70% of R&D spend (roughly RMB 1.55B of the RMB 2.2B total in 2024) went to compute and cloud infrastructure. Zhipu’s edge lies in algorithmic efficiency: its MoE models (e.g., GLM-4.7: 355B total parameters, 32B active) achieve high accuracy with far fewer active parameters per token, reducing inference costs (see the back-of-envelope sketch below).
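A back-of-envelope sketch of why the sparse-activation figures matter, using only the parameter counts quoted above; the "~2 FLOPs per parameter per token" rule of thumb and the 8-bit weight size are our assumptions, not published specs.

```python
# Back-of-envelope MoE arithmetic for the GLM-4.7 figures quoted above
# (355B total / 32B active). The 2-FLOPs-per-parameter rule of thumb and the
# 8-bit weight size are our assumptions, not published specs.
total_params = 355e9
active_params = 32e9

print(f"active share per token: {active_params / total_params:.1%}")  # ~9%
print(f"weight memory at 8-bit: ~{total_params / 1e9:.0f} GB")        # all experts stay resident
print(f"per-token compute: ~{2 * active_params / 1e9:.0f} GFLOPs "
      f"vs ~{2 * total_params / 1e9:.0f} GFLOPs for a dense 355B model")
```

Compute scales with the roughly 9% of weights that are active per token, while memory and interconnect still carry the full expert set, so the cost advantage shows up mainly at inference time.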

Go-to-Market Strategy

Zhipu employs a “dual-track” GTM approach:

  1. Enterprise “Top-Down”: Direct sales teams embedded with SOEs and provincial governments; contracts often bundled with hardware (Ascend servers) and support services.
  2. Developer “Bottom-Up”: Open-source models, aggressive API pricing, and integration with popular dev tools (Cursor, Cline, VS Code) to drive organic adoption.

Stock Performance Overview

| Period                  | Stock Price (HKD) | Change vs. IPO | Market Cap (HKD) |
|-------------------------|-------------------|----------------|------------------|
| IPO Price (Jan 8, 2026) | HK$116.20         |                | HK$57.89B        |
| First Close             | HK$131.50         | +13.2%         | HK$74.12B        |
| Jan 16 Peak (Interim)   | HK$202.40         | +74.3%         | HK$110.06B       |
| Feb 9 ATH               | HK$287.80         | +147.7%        | ~HK$135.6B       |
| Feb 10 Close            | HK$317.80         | +173.5%        | ~HK$150.1B       |

  • Retail Demand: IPO oversubscribed 1,159x; 20% allocation to retail.
  • Institutional Backing: Cornerstone investors included Taikang Life, JSC International, and GF Fund.
  • Benchmark Comparison: Outperformed the Hang Seng Tech Index (HSTECH), which fell ~1.7% in the same period.

Financial Performance

| Metric (RMB Millions) | FY2022 | FY2023       | FY2024    | H1 2025           |
|-----------------------|--------|--------------|-----------|-------------------|
| Total Revenue         | 57.4   | 119.2 (est.) | 312.4     | 190.9             |
| YoY Revenue Growth    |        | ~108%        | ~162%     | 325% (vs H1 2024) |
| Gross Margin          | ~48%   | ~52%         | 56%       | 51.5%             |
| Net Loss              | (97.0) | (580.0)      | (2,470.0) | (2,360.0)         |
| R&D Spend             | 84.0   | 410.0        | 2,200.0   | 1,590.0           |
| Cash & Equivalents    | ~400   | ~1,200       | 2,740.0   | 2,550.0           |
| Valuation (Pre-IPO)   | $1.0B  | $2.8B        | $4.0B     | $6.6B (IPO)       |

Key Insights

  • R&D Intensity: R&D spending equaled 705% of total 2024 revenue, with 70% allocated to compute infrastructure.
  • Runway: Pre-IPO, Zhipu had ~8–10 months of runway at a burn rate of roughly RMB 300M/month (see the quick check after this list).
  • Use of IPO Proceeds: 70% to R&D (GLM-5 and beyond), 10% to MaaS optimization, 10% to global expansion.
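A quick check of the pre-IPO runway figure from the cash and burn numbers above; figures are in RMB millions and taken directly from the financial table.

```python
# Sanity check of the pre-IPO runway cited above (figures in RMB millions).
cash_pre_ipo = 2_740   # FY2024 cash & equivalents from the table
monthly_burn = 300     # stated pre-IPO burn rate
print(f"implied runway: ~{cash_pre_ipo / monthly_burn:.1f} months")  # ~9.1, inside the ~8-10 month range
```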

Leadership and Management

Executive Team

  • CEO & Executive Director: Dr. Zhang Peng — Tsinghua PhD, former KEG researcher. Known for rational, research-first leadership and a focus on AGI as the ultimate goal.
  • Co-founder & Non-exec Director: Prof. Li Juanzi — Professor at Tsinghua, continues to lead foundational research through the KEG Lab.
  • Chairman & Co-founder: Dr. Liu Debing — Former Technicolor (China) executive; oversees state-level alignment and corporate governance.
  • Chief Scientist: Prof. Tang Jie — Architect of the GLM design; now focuses on long-term model roadmap and AGI theory.

Board Composition (2026)

| Name       | Role                     | Background                                               |
|------------|--------------------------|----------------------------------------------------------|
| Liu Debing | Chairman & Exec Dir      | Co-founder, Tsinghua engineer                            |
| Zhang Peng | Exec Dir                 | CEO, former KEG researcher                               |
| Li Juanzi  | Non-exec Dir             | Co-founder, Tsinghua Professor                           |
| Yang Qiang | Independent Non-exec Dir | HKUST AI expert (Transfer Learning, Federated Learning)  |
| Xie Deren  | Independent Non-exec Dir | Tsinghua Accounting Professor                            |
| Li Jiaqing | Non-exec Dir             | Legend Capital representative                            |

Governance and Strategy

Zhipu AI is widely recognized as a “national champion” aligned with China’s 15th Five-Year Plan and “New Quality Productive Forces” initiatives. Its governance emphasizes compliance (CAC, MIIT, CSRC), data security (PIPL), and hardware sovereignty (Ascend, Cambricon). The leadership has publicly emphasized “cognitive supremacy” over raw scale, positioning Zhipu’s path to AGI as algorithmic — not just computational — advancement.


Products, Services, and Innovations

The GLM Model Series: Evolution and Capabilities

| Model         | Release  | Parameters | Context | License     | Key Innovation                                 |
|---------------|----------|------------|---------|-------------|------------------------------------------------|
| GLM-1         | 2021     | 10B        | 1K      | Academic    | Blank-filling objective, 2D position encoding  |
| GLM-130B      | Aug 2022 | 130B       | 2K      | MIT         | First bilingual (ZH/EN) model; open-source     |
| ChatGLM-6B    | Mar 2023 | 6.2B       | 2K      | Apache 2.0  | GPU-friendly for local inference               |
| GLM-4         | Jan 2024 | ~100B+     | 128K    | Proprietary | “All Tools” (web, Python, image gen)           |
| GLM-4.5       | Jul 2025 | 355B (MoE) | 128K    | MIT         | “Thinking Mode” hybrid reasoning               |
| GLM-4.7       | Dec 2025 | 400B (MoE) | 200K    | MIT         | SOTA on SWE-bench, coding, math                |
| GLM-4.7-Flash | Jan 2026 | 31B (MoE)  | 128K    | MIT         | Runs on consumer GPUs (RTX 3090)               |
| GLM-5         | Feb 2026 | 745B (MoE) | 256K+   | Anticipated | DSA (Deep Reasoning Architecture), AGI Stage 1 |
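Rough weight-memory arithmetic behind the "runs on consumer GPUs" entry for GLM-4.7-Flash in the table above; the quantization levels and the omission of KV-cache and activation overhead are our simplifications, not published specs.

```python
# Rough weight-memory arithmetic for a 31B-parameter checkpoint (GLM-4.7-Flash
# row above). Quantization levels and ignoring KV-cache/activation overhead
# are our simplifications, not published specs.
params = 31e9
for bits in (16, 8, 4):
    print(f"{bits:>2}-bit weights: ~{params * bits / 8 / 1e9:.0f} GB")
# 16-bit ~62 GB, 8-bit ~31 GB, 4-bit ~16 GB -> only the 4-bit variant fits
# comfortably in an RTX 3090's 24 GB alongside a modest KV cache.
```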

z.ai Platform (Global Brand, 2025–2026)

  • Bigmodel.cn: API platform; 2.7 million paying developers and 12,000+ enterprise clients.
  • Zhipu Qingyan: Consumer app with video calling and multimodal input.
  • AutoGLM: First mobile agent capable of navigating app UIs (e.g., WeChat, Didi, Meituan) to execute multi-step tasks; the sketch after this list illustrates the general control loop.
  • GLM-Image: First SOTA image generation model trained solely on Huawei Ascend 910C chips.
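For readers unfamiliar with UI agents, the sketch below shows the generic observe-plan-act loop that a system like AutoGLM implements. The function names, the UIAction type, and the planner interface are hypothetical stand-ins, not Zhipu's API.

```python
# Hypothetical observe-plan-act loop of the kind a mobile UI agent such as
# AutoGLM implements. UIAction, screen_reader, planner, and executor are
# illustrative stand-ins, not Zhipu's actual interfaces.
from dataclasses import dataclass

@dataclass
class UIAction:
    kind: str          # "tap", "type", "scroll", or "done"
    target: str = ""   # UI element to act on
    text: str = ""     # text to enter, if any

def run_agent(goal, screen_reader, planner, executor, max_steps=20):
    """Drive an app UI toward `goal`, one model-chosen action at a time."""
    history = []
    for _ in range(max_steps):
        screen = screen_reader()                 # dump of the current UI state
        action = planner(goal, screen, history)  # LLM picks the next UIAction
        if action.kind == "done":
            break
        executor(action)                         # tap/type/scroll on the device
        history.append((screen, action))
    return history
```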

Intellectual Property and R&D

  • Over 300 patents filed in China (as of Q4 2025), covering 2D positional encoding, blank-filling training, and MoE routing.
  • 70% of funding post-IPO dedicated to Frontier AGI Research, with emphasis on multi-turn agentic reasoning and self-supervised self-critique.

Competitive Landscape

| Company         | Model Series  | Strength                                                                   | Weakness                                 |
|-----------------|---------------|----------------------------------------------------------------------------|------------------------------------------|
| Zhipu AI (z.ai) | GLM           | MoE efficiency, hardware sovereignty, MIT licensing, SOTA coding (GLM-4.7) | Low B2C conversion, high compute costs   |
| Baidu           | ERNIE 4.5/5.0 | Search + knowledge graph integration, deep Chinese idiomatic fluency       | Slower inference, weaker tool use        |
| Alibaba         | Qwen 3/3.5    | Massive multilingual coverage (119+), high-throughput 1M+ context          | Less focus on agentic workflows          |
| DeepSeek        | V3/R1         | Aggressive pricing, strong math (AIME), venture backing                    | Less enterprise deployment, unprofitable |
| Tencent         | HunYuan       | Enterprise + gaming ecosystem integration                                  | Limited transparency, proprietary stack  |

Market Position

  • China Market Share (IDC, 2024): ~18% — ranked #3 (after Baidu and Alibaba).
  • Global LLM Positioning: Ranks among the top 10 foundation model families on both open-weight and closed-model performance comparisons (per the Hugging Face Leaderboard).
  • Unique Edge: Only model family trained entirely on Chinese hardware at SOTA scale (GLM-4.7, GLM-5).

Industry and Market Trends

  • New Quality Productive Forces: China’s national policy prioritizes AI that boosts industrial efficiency — Zhipu’s SOE and manufacturing deployments align perfectly.
  • Model Compression & Edge Deployment: Zhipu’s GLM-4.7-Flash targets 2026 consumer hardware; Samsung Galaxy S25 (China) includes Zhipu’s edge model.
  • Global South Expansion: Zhipu leads the “Alliance for Independent Large Model Co-construction” with ASEAN and Belt & Road nations.
  • MoE Dominance: Most 2025–2026 releases (GLM-4.5+, Qwen 3.5, ERNIE 5.0) use MoE — Zhipu’s first-mover advantage in MoE training on Ascend chips is critical.
  • Compute Price War: DeepSeek’s aggressive API pricing (Q4 2025) pressured Zhipu’s public cloud margins, driving Zhipu to double down on high-margin enterprise contracts.

Risks and Challenges

  • U.S. Entity List (Jan 2025): Bans Zhipu from NVIDIA H100/H200 and U.S. cloud inference; forces reliance on lower-efficiency domestic chips.
  • Profitability Lag: Net loss of RMB 2.47B in 2024; R&D burn remains >700% of revenue. Path to breakeven is 2027–2028.
  • Geopolitical Decoupling: Limited ability to deploy GLM-5 in U.S./EU markets; restricted model export under China’s Export Control Law.
  • Regulatory Scrutiny (China): CAC-mandated security assessments for every model update; PIPL compliance for user data.
  • Valuation Volatility: Current P/S of 150x (2024) and 39x (2025E) leaves stock vulnerable to earnings disappointment.
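A rough reproduction of the headline multiple cited above, assuming it is struck on the $6.6B IPO valuation at roughly 7.1 CNY per USD; both assumptions are ours, not disclosed inputs.

```python
# Rough check of the 150x P/S figure cited above, assuming the multiple is
# taken on the $6.6B IPO valuation at ~7.1 CNY/USD; both assumptions are ours.
ipo_valuation_rmb_m = 6_600 * 7.1   # ≈ RMB 46,860M
revenue_2024_rmb_m = 312.4
print(f"P/S on FY2024 revenue: ~{ipo_valuation_rmb_m / revenue_2024_rmb_m:.0f}x")  # ~150x
```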

Opportunities and Catalysts

  • GLM-5 Launch (Feb 2026): Anticipated to rival GPT-5 in AGI-stage reasoning — potential catalyst for 30–50% stock re-rating.
  • SOE Procurement Mandates: 70% of government AI spending must use “First Batch” domestic models — Zhipu holds the largest share.
  • Hardware Partnerships: Huawei (Ascend), Cambricon, and Moore Threads offer subsidized compute vouchers; Zhipu receives MIIT “AI Tiger” subsidies covering ~30% of power costs.
  • Global Developer Adoption: MIT licensing and open weights accelerate integrations in OpenRouter, Hugging Face, and ASEAN cloud providers.
  • Runway Extension: IPO proceeds extend runway to >36 months; capital allows aggressive R&D without secondary dilution.

Investor Sentiment and Analyst Coverage

Analyst Ratings (Post-IPO, as of Feb 10, 2026)

| Firm                 | Rating     | Price Target (HKD) | Note                                                  |
|----------------------|------------|--------------------|-------------------------------------------------------|
| JPMorgan             | Overweight | 400                | “Top pick for global AI value creation”               |
| Goldman Sachs (Asia) | Buy        | 42.50              | “Proprietary Knowledge Graph LLM” advantage           |
| Morgan Stanley       | Overweight | 38.00              | Enterprise integration in the Greater Bay Area (GBA)  |
| HSBC Global Research | Hold       | 31.00              | Compute cost concerns                                 |

Institutional & Retail Activity

  • Cornerstone Investors (5.8 months lock-up): Taikang Life, JSC International, GF Fund (~68.6% of offering).
  • Hedge Funds: 3W Fund (3.8% long), WT Asset Management (added 1.2M shares Jan 2026).
  • Retail Sentiment: 1,159x oversubscription; StockStreet and LittleWhitePanda bullish, though caution noted at HK$36 resistance.

Finterra-Style Metrics (Est.)

| Metric             | Value                 |
|--------------------|-----------------------|
| Implied FY2026 P/E | 33.5x                 |
| P/S (2025E)        | 9.2x                  |
| EV/EBITDA          | 24.5x                 |
| Implied EPS (FY26) | HK$1.12               |
| Cash Runway        | >36 months (post-IPO) |

Regulatory, Policy, and Geopolitical Factors

  • U.S. Entity List (Jan 2025): Blocked H100/H200 access; forced domestic chip migration (Ascend 910C).
  • China CAC Regulations: GenAI Service Measures (2023) and TC260-003 (2024) mandate model registration, human-in-the-loop safety testing, and keyword filtering.
  • Export Control Law (2025): Model weights classified as “restricted exports” — GLM-5 can only be hosted on Chinese mainland or Hong Kong servers.
  • Cross-Border Data Flow (2025 Updates): Tightened for model weights; Zhipu uses a “Hong Kong Gateway” to host APIs while core compute remains in mainland China.
  • Policy Dividends:
    • “AI Tiger” Support (MIIT): Grants cover ~30% of compute costs.
    • East Data West Compute (东数西算): Zhipu’s clusters in Gansu/Guizhou use cheap hydroelectric power.

Conclusion

Knowledge Atlas (HKEX: 2513) is not merely a stock — it is a national infrastructure play. Its GLM models represent a rare case where algorithmic innovation (blank-filling, 2D positional encoding) translated directly into market leadership and operational sovereignty. The company has turned U.S. sanctions into a catalyst for domestic silicon adoption, and its focus on MoE efficiency positions it well for a future where compute scarcity — not abundance — defines competitive advantage.

Investors face a binary narrative: either Zhipu’s high burn and valuation will be justified by GLM-5’s AGI breakthrough and SOE dominance, or the stock will correct toward more traditional SaaS multiples in a maturing AI market. Key watchpoints for the next 90 days include:

  • GLM-5 performance benchmarks (C-Eval, AIME, SWE-bench)
  • Enterprise renewal rates and avg. contract value (ACV) growth
  • MIIT subsidies and Ascend chip yield improvements

At its current price, Zhipu offers explosive upside if AGI milestones are hit — but substantial risk if hardware bottlenecks or regulatory shifts slow execution. For investors with a multi-year horizon and high-risk tolerance, the company remains a compelling, high-conviction proxy for the global AI arms race — one that may well define the next decade of tech leadership.


This article is for informational purposes only and is not financial advice. Finterra.com does not hold positions in any securities mentioned. Data as of February 10, 2026.
