All the funding deals in the AI chip market
Download our beautiful pitch about the AI chip market

In our AI chip market deck, you will find everything you need to understand the market
This page lists the most important fundraising deals in the AI chip market, in one simple place.
We found 48 funding deals for the AI chip market.
We refresh this AI chip market page every quarter, so the list stays up to date.
And if you want to understand this fast-moving industry in more depth, you should get our beautiful slides covering the AI chip market.
Insights
- Recent AI chip market rounds are very large: several deals are above $500M, which is rare in most hardware markets and signals heavy capex needs.
- The AI chip market shows a clear split: GPU-focused players raise “scale” rounds, while inference-ASIC startups often raise smaller but faster cycles.
- China-focused AI chip market companies appear repeatedly with very large rounds, reflecting a push to build domestic data-center GPU supply.
- In the AI chip market, “inference” is now a core fundraising story: many newer deals mention LLM inference speed, latency, and serving cost.
- Photonics is a recurring AI chip market theme: multiple companies raise capital to use light for compute or interconnects in data centers.
- The AI chip market has many repeat fundraisers: the same company can show up across 2–3 different years as products move from R&D to production.
- Several AI chip market rounds mention “mass production” or “deployment,” which usually signals a shift from prototypes to real customer rollouts.
- Geography matters in the AI chip market: the list includes major activity across the US, China, Europe, South Korea, and Israel.

In our AI chip market deck, we show you long-term trends so you can make better decisions
Summary table of the funding deals for the AI chip market since 2022
We define the AI chip market as data-center accelerators whose primary purpose is to run AI workloads (training and inference).
We include GPUs, TPUs, and other AI accelerators/ASICs sold for deployment in servers used to train or serve machine-learning models.
We exclude general-purpose CPUs, networking and memory components, and endpoint/edge chips in phones, PCs, cars, and IoT devices.
We focus exclusively on pure players, defined as companies with roughly 70–80% or more of their business directly tied to the AI chip market.
Our analysis is global in scope, with a minimum funding threshold of $300k per deal.
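The inclusion criteria above boil down to a simple filter: global scope, pure players, and a $300k minimum round size. The sketch below is illustrative only; the `Deal` fields and the exact 70% cutoff are assumptions based on the description above, not our actual pipeline.

```python
from dataclasses import dataclass

# Illustrative sketch of the inclusion criteria described above.
# The Deal fields and the 70% cutoff are assumptions, not our real tooling.

MIN_AMOUNT_USD = 300_000        # $300k minimum funding threshold
MIN_PURE_PLAYER_SHARE = 0.70    # lower bound of the 70-80% "pure player" rule

@dataclass
class Deal:
    company: str
    amount_usd: float              # round size in US dollars
    ai_chip_business_share: float  # fraction of business tied to AI chips

def qualifies(deal: Deal) -> bool:
    """True if a deal meets the inclusion criteria for this list."""
    return (deal.amount_usd >= MIN_AMOUNT_USD
            and deal.ai_chip_business_share >= MIN_PURE_PLAYER_SHARE)
```

In practice this is why CPU vendors and diversified semiconductor giants do not appear in the table: even when they sell AI accelerators, AI chips are not 70%+ of their business.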
You can also read our detailed analysis to understand how funding activity in the AI chip market has evolved recently.
If you want a longer-term view, we also have a study of how funding activity in the AI chip market has changed over the years.
Also, you should know that we have a dedicated page, updated weekly, with all the latest fundraising deals in the AI chip market.
| Name | What they do | Amount in $ | Quarter | Source(s) |
|---|---|---|---|---|
| Etched | Builds transformer-only ASICs for faster LLM inference in data centers. | ~$500M | Q4 2025/Q1 2026 | Bloomberg, Yahoo Finance, SiliconANGLE |
| Mythic | Builds analog compute-in-memory chips for efficient AI inference. | $125M | Q4 2025 | Bloomberg, Mythic, SiliconANGLE |
| Moore Threads Technology | Designs domestic GPUs for AI training and inference in China. | $1,100M | Q4 2025 | CNBC, Yahoo Finance, 36Kr |
| Biren Technology - HK IPO | Builds data-center GPUs for AI training and inference. | $717M | Q4 2025 | Yahoo Finance, Startup News, Asia Tech Daily |
| MetaX Integrated Circuits | Builds GPUs for AI training and inference in data centers. | $596M | Q4 2025 | CNBC, Yahoo Finance, Wikipedia |
| Iluvatar CoreX Semiconductor | Designs cloud GPUs for AI training and inference. | $475M | Q4 2025 | Caixin Global, Tracxn, CapRoasia |
| Tsing Micro Intelligent Technology | Builds reconfigurable AI accelerators using a CGRA architecture. | $283M | Q4 2025 | Caixin Global, STCN, QbitAI |
| d-Matrix | Builds digital in-memory compute chips for LLM inference. | $275M | Q4 2025 | PR Newswire, Bloomberg, Data Center Dynamics |
| Rebellions | Builds AI inference NPUs for cloud data centers. | $253M | Q3-Q4 2025 | Rebellions, Data Center Dynamics, Korea Tech Desk |
| Groq | Builds LPUs for very fast AI inference in data centers. | $750M | Q3 2025 | Groq, Bloomberg, TechCrunch |
| Cerebras Systems | Builds wafer-scale processors for AI training and inference. | $1,100M | Q3 2025 | Business Wire, TechCrunch, Cerebras |
| FuriosaAI | Builds AI inference accelerators for data-center workloads. | $125M | Q3 2025 | FuriosaAI, Business Wire, Korea Tech Desk |
| VSORA | Builds high-performance chips for generative AI inference. | $46M | Q2 2025 | VSORA, GlobeNewswire |
| Arago | Builds photonic AI accelerators for data-center compute. | $26M | Q3 2025 | EE Times, Tech Funding News, Semi Engineering |
| Rain AI | Builds neuromorphic-style AI chips using digital in-memory compute. | $8.1M | Q1 2025 | Crunchbase, NeuromorphicCore |
| Biren Technology - Pre-IPO | Raised pre-IPO funding to scale data-center GPU development. | $207M | Q2 2025 | Yahoo Finance, Asia Tech Daily |
| Tenstorrent | Builds AI accelerators and RISC-V CPU IP for data centers. | $693M | Q4 2024 (Dec 2) | PR Newswire, TechCrunch |
| Moore Threads | Builds GPUs for AI training, inference, and HPC in data centers. | $736M | Q4 2024 (Nov-Dec) | 36Kr, SCMP |
| MatX | Builds AI chips optimized for large language models in data centers. | $80M | Q4 2024 (Nov 22) | TechCrunch, SiliconANGLE |
| HyperAccel | Builds LLM inference chips to reduce serving costs in data centers. | $40M | Q4 2024 (Dec) | SemiEngineering, Semiconductors Insight |
| Groq | Builds LPUs for low-latency LLM inference at scale. | $640M | Q3 2024 (Aug 5) | PR Newswire, Axios |
| Rebellions | Raised strategic funding to expand its AI chip business overseas. | $15M | Q3 2024 (Jul) | US News |
| Iluvatar CoreX | Builds GPUs for AI workloads in Chinese cloud data centers. | Undisclosed | Q3 2024 (Sep 18) | Contxto, Tracxn |
| Axelera AI | Builds in-memory computing AI chips for data-center inference and HPC. | $68M | Q2 2024 (Jun 27) | Axelera AI, BusinessWire, DataCenterDynamics |
| Etched | Builds a transformer-only chip for high-speed LLM inference. | $120M | Q2 2024 (Jun 25) | TechCrunch, CNBC, Crunchbase News |
| NeuReality | Builds AI inference systems, including a data-center NAPU chip. | $20M | Q1 2024 (Mar 19) | BusinessWire, EE Times, Calcalist |
| Rebellions | Builds NPUs for AI inference in cloud data centers. | $124M | Q1 2024 (Jan 30) | TechCrunch, Bloomberg |
| Biren Technology | Builds data-center GPUs for AI training and inference in China. | $280M | Q4 2023 | Bloomberg, Tom's Hardware |
| Lightmatter | Builds photonic AI chips and interconnects for data centers. | $155M | Q4 2023 | BusinessWire, insideHPC |
| EnCharge AI | Builds analog in-memory chips for efficient AI inference. | $22.6M | Q4 2023 | TechCrunch, PRNewswire |
| Neurophos | Builds optical inference chips to reduce AI power use. | $7.2M | Q4 2023 | PRWeb, GeekWire, Silicon Catalyst |
| Enflame Technology | Builds GPUs for cloud AI training and inference in China. | $274M | Q3 2023 | Tracxn, CoinSpeaker |
| d-Matrix | Builds chiplets for efficient generative AI inference. | $110M | Q3 2023 | Crunchbase, insideHPC, d-Matrix |
| Sapeon | Builds NPUs to accelerate AI inference in data centers. | $45M | Q3 2023 | SemiEngineering |
| Lightmatter | Builds photonic hardware to speed up AI workloads. | $154M | Q2 2023 | TechCrunch, BusinessWire |
| Etched | Builds custom inference ASICs focused on transformer models. | $5.4M | Q1 2023 | Wikipedia, Primary VC |
| Moore Threads | Builds GPUs for AI training and cloud computing in data centers. | $215.4M | Q4 2022 | Tom's Hardware, DigiTimes, 36Kr |
| Axelera AI | Builds in-memory AI accelerators for inference and computer vision. | $27M | Q4 2022 | Axelera AI, Tech.eu |
| NEUCHIPS | Builds ASICs for recommendation inference in data centers. | $20M | Q4 2022 | GlobeNewswire, NEUCHIPS |
| Tenstorrent | Builds AI processors and RISC-V IP for training and inference. | $30M | Q3 2022 | Tracxn, StartupHub.ai |
| MetaX | Builds high-performance GPUs for AI training and inference. | $149.2M | Q2-Q3 2022 | EqualOcean, SemiEngineering, MetaX |
| Iluvatar CoreX | Builds general-purpose GPUs for AI training and inference. | $148.8M | Q2-Q3 2022 | SemiEngineering, Tracxn, Wikipedia |
| Luminous Computing | Builds photonics-based AI supercomputers for hyperscale data centers. | $105M | Q1 2022 | BusinessWire, SiliconANGLE, VentureBeat |
| Rebellions (Series A) | Builds domain-specific processors for AI inference in data centers. | $50M | Q2 2022 | TechCrunch, SemiEngineering |
| d-Matrix | Builds in-memory computing chips for transformer inference. | $44M | Q2 2022 | BusinessWire, SiliconANGLE, The Register |
| Rebellions (Series A Extension) | Raised funding to mass-produce AI chips for large models. | $22.8M | Q2-Q3 2022 | TechCrunch |
| Rain Neuromorphics | Builds analog neuromorphic AI chips using memristors. | $25M | Q1 2022 | Design & Reuse, EE Times, Euronews |
| EdgeCortix | Builds AI inference chips for edge and server deployments. | $8M | Q1 2022 | EdgeCortix, PR Newswire |
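To illustrate the repeat-fundraiser point from the insights above, here is a minimal sketch of how you could total rounds per company from the table, using a handful of rows copied from it (amounts in $M; Etched's latest round is approximate, per the table).

```python
from collections import defaultdict

# A few rows copied from the table above: (company, amount in $M, quarter).
# Etched's ~$500M round is approximate, as noted in the table.
deals = [
    ("Etched", 500.0, "Q4 2025/Q1 2026"),
    ("Etched", 120.0, "Q2 2024"),
    ("Etched", 5.4, "Q1 2023"),
    ("Groq", 750.0, "Q3 2025"),
    ("Groq", 640.0, "Q3 2024"),
]

# Sum amounts per company to surface repeat fundraisers.
totals = defaultdict(float)
rounds = defaultdict(int)
for company, amount_m, _quarter in deals:
    totals[company] += amount_m
    rounds[company] += 1

for company in sorted(totals, key=totals.get, reverse=True):
    print(f"{company}: ~${totals[company]:,.1f}M across {rounds[company]} rounds")
```

Run on the full table, this kind of rollup makes the scale gap obvious: repeat GPU fundraisers like Moore Threads and Biren accumulate billions across rounds, while most inference-ASIC startups stay in the tens to hundreds of millions.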

In our AI chip market deck, we will give you useful market maps and grids
Related blog posts
- What is the latest update in the AI chip market?
- What is the latest news in the AI chip market?
- What is the real market size of the AI chip market?
- What is the latest funding news in the AI chip market?
Who is the author of this content?
NEW MARKET PITCH TEAM
We track new markets so founders and investors can move faster. We build living “market pitch” documents for emerging markets: from AI to synthetic biology and new proteins. Instead of digging through outdated PDFs, random blog posts, and hallucinated LLM answers, our clients get a clean, visual, always-updated view of what’s really happening. We map the key players, deals, regulations, metrics and signals that matter so you can decide faster whether a market is worth your time. Want to know more? Check out our about page.
How we created this content 🔎📝
At New Market Pitch, we kept seeing the same problem: when you look at a new market, the data is either missing, paywalled, or buried in 300-page reports that feel like they were written in the 80s. On the other side, LLMs and random blog posts give you confident answers with no sources, and sometimes they just make things up. That’s not good enough when you’re about to invest real money or launch a company.
So we decided to fix the experience. For each market we cover, we build a structured database and update it on a regular basis. We track funding rounds, fund memos, M&A moves, partnerships, new products, policy changes, and the real activity of startups and incumbents. Then we turn all of that into a clear “market pitch” that shows where the opportunities are and how people actually win in that space.
Every key data point is checked, sourced, and put back into context by our team. That’s how we can give you both speed and reliability: fast coverage of new markets, without the usual guesswork.