What is the latest funding news in the AI chip market? (March 2026)
Download our beautiful pitch about the AI chip market

In our AI chip market deck, you will find everything you need to understand the market
The AI chip market is booming, with billions of dollars flowing into startups building the next generation of data-center accelerators.
From wafer-scale processors to photonic inference chips, a wave of challengers is targeting Nvidia's dominance across training and inference workloads.
In just the past few months, twelve companies collectively raised over $3.4 billion to build faster, cheaper, and more energy-efficient silicon for AI data centers.
And if you want to better understand this new industry, you can download our pitch covering the AI chip market.
Insights
- Inference is the dominant theme: 10 of the 12 deals in this cohort are inference-focused companies (only MatX and Cerebras also target training workloads), reflecting the industry-wide shift from training compute to serving compute as the primary cost driver in AI operations.
- Photonic AI chips are having a breakout moment, with three companies (Olix, Neurophos, Arago) raising a combined $356M on the thesis that light-based compute can overcome the power and memory-bandwidth bottlenecks of traditional silicon.
- The median round size in this cohort is $240M, while the mean of roughly $291M is pulled upward by megadeals from Cerebras ($1B) and two $500M rounds, signaling that investors are writing much larger checks as the capital intensity of chip development becomes clearer.
- Sovereign AI is emerging as a distinct customer segment: both Positron (Qatar Investment Authority) and d-Matrix (QIA, EDBI) received strategic investment from state-linked funds, and Mastiska is building explicitly for GCC sovereign AI infrastructure.
- Pre-product companies are commanding serious capital: MatX ($500M, shipping 2027) and Etched.ai ($500M, pre-deployment) together raised $1B without shipping a single chip, illustrating the winner-takes-most dynamic investors expect in accelerator silicon.
- Geographic diversification is accelerating: the UK (Olix, Fractile), South Korea (Rebellions), UAE (Mastiska), and France (Arago) are all fielding competitive chip ventures, challenging the US-centric narrative of AI infrastructure.
- Strategic investors are deeply embedded: Intel Capital (SambaNova), AMD (Cerebras), Arm and Samsung Ventures (Rebellions), and Marvell (MatX) all participated, suggesting incumbents are hedging by backing potential disruptors rather than only competing with them.
- The Series B stage is emerging as the critical inflection point, with MatX ($500M) and Positron ($230M) both raising large Series B rounds to fund the transition from chip design to actual TSMC manufacturing runs.
- Cerebras reached a $23B post-money valuation on its Series H, making it the most valuable pure-play AI accelerator startup in the world and a benchmark against which all other valuations in this cohort are implicitly measured.
- The NATO Innovation Fund's participation in Fractile's round is a notable signal that Western defense and security institutions are treating domestic AI inference silicon as a strategic priority, not just a commercial one.
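As a sanity check, the headline figures above can be recomputed directly from the summary table of disclosed round sizes. A minimal Python sketch (amounts in $M, as listed in this article):

```python
# Disclosed round sizes from the summary table, in $M (March 2026 cohort).
amounts = {
    "MatX": 500, "SambaNova": 350, "Olix": 220, "Positron": 230,
    "Cerebras": 1000, "Neurophos": 110, "Fractile": 22.5, "Etched.ai": 500,
    "d-Matrix": 275, "Rebellions": 250, "Mastiska": 10, "Arago": 26,
}

total = sum(amounts.values())       # 3493.5 -> "over $3.4B" raised by the cohort
mean = total / len(amounts)         # 291.125 -> roughly $291M, pulled up by the megadeals

ordered = sorted(amounts.values())
mid = len(ordered) // 2
# Even count (12 deals): the median averages the two middle values, 230 and 250.
median = (ordered[mid - 1] + ordered[mid]) / 2   # 240.0

# Photonic subset: Olix + Neurophos + Arago.
photonic = amounts["Olix"] + amounts["Neurophos"] + amounts["Arago"]  # 356

print(f"total=${total}M  mean~${mean:.0f}M  median=${median}M  photonic=${photonic}M")
```

Note that it is the mean, not the median, that the Cerebras and $500M megadeals pull upward; the median of the twelve listed amounts works out to $240M.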

In our AI chip market deck, we have collected signals proving this market is hot right now
Summary table of the latest funding deals in the AI chip market as of March 2026
We define the AI chip market as data-center accelerators whose primary purpose is to run AI workloads (training and inference).
We include GPUs, TPUs, and other AI accelerators/ASICs sold for deployment in servers used to train or serve machine-learning models.
We exclude general-purpose CPUs, networking and memory components, and endpoint/edge chips in phones, PCs, cars, and IoT devices.
You can also read our detailed analysis to understand how funding activity in the AI chip market has evolved over the last few years.
We also have a quarter-by-quarter analysis of funding activity in the market here.
Finally, you can check our complete list of fundraising deals for the AI chip market (we update this list every quarter).
| Name | When | Amount in $ | Round Type | Category |
|---|---|---|---|---|
| MatX | February 24, 2026 | $500M | Series B | Data-Center AI Accelerator & LLM Training + Inference |
| SambaNova Systems | February 24, 2026 | $350M | Series E | Inference Accelerator & Agentic AI Systems |
| Olix | February 11, 2026 | $220M | Funding Round | Photonic Inference Accelerator & Data-Center Inference |
| Positron | February 4, 2026 | $230M | Series B | Energy-Efficient Inference Accelerator & LLM Serving |
| Cerebras Systems | February 3, 2026 | $1,000M | Series H | Wafer-Scale AI Accelerator & Training + Inference |
| Neurophos | January 22, 2026 | $110M | Series A | Photonic AI Accelerator & Data-Center Inference |
| Fractile | January 15, 2026 | $22.5M | Early-Stage Financing | LLM Inference Accelerator & Data-Center Hardware |
| Etched.ai | January 19, 2026 | $500M | Funding Round | Transformer Inference ASIC & Data-Center Serving |
| d-Matrix | November 12, 2025 | $275M | Series C | Inference Accelerator Platform & LLM Serving |
| Mastiska | November 27, 2025 | $10M | Seed | Sovereign AI Inference Accelerator & FPGA-Based Hardware |
| Rebellions | September 30, 2025 | $250M | Series C | AI Inference Chip & Data-Center Accelerator |
| Arago | July 8, 2025 | $26M | Seed | Photonic Inference Accelerator & Data-Center Energy Efficiency |
All the latest funding deals in the AI chip market as of March 2026
MatX raised $500M in a Series B round in February 2026 to build a next-generation data-center AI accelerator.
When was it?
The deal was announced on February 24, 2026.
Who are they?
MatX is building a next-generation data-center AI accelerator chip designed to make LLM training and inference dramatically more efficient than current GPU-based solutions.
Geographical focus?
MatX is globally focused, targeting hyperscalers and AI labs, with manufacturing planned through TSMC.
Why do we include them in the AI chip market?
MatX is a pure-play data-center accelerator company, building silicon specifically for LLM training and inference workloads in server environments.
What is the company stage?
MatX is pre-product and pre-revenue, with first chip shipments targeted for 2027.
How much did they raise?
MatX raised $500M in this round.
What round is it?
This was a Series B round.
Why did they raise?
MatX raised to fund TSMC manufacturing costs and support the push toward first chip shipments while scaling the team.

In our AI chip market deck, we help you understand how the market is structured
SambaNova Systems raised $350M in a Series E round in February 2026 to scale its AI inference chip platform.
When was it?
The deal was announced on February 24, 2026.
Who are they?
SambaNova Systems designs AI accelerator chips and full-stack systems, with its SN50 platform optimized for fast inference and agentic AI workloads at data-center scale.
Geographical focus?
SambaNova is globally focused, with notable expansion into enterprise and data-center deployments including Japan via a partnership with SoftBank.
Why do we include them in the AI chip market?
SambaNova builds server-deployed AI inference accelerators, with its products sold directly to enterprises, cloud providers, and data centers running AI workloads.
What is the company stage?
SambaNova is in the growth stage, with its platform already shipping and the SN50 set to become available later in 2026.
How much did they raise?
SambaNova raised $350M in this round.
What round is it?
This was a Series E round.
Why did they raise?
SambaNova raised to expand manufacturing and cloud capacity and to scale distribution of the SN50 chip in collaboration with Intel.
Olix raised $220M in February 2026 to develop its photonic AI chip architecture for data-center inference.
When was it?
The deal was announced on February 11, 2026.
Who are they?
Olix is a UK-based startup building a photonic, inference-focused AI chip architecture designed to run large models faster and cheaper than traditional GPU-heavy stacks, without relying on HBM memory.
Geographical focus?
Olix is UK-based but is positioning itself for the global data-center inference market.
Why do we include them in the AI chip market?
Olix is designing a data-center inference accelerator ASIC, targeting AI service providers and enterprises that run large models in server environments.
What is the company stage?
Olix is at an early, pre-product stage, with first products targeted for release as early as 2027.
How much did they raise?
Olix raised $220M in this round, at a reported valuation of over $1B.
What round is it?
The round type was not clearly labeled in coverage and is reported as a general funding round.
Why did they raise?
Olix raised to accelerate the development of its inference-first chip approach as GPU bottlenecks and power constraints drive demand for alternatives.

In our AI chip market deck, we identify repeatable patterns you can use if you’re building in this market
Positron raised $230M in a Series B round in February 2026 to scale its energy-efficient AI inference accelerators.
When was it?
The deal was announced on February 4, 2026.
Who are they?
Positron builds energy-efficient data-center inference accelerators, with its Atlas product already shipping and a next-generation Asimov chip in development as a GPU alternative for LLM serving.
Geographical focus?
Positron is US-based with a strategic focus that extends to sovereign AI buildouts, as evidenced by investment from the Qatar Investment Authority.
Why do we include them in the AI chip market?
Positron builds purpose-built server inference hardware for large ML model serving, selling directly to hyperscalers and AI infrastructure operators.
What is the company stage?
Positron is at an early growth stage, with its Atlas inference systems shipping and scaling toward broader deployment.
How much did they raise?
Positron raised $230M in this round, at a valuation of over $1B.
What round is it?
This was a Series B round.
Why did they raise?
Positron raised to speed up deployment and scale, and to fund the roadmap toward its next-generation inference silicon.
Cerebras Systems raised $1B in a Series H round in February 2026, reaching a $23B valuation.
When was it?
The deal was announced on February 3, 2026.
Who are they?
Cerebras Systems builds wafer-scale AI processors and full systems designed for both training and inference at data-center scale, with deployments spanning enterprise, research, and government customers globally.
Geographical focus?
Cerebras is globally focused, with multi-continent deployments across enterprise, government, and research segments.
Why do we include them in the AI chip market?
Cerebras is a core data-center AI accelerator company, with its wafer-scale processor being its primary product for training and inference in server clusters.
What is the company stage?
Cerebras is at a late stage and scaling rapidly, making this one of the largest rounds ever raised by a pure-play AI chip company.
How much did they raise?
Cerebras raised $1B in this round, at a post-money valuation of approximately $23B.
What round is it?
This was a Series H round.
Why did they raise?
Cerebras raised to accelerate the scaling of its wafer-scale infrastructure to meet surging demand for both training and inference compute.

In our AI chip market deck, we identify pain points entrepreneurs should prioritize
Neurophos raised $110M in a Series A round in January 2026 to advance its photonic AI inference chip.
When was it?
The deal was announced on January 22, 2026.
Who are they?
Neurophos is a US-based startup developing photonic AI accelerator chips that process computations with light, with the goal of dramatically reducing the energy cost of AI inference in data centers.
Geographical focus?
Neurophos is US-based and targeting global data-center operators who face growing power constraints.
Why do we include them in the AI chip market?
Neurophos is building a data-center inference accelerator, with its photonic chip designed to replace traditional electron-based compute for AI model serving workloads.
What is the company stage?
Neurophos is in an R&D to early productization stage, with photonic compute still in the pre-volume phase.
How much did they raise?
Neurophos raised $110M in this round.
What round is it?
This was a Series A round.
Why did they raise?
Neurophos raised to push its photonic accelerator development toward deployable systems as power constraints on traditional silicon continue to tighten.
Fractile raised $22.5M in early-stage financing in January 2026 to build its LLM inference chip.
When was it?
The deal was announced on January 15, 2026.
Who are they?
Fractile is a UK-based startup designing AI inference chips and systems intended to run frontier LLM inference far more efficiently than current GPU-based stacks.
Geographical focus?
Fractile is UK-based, operating in the European AI chip ecosystem with backing from transatlantic investors including the NATO Innovation Fund.
Why do we include them in the AI chip market?
Fractile's core product is data-center LLM inference acceleration hardware, designed to serve as a more efficient alternative to GPU clusters for AI inference workloads.
What is the company stage?
Fractile is at an early, pre-product stage, using this capital to build toward its first inference chip deliverables.
How much did they raise?
Fractile raised $22.5M in this round.
What round is it?
The round is reported as early-stage financing, described variously as venture funding or a convertible note depending on the source.
Why did they raise?
Fractile raised to fund the next phase of building its inference-first silicon as inference becomes the dominant production cost in AI.

In our AI chip market deck, we will give you useful market maps and grids
Etched.ai raised $500M in January 2026 at a reported $5B valuation for its transformer-optimized inference ASIC.
When was it?
The deal was reported in mid-January 2026, with coverage published on January 19, 2026.
Who are they?
Etched.ai is building Sohu, a specialized Transformer-focused data-center inference ASIC designed to run transformer-based model serving faster and more cheaply than GPU clusters.
Geographical focus?
Etched.ai is US-based and targeting the global data-center inference market.
Why do we include them in the AI chip market?
Etched.ai is explicitly building a data-center inference accelerator ASIC for transformer workloads, with hyperscalers and inference infrastructure operators as its target customers.
What is the company stage?
Etched.ai is pre-product and pre-deployment, with the chip still in active development.
How much did they raise?
Etched.ai raised $500M in this round, at a reported valuation of approximately $5B.
What round is it?
The round is not consistently labeled in coverage and is described as a general funding round.
Why did they raise?
Etched.ai raised to fund development and commercialization of its transformer-optimized inference ASIC as the industry shifts toward inference scaling.

In our AI chip market deck, we answer all the common questions from investors and entrepreneurs
d-Matrix raised $275M in a Series C round in November 2025 to scale its AI inference accelerator platform.
When was it?
The deal was announced on November 12, 2025.
Who are they?
d-Matrix sells a full-stack inference platform, combining its Corsair accelerator hardware with software, focused on high-throughput and low-latency LLM serving for hyperscale and enterprise customers.
Geographical focus?
d-Matrix is globally focused, with multiple offices and customers spanning hyperscale, enterprise, and sovereign AI operators.
Why do we include them in the AI chip market?
d-Matrix's flagship product is explicitly a data-center inference compute platform, making it a direct fit for the AI chip market under the inference accelerator category.
What is the company stage?
d-Matrix is in a growth and scaling stage, with large-scale inference deployments already underway.
How much did they raise?
d-Matrix raised $275M in this round, at a valuation of $2B.
What round is it?
This was a Series C round.
Why did they raise?
d-Matrix raised to advance its product roadmap, expand globally, and support multiple large-scale inference deployments as inference demand accelerates.
Mastiska raised $10M in a Seed round in November 2025 to build sovereign AI inference accelerators for the UAE and GCC.
When was it?
The deal was announced on November 27, 2025.
Who are they?
Mastiska is a UAE-based startup building sovereign, data-center-class AI inference accelerators, starting with FPGA-based inference cards while developing a longer-term custom chip roadmap for GCC sovereign customers.
Geographical focus?
Mastiska is focused on the UAE and broader GCC region, specifically targeting sovereign AI infrastructure buyers and state-linked customers.
Why do we include them in the AI chip market?
Mastiska's explicit product is a data-center inference accelerator, designed for sovereign AI infrastructure deployment in server environments.
What is the company stage?
Mastiska is at a seed and early stage, building its first commercial FPGA inference cards and laying out its sovereign silicon chip roadmap.
How much did they raise?
Mastiska raised $10M in this round.
What round is it?
This was a Seed round.
Why did they raise?
Mastiska raised to fund the development of its sovereign silicon platform, launch initial FPGA inference products, and build out a UAE-based fabless chip company.

In our AI chip market deck, we have designed useful charts to give you full market clarity
Rebellions raised $250M in a Series C round in September 2025 to accelerate mass production of its AI inference chips.
When was it?
The deal was announced on September 30, 2025.
Who are they?
Rebellions is a South Korean AI chip company building energy-efficient inference accelerators using a chiplet-based architecture for high-efficiency model serving in data centers.
Geographical focus?
Rebellions is South Korea-based with global ambitions, targeting AI infrastructure operators across Asia and beyond.
Why do we include them in the AI chip market?
Rebellions is explicitly an AI inference accelerator company, building data-center-deployed inference SoCs for AI infrastructure operators.
What is the company stage?
Rebellions is in a growth and scale-up stage, using this Series C to accelerate production of its inference chip products.
How much did they raise?
Rebellions raised $250M in this round, at a valuation of $1.4B.
What round is it?
This was a Series C round.
Why did they raise?
Rebellions raised to accelerate mass production and scale its energy-efficient inference infrastructure for a growing roster of data-center customers.
Arago raised $26M in a Seed round in July 2025 to commercialize its photonic AI inference chip for data centers.
When was it?
The deal was announced on July 8, 2025.
Who are they?
Arago is a French startup with a presence in both Paris and Silicon Valley, developing a photonic AI accelerator that processes core inference math with light to dramatically cut energy use in data centers.
Geographical focus?
Arago has a France and US footprint and is targeting AI data-center operators globally as power efficiency becomes a critical bottleneck.
Why do we include them in the AI chip market?
Arago is building a data-center inference accelerator, with its photonic chip designed to replace traditional silicon for AI inference workloads in server environments.
What is the company stage?
Arago is at a seed and prototype stage, in the deep-tech R&D phase of bringing its photonic inference chip to market.
How much did they raise?
Arago raised $26M in this round.
What round is it?
This was a Seed round.
Why did they raise?
Arago raised to accelerate the commercialization of its photonic inference hardware as data-center power and efficiency become the defining bottleneck for AI scaling.
Related blog posts
- What is the latest update in the AI chip market?
- What is the latest news in the AI chip market?
- What is the real market size of the AI chip market?
- Evolution of the funding activity in the AI chip market
Who is the author of this content?
NEW MARKET PITCH TEAM
We track new markets so founders and investors can move faster. We build living “market pitch” documents for emerging markets: from AI to synthetic biology and new proteins. Instead of digging through outdated PDFs, random blog posts, and hallucinated LLM answers, our clients get a clean, visual, always-updated view of what’s really happening. We map the key players, deals, regulations, metrics and signals that matter so you can decide faster whether a market is worth your time. Want to know more? Check out our about page.
How we created this content 🔎📝
At New Market Pitch, we kept seeing the same problem: when you look at a new market, the data is either missing, paywalled, or buried in 300-page reports that feel like they were written in the 80s. On the other side, LLMs and random blog posts give you confident answers with no sources, and sometimes they just make things up. That’s not good enough when you’re about to invest real money or launch a company.
So we decided to fix the experience. For each market we cover, we build a structured database and update it on a regular basis. We track funding rounds, fund memos, M&A moves, partnerships, new products, policy changes, and the real activity of startups and incumbents. Then we turn all of that into a clear “market pitch” that shows where the opportunities are and how people actually win in that space.
Every key data point is checked, sourced, and put back into context by our team. That’s how we can give you both speed and reliability: fast coverage of new markets, without the usual guesswork.