What is the real market size of the AI chip market?
Download our beautiful pitch about the AI chip market

In our AI chip market deck, you will find everything you need to understand the market
The AI chip market is growing faster than almost any semiconductor category in history.
Data center operators are spending hundreds of billions on accelerators to power AI training and inference.
And if you want to better understand this new industry, you can download our pitch covering the AI chip market.
Insights
- The AI chip market reached roughly $123 billion in 2024 and will grow to approximately $240 billion in 2026, driven by massive hyperscaler investments in training and inference infrastructure.
- Custom ASICs from hyperscalers are capturing market share, but merchant GPUs still dominate with 75% revenue share in 2026, given their flexible deployment and strong software ecosystem advantages.
- Growth rates are peaking in 2026 before moderating, with market expansion shifting from explosive training buildouts to sustained inference deployments across enterprise and consumer applications.
- A single quarter of GPU and accelerator sales hit $54 billion in Q2 2024, showing the extreme velocity of AI infrastructure spending by major cloud providers.
- Power and cooling constraints will increasingly limit deployment scale, making efficiency improvements and workload optimization critical for continued market growth through 2036.
- North America currently captures 50% of global AI chip deployments, but Asia's share will rise to 40% by 2036 as sovereign AI infrastructure expands across China, Japan, and Southeast Asia.
- AI servers now represent over 70% of total server industry value, fundamentally reshaping data center economics and accelerating the transition to AI-first infrastructure design.
- Intel's Gaudi accelerator missed its $500 million revenue target in 2024, highlighting how late entrants face massive software ecosystem and adoption barriers in this winner-take-most market.
- Broadcom's custom AI accelerator revenue jumped to $5.2 billion quarterly, demonstrating how hyperscaler-specific silicon is becoming a major revenue stream separate from merchant GPU sales.
- The market will reach approximately $747 billion by 2036 under realistic growth scenarios, assuming inference workloads drive sustained demand and supply chain bottlenecks resolve successfully.
How do we define the AI chip market?
We define the AI chip market as data-center accelerators whose primary purpose is to run AI workloads for training and inference.
We include GPUs, TPUs, and other AI accelerators or ASICs sold for deployment in servers used to train or serve machine-learning models.
We exclude general-purpose CPUs, networking and memory components, and endpoint or edge chips in phones, PCs, cars, and IoT devices.
We also use this definition when we build and update our pitch covering everything there is to know about the AI chip market.

In our AI chip market deck, we will give you useful market maps and grids
What is the size of the AI chip market in 2026?
What results can we find on the internet?
As you probably know, many firms regularly publish estimates of the AI chip market size, and these often conflict because each uses different definitions, scopes, and years.
We have consolidated their results below and will use them, among other things, to derive a single, reasonable estimate of the market size.
| Research Firm | Market Size | Year | Market Definition & Fit |
|---|---|---|---|
| Omdia | $207B | 2025 | AI data center chip market including GPUs and AI accelerators. Slightly broader than our definition as it may include supporting silicon components. |
| Omdia | $123B | 2024 | Same AI data center chip definition as above. Very close to our scope but still slightly broader in component coverage. |
| MarketsandMarkets | $170.81B | 2025 | Data center accelerator market including GPU, CPU, ASIC, and FPGA. Broader than our definition because it includes CPU accelerators and non-AI use cases. |
| Grand View Research | $17.67B | 2024 | Data center accelerator market estimate. Much narrower than our definition and inconsistent with current AI spending levels. |
| Global Market Insights | $8.1B | 2023 | Data center accelerator market. Too narrow compared to reality and likely excludes most AI GPU revenue. |
| Dell'Oro Group | $54B (quarter) | Q2 2024 | Quarterly revenues for GPUs and accelerators including custom chips. Close to our definition but time-sliced to one quarter only. |
| Futurum Group | $32.6B (quarter) | Q4 2024 | AI processors and accelerators in data centers for one quarter. Close to our scope but may include some non-accelerator AI processors. |
| BIS Research | $36.0B | 2024 | Data center GPUs only. Narrower than our definition because it misses TPUs and custom ASICs. |
| Precedence Research | $16.94B | 2024 | Data center GPU market estimate. Narrower than our definition and seems low versus vendor revenues. |
| Emergen Research | $10.5B | 2024 | Data center AI chips market. Far too narrow and likely excludes most AI GPU revenue. |
What can we conclude, then?
Omdia provides the most reliable estimate at $207 billion for 2025, matching our definition of AI data center accelerators for training and inference workloads.
For 2026, we estimate approximately $240 billion based on a moderate 16% growth rate, reflecting Omdia's observation that spending share peaks this year before growth moderates. This is our first estimate, and we will refine it further using bottom-up calculations.

In our AI chip market deck, we have collected signals proving this market is hot right now
What if we try to make our own estimate?
We don't have to rely only on external analyses to estimate market size.
We will try to build a first-principles, bottom-up calculation, then run a few sanity checks to see whether we can reliably estimate the size of the AI chip market.
Useful data about the AI chip market
Here is some useful and reliable data we have collected; it will help us estimate the size of the AI chip market:
- Omdia estimates AI data center chip market reached $123 billion in 2024 (Omdia)
- Omdia projects AI data center chip market will reach $207 billion in 2025 (Omdia)
- Dell'Oro reported GPU and accelerator sales hit $54 billion in Q2 2024 alone (Dell'Oro Group)
- Futurum measured data center AI processors and accelerators at $32.6 billion in Q4 2024 (Futurum Group)
- TrendForce estimates AI servers represented approximately $205 billion of the $306 billion server industry in 2024 (TrendForce)
- TrendForce projects AI server value will reach approximately $298 billion in 2025 (TrendForce)
- IDC reports accelerated servers accounted for 70% of AI infrastructure spending in the first half of 2024 (IDC)
- AMD generated $12.6 billion in data center segment revenue during 2024, driven by Instinct and EPYC (AMD)
- AMD's Instinct accelerator revenue reached approximately $5 billion in 2024 according to reports (Data Center Dynamics)
- NVIDIA's data center revenue hit $51.2 billion in Q3 fiscal 2026 alone (NVIDIA Investor Relations)
- Broadcom reported AI revenue of $5.2 billion in Q3 fiscal 2025 from custom accelerators (Broadcom)
- Dell'Oro estimates US hyperscalers deployed 5 million AI training-capable accelerators during 2024 (Dell'Oro Group)
Method and calculation to get the size of the AI chip market
We start with TrendForce's estimate that AI servers will reach $298 billion in 2025.
In modern AI servers, accelerator chips typically represent the largest cost component. For training servers with multiple high-end GPUs, accelerators dominate the bill of materials.
A conservative estimate across both training and inference configurations suggests accelerators represent roughly 70% of AI server value. This aligns with IDC's observation that accelerated configurations dominate AI infrastructure spending.
Applying this ratio to 2025 AI server value gives us approximately $209 billion for accelerator chips. This matches almost perfectly with Omdia's independent estimate of $207 billion for 2025.
For 2026, Omdia indicates spending share peaks this year before moderating. This suggests continued growth but at a slower pace than 2023 through 2025.
We apply a 15% step-up from 2025 to 2026, reflecting ongoing hyperscaler buildouts but acknowledging the slowdown. This yields approximately $238 billion, which we round to $240 billion for the AI chip market in 2026.
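To make the arithmetic transparent, here is a minimal sketch of the bottom-up calculation above. The TrendForce figure comes from our data list; the 70% accelerator share and the 15% step-up to 2026 are the working assumptions stated in the text, not measured values.

```python
# Bottom-up estimate of the AI chip market, in USD billions.
ai_server_value_2025 = 298.0   # TrendForce: AI server value in 2025
accelerator_share = 0.70       # assumed share of AI server value that is accelerator chips
step_up_2026 = 1.15            # assumed growth from 2025 to 2026

accelerators_2025 = ai_server_value_2025 * accelerator_share   # ~208.6
accelerators_2026 = accelerators_2025 * step_up_2026           # ~239.9

print(f"2025 accelerator estimate: ${accelerators_2025:.0f}B")  # ~$209B
print(f"2026 accelerator estimate: ${accelerators_2026:.0f}B")  # ~$240B
```

The ~$209 billion intermediate result is what lands so close to Omdia's independent $207 billion figure for 2025, which is why we trust the ratio.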
Sanity checks
Let's verify this estimate makes sense (we always double-check everything, as you will see in our pitch deck covering the AI chip market).
Dell'Oro reported $54 billion in GPU and accelerator sales in just Q2 2024. If we annualize this single quarter, we get roughly $216 billion as a 2024 run rate, making $240 billion for 2026 entirely plausible.
Looking at vendor revenues, AMD generated approximately $5 billion in accelerator revenue while NVIDIA's data center segment is far larger. A total market in the hundreds of billions fits the vendor landscape and massive hyperscaler deployment plans.
Dell'Oro estimates US hyperscalers deployed 5 million training-capable accelerators in 2024. Even at a conservative blended price across training and inference chips, this deployment scale supports a market measured in hundreds of billions.
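The two sanity checks above can be reproduced in a few lines. The quarterly sales and unit counts are from Dell'Oro; the $25,000 blended price per accelerator is purely our illustrative assumption, since real prices range from low-cost inference chips to six-figure training GPUs.

```python
# Sanity checks on the ~$240B estimate for 2026, in USD billions.
q2_2024_sales = 54.0                    # Dell'Oro: GPU + accelerator sales, Q2 2024
run_rate_2024 = q2_2024_sales * 4       # naive annualization of one quarter

units_2024 = 5_000_000                  # Dell'Oro: US hyperscaler deployments, 2024
blended_price = 25_000                  # assumed blended price per accelerator (USD)
implied_spend = units_2024 * blended_price / 1e9

print(f"Annualized 2024 run rate: ${run_rate_2024:.0f}B")          # $216B
print(f"Implied US hyperscaler spend: ${implied_spend:.0f}B")      # $125B
```

Even with this conservative assumed price, US hyperscalers alone imply well over $100 billion in 2024, so a global market in the hundreds of billions is consistent.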
What's our final guess then?
Based on all the evidence, we estimate the AI chip market will reach approximately $240 billion in 2026.
This represents about 16% growth from the $207 billion level in 2025. The AI chip market in 2026 is comparable to the global automotive semiconductor market, which stands at roughly $250 billion.
Our estimate aligns with both top-down analysis from AI server economics and bottom-up validation from vendor revenues and deployment data. The convergence of multiple independent approaches gives us confidence in this figure.
The AI chip market in 2026 exceeds the entire global memory chip market of just a few years ago. This reflects the unprecedented scale of infrastructure investment by hyperscalers and enterprise customers.
Growth is moderating from the explosive 2023 to 2024 period, but $240 billion still represents massive year-over-year expansion. The AI chip market remains one of the fastest-growing segments in semiconductors.

In our AI chip market deck, we provide the data and the context to understand it
Is the AI chip market mature, competitive, fragmented?
The maturity score of the AI chip market in 2026 is 55/100
The AI chip market in 2026 shows moderate maturity with massive demand but rapidly changing product cycles. Interconnect standards, HBM memory generations, and advanced packaging techniques evolve every 12 to 18 months.
Customer requirements continue shifting between merchant GPUs and custom ASICs, creating uncertainty for suppliers. Omdia expects custom ASICs to gain market share, signaling the buying landscape is still in flux rather than settled.
The competitiveness score of the AI chip market in 2026 is 80/100
The AI chip market in 2026 is intensely competitive at the high end across performance, power efficiency, networking, and software ecosystems. Hyperscalers push custom silicon development, with Broadcom's custom accelerator revenue growing rapidly to over $5 billion quarterly.
Competition extends beyond chip specifications to total cost of ownership, deployment speed, and developer experience. Intel's Gaudi struggles illustrate how software maturity and ecosystem support determine competitive outcomes as much as raw hardware performance.
The fragmentation score of the AI chip market in 2026 is 35/100
The AI chip market in 2026 shows low fragmentation with revenue concentrated among NVIDIA and hyperscaler ASIC ecosystems. Many challengers exist, but few have achieved significant shipped revenue at scale.
Intel's Gaudi missing its $500 million target in 2024 demonstrates the gap between established leaders and new entrants. Market concentration reflects winner-take-most dynamics driven by software lock-in and massive R&D requirements for competitive products.
How much bigger will the AI chip market be in 10 years?
What are the different forecasts for the growth rate of the AI chip market?
One more time, let's check what other market research firms have to say.
| Research Firm | Growth Rate | Until Year | Commentary and Adjustments |
|---|---|---|---|
| MarketsandMarkets | 16.9% CAGR | 2030 | Data center accelerator market including CPU and non-AI workloads. Broader than our definition, so we treat this as a lower benchmark. The scope dilutes AI-specific growth with slower-growing segments. |
| Grand View Research | 24.7% CAGR | 2030 | Data center accelerator market with likely narrow baseline definition. Use directionally for growth trends but not absolute levels. Their starting market size appears inconsistent with current spending. |
| Omdia | 6.7% CAGR | 2030 | AI data center chip market from $207B in 2025 to $286B in 2030. Captures post-boom slowdown explicitly, making it highly relevant for 2026 and beyond planning. Their conservative view reflects market maturation. |
| Precedence Research | 27.52% CAGR | 2034 | Data center GPU market only, excluding custom ASICs and TPUs. Use as signal that GPU segment could outgrow total market. Merchant GPU demand may stay strong despite custom chip growth. |
| BIS Research | 23.33% CAGR | 2034 | Data center GPUs only, useful for merchant GPU scenario planning. Higher growth reflects continued strong demand for flexible, software-mature GPU platforms. Does not account for custom ASIC substitution. |
| Global Market Insights | 25% CAGR | 2032 | Broader accelerator definition including multiple chip types and use cases. Treat as upper band for early years when AI drives most growth. Later years likely see growth moderate as market matures. |
| Future Market Insights | 25.0% CAGR | 2035 | Likely includes multiple accelerator types beyond pure AI workloads. Use cautiously as baseline likely differs from our definition. Long forecast horizon increases uncertainty around sustained growth assumptions. |
What can we conclude about the growth rate of the AI chip market?
Based on the evidence, we estimate the AI chip market will grow at approximately 12% CAGR from 2026 through 2036.
This falls between Omdia's conservative 6.7% post-boom estimate and the 20% to 25% rates that likely extrapolate peak growth too far forward. Our 12% rate assumes inference expansion drives sustained demand even as training buildouts moderate.
The AI chip market in 2026 starts at $240 billion and should reach approximately $377 billion by 2030, representing a 1.57 times multiple. By 2036, the market could grow to roughly $747 billion, or 3.11 times the 2026 base.
This growth trajectory resembles platform shift markets like cloud infrastructure buildout, but with constraints from power availability and capital expenditure limits. The AI chip market growth rate exceeds mature semiconductor segments but stays below the unsustainable boom-era pace.
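The multiples quoted above follow directly from compounding the 2026 base at our assumed 12% CAGR; the sketch below shows the arithmetic. Small differences versus the quoted dollar figures (e.g. $747 billion) come from rounding conventions.

```python
# Compounding the 2026 base at the assumed realistic CAGR, in USD billions.
base_2026 = 240.0
cagr = 0.12

market_2030 = base_2026 * (1 + cagr) ** 4    # 4 years of compounding, ~377.6
market_2036 = base_2026 * (1 + cagr) ** 10   # 10 years of compounding, ~745.4

print(f"2030: ${market_2030:.0f}B, {market_2030 / base_2026:.2f}x the 2026 base")
print(f"2036: ${market_2036:.0f}B, {market_2036 / base_2026:.2f}x the 2026 base")
```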
And if you're curious about what's happening in this really interesting market, we publish a quarterly update on the activity in the AI chip market here. We also have a monthly update here.

In our AI chip market deck, we identify risks investors and builders need to be aware of
What is the projected CAGR for the AI chip market?
At New Market Pitch, we like it when the information is clear and easy to digest, as you will see in the pitch about the AI chip market. That's also why we have made this clear summary table.
| Year | Worst Case (5% annual growth) | Realistic (12% annual growth) | Best Case (18% annual growth) |
|---|---|---|---|
| 2027 | $252B | $269B | $283B |
| 2028 | $265B | $301B | $334B |
| 2029 | $278B | $337B | $394B |
| 2030 | $292B | $377B | $465B |
| 2031 | $307B | $423B | $549B |
| 2032 | $322B | $473B | $648B |
| 2033 | $338B | $530B | $764B |
| 2034 | $355B | $594B | $902B |
| 2035 | $373B | $665B | $1,064B |
| 2036 | $392B | $747B | $1,255B |
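For readers who want to rebuild or tweak this table, here is a short sketch that regenerates it from the $240 billion 2026 base. Values may differ from the table by a billion or two depending on whether you round each year or only at the end.

```python
# Regenerate the scenario table from a $240B base in 2026, in USD billions.
base_2026 = 240.0
scenarios = {"worst": 0.05, "realistic": 0.12, "best": 0.18}

for year in range(2027, 2037):
    n = year - 2026  # years of compounding since 2026
    row = {name: round(base_2026 * (1 + rate) ** n)
           for name, rate in scenarios.items()}
    print(year, row)
```

Changing the rates in `scenarios` lets you test your own growth assumptions against the same base.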
What would it take for the AI chip market to be worth $1.3 trillion?
Reaching $1.3 trillion in the AI chip market by 2036 requires inference workloads to become the dominant spending driver, not just periodic training cluster refreshes. Enterprises must deploy AI features across billions of users, creating sustained accelerator demand that grows faster than efficiency improvements reduce costs.
Power infrastructure needs major breakthroughs in grid capacity, on-site generation, and advanced cooling technologies. Without solving power constraints, hyperscalers cannot deploy enough accelerators to reach trillion-dollar market scale regardless of chip availability.
Supply chain bottlenecks in HBM memory, advanced packaging, and substrate production must resolve smoothly. The AI chip market cannot triple without proportional scaling in these supporting technologies that currently limit production capacity.
Custom ASICs must expand total AI chip market usage instead of merely lowering costs per operation. This happens when cheaper inference enables new applications that would be economically impossible with current merchant GPU pricing.
AI adoption must broaden globally beyond US hyperscalers into sovereign cloud infrastructure, enterprise on-premises clusters, and emerging market deployments. Asia's AI chip market share needs to expand aggressively as China, Japan, and Southeast Asian countries build domestic AI capabilities.
Regulatory environments must support continued large-scale AI infrastructure investment without imposing capacity constraints. Export controls, data sovereignty requirements, or environmental restrictions could fragment the AI chip market and slow aggregate growth.
Merchant GPU platforms need sustained competitive intensity to prevent monopoly pricing that would cap total market expansion. Multiple viable vendors competing on price and performance keeps AI chip market growth maximized through volume rather than margin.

In our AI chip market deck, we answer all the common questions from investors and entrepreneurs
Where is the money in the AI chip market?
What are the categories and how much do they generate?
Merchant GPUs from companies like NVIDIA and AMD dominate the AI chip market in 2026, capturing approximately 75% of revenue. These chips offer the fastest deployment timelines and most mature software ecosystems, making them the default choice for most buyers.
Hyperscaler TPUs and internal ASICs represent roughly 20% of the AI chip market in 2026. Companies like Google, Amazon, and Meta deploy these custom chips at massive scale within their own infrastructure for cost optimization and supply control.
Other inference ASICs and smaller vendors account for the remaining 5% of AI chip market revenue in 2026. This category includes specialized chips from startups and established players trying to capture inference workload share, but revenue remains small despite technical progress.
The 75-20-5 split in the AI chip market reflects the trade-off between deployment flexibility and long-term optimization. Merchant GPUs win on speed and ecosystem support, while custom ASICs require years of development but deliver better economics at hyperscale.
How will it evolve?
Custom silicon grows steadily as hyperscalers optimize costs and control supply chains. By 2030, merchant GPUs will represent approximately 65% of AI chip market revenue, hyperscaler ASICs will reach 30%, and other vendors will hold 5%.
By 2036, merchant GPUs decline to roughly 55% of the AI chip market, hyperscaler ASICs expand to 40%, and specialized inference chips remain at 5%. Custom chips gain share but merchant platforms remain dominant because they serve the long tail of AI deployments.
Where to spend your energy as an investor or a builder in the AI chip market then?
Builders should focus on inference cost reduction through tooling, optimized kernels, efficient compilers, and improved serving stacks. These software layers multiply the value of existing AI chip market hardware rather than requiring new silicon development.
Multi-vendor portability tools that help workloads run efficiently across different GPUs and ASICs address a major pain point. The AI chip market is fragmenting between platforms, creating demand for abstraction layers that preserve flexibility.
Investors should target supply chain bottlenecks in advanced packaging, HBM memory ecosystems, and specialized substrates. The AI chip market growth depends on these enabling technologies scaling faster than current capacity allows.
Networking infrastructure tied directly to AI accelerators represents significant investor opportunity. High-speed fabrics, specialized switches, and optical interconnects are essential for the AI chip market clusters but often overlooked compared to chips themselves.
And if you're curious about where investors are putting their money right now, we publish a quarterly update on the fundraising activity in the AI chip market here. We also analyze long-term funding trends in the AI chip market here.

In our AI chip market deck, we track adoption trends and shifts in consumer behavior
What is the geographical revenue breakdown for the AI chip market?
North America
North America captures approximately 50% of AI chip market revenue in 2026, driven by massive hyperscaler deployments from Amazon, Microsoft, Google, and Meta. This share declines to roughly 45% by 2030 and 40% by 2036 as other regions build AI infrastructure capacity.
The decline reflects AI adoption spreading globally rather than North American demand shrinking. The AI chip market in North America continues growing in absolute terms, but Asia and Europe accelerate faster from lower bases.
Asia
Asia represents about 30% of AI chip market revenue in 2026, including significant deployments in China despite export restrictions. This share expands to approximately 35% by 2030 and 40% by 2036 as sovereign AI capabilities develop.
China, Japan, South Korea, and Southeast Asian countries are investing heavily in domestic AI infrastructure. The AI chip market growth in Asia accelerates as these nations prioritize technological independence and local compute capacity.
Europe
Europe accounts for roughly 15% of AI chip market revenue in 2026 and expands slightly to 17% by both 2030 and 2036. European AI infrastructure investment lags behind North America and Asia but maintains steady growth.
Regulatory frameworks like the EU AI Act shape deployment patterns in European markets. The AI chip market in Europe grows but faces headwinds from stricter data sovereignty requirements and slower hyperscaler expansion compared to other regions.
Rest of World
The rest of world captures approximately 5% of AI chip market revenue in 2026, declining to 3% by both 2030 and 2036. Limited power infrastructure and capital availability constrain AI deployments in emerging markets outside major Asian economies.
Some regions like Latin America and Africa see AI adoption primarily through cloud services rather than local chip deployments. The AI chip market concentration in developed regions persists through 2036 despite global AI application growth.

In our AI chip market deck, we have designed useful charts to give you full market clarity
Related blog posts
- What are the latest news in the AI chip market?
- What are the latest funding news in the AI chip market?
- What is the latest update in the AI chip market?
Who is the author of this content?
NEW MARKET PITCH TEAM
We track new markets so founders and investors can move faster. We build living “market pitch” documents for emerging markets: from AI to synthetic biology and new proteins. Instead of digging through outdated PDFs, random blog posts, and hallucinated LLM answers, our clients get a clean, visual, always-updated view of what’s really happening. We map the key players, deals, regulations, metrics and signals that matter so you can decide faster whether a market is worth your time. Want to know more? Check out our about page.
How we created this content 🔎📝
At New Market Pitch, we kept seeing the same problem: when you look at a new market, the data is either missing, paywalled, or buried in 300-page reports that feel like they were written in the 80s. On the other side, LLMs and random blog posts give you confident answers with no sources, and sometimes they just make things up. That’s not good enough when you’re about to invest real money or launch a company.
So we decided to fix the experience. For each market we cover, we build a structured database and update it on a regular basis. We track funding rounds, fund memos, M&A moves, partnerships, new products, policy changes, and the real activity of startups and incumbents. Then we turn all of that into a clear “market pitch” that shows where the opportunities are and how people actually win in that space.
Every key data point is checked, sourced, and put back into context by our team. That’s how we can give you both speed and reliability: fast coverage of new markets, without the usual guesswork.