CHIP INDUSTRY
A seismic shift in the world’s top ten chip maker rankings
In March 2023 we published an article listing the world’s top ten semiconductor manufacturers [i]. The piece below updates that listing, showing how the rankings have changed in the intervening years, and how and why the industry landscape has shifted since early 2023.
Between 2023 and 2025, the semiconductor industry underwent a significant shift propelled by advances in AI technology. This transformation has not only reshaped the rankings of the leading semiconductor manufacturers but also underscored the profound impact of generative AI since its breakthrough in late 2022.
Table 1 below presents a comparative overview of the world’s top ten semiconductor companies in March 2023 and July 2025. It reveals pronounced shifts in revenue and market position as companies adapted to, or capitalized on, AI innovations. Most notably, Nvidia has risen dramatically through the ranks, leveraging its GPU technology to transition from a traditional chip maker into a critical AI infrastructure provider, a shift largely propelled by the generative AI wave that followed the release of ChatGPT in November 2022. Meanwhile, Samsung Electronics, which did not appear in our original 2023 review, has been added at the top of both columns in recognition of its dominant position in the industry.
This table serves as a prelude to a deeper exploration of the specific technologies that have propelled each company to its current standing, and anticipates the beginnings of a broader AI-driven revolution within the semiconductor and technology sectors.
| Rank | Company (March 2023) | Revenue (USD b) | Exchange | Company (July 2025) | Revenue (USD b) | Country |
|---|---|---|---|---|---|---|
| 1 | Samsung Electronics | 196.71 | KRX | Samsung Electronics | 219.21 | S. Korea |
| 2 | Taiwan Semiconductor Manufacturing Co. Ltd. (TSMC) | 71.66 | NYSE | Nvidia Corp. | 148.51 | USA |
| 3 | Intel Corp. | 69.54 | NASDAQ | Taiwan Semiconductor Manufacturing Co. Ltd. (TSMC) | 88.34 | Taiwan |
| 4 | Qualcomm Inc. | 42.10 | NASDAQ | Broadcom Inc. | 57.04 | USA |
| 5 | Broadcom Inc. (AVGO) | 33.20 | NASDAQ | Intel Corp. | 53.04 | USA |
| 6 | Micron Technology Inc. | 30.76 | NYSE | SK Hynix | 50.71 | S. Korea |
| 7 | Nvidia Corp. | 28.57 | NASDAQ | Qualcomm Inc. | 42.28 | USA |
| 8 | Applied Materials Inc. | 25.79 | NASDAQ | Micron Technology | 33.81 | USA |
| 9 | ASE Technology Holding Co. Ltd. | 23.04 | NYSE | ASML Holding NV | 33.03 | Netherlands |
| 10 | Advanced Micro Devices | 22.83 | NYSE | Applied Materials Inc. | 28.08 | USA |

Notes:
1. Samsung didn’t appear in the original 2023 review. However, it has been added here, in its rightful place as No. 1 in the 2023 column, with USD196.71b revenue, according to Companiesmarketcap.com [ii].
2. The remainder of the 2023 column is from YCharts as of December 22, 2022, showing trailing twelve months’ (TTM) operations.
3. The 2025 column shows revenue from TTM operations 2024 – 5, according to Companiesmarketcap.com [iii].
Read on for a closer look at each of 2025’s top ten companies, and the technology that got them there.
Samsung Electronics
Samsung’s high annual revenue cannot be compared directly with that of other semiconductor manufacturers, as the company is heavily involved in more highly integrated products, particularly TVs and mobile phones. Nevertheless, in 2023 it was listed as the world’s biggest memory chip manufacturer, as well as the second-biggest firm in the semiconductor segment in terms of market capitalization [iv].
Factors driving Samsung’s growth in that year included orders for Qualcomm's low- to mid-range 5G chips, 5G modems, and 28nm OLED display drivers, plus fabrication of their own Exynos 2400 technology using a 4nm process. Samsung was the only company other than TSMC to have developed a 3nm chip manufacturing process, but it had yet to attract a major client for that process.
For Q1 2024, the company reported a 933 % annual increase in first-quarter operating profits, fueled by a recovery in chip prices from a post-pandemic slump and rising demand for artificial intelligence-driven products, marking a turnaround after six consecutive quarters of profit decline [v].
Technology drivers of the company’s further chip manufacturing growth from 2023 to 2025 include their ongoing development of 3nm and 2nm process nodes, aiming to rival TSMC in cutting-edge chip fabrication. Also, especially in 2024, Samsung ramped up its supply of high-bandwidth memory (HBM) chips, crucial for AI applications.
Samsung expanded its chip production at its largest plant in 2023, defying industry trends of scaling back. Its global market share rose to 10.5 %, surpassing Intel and other competitors.
Nvidia
While Samsung’s figures are extremely impressive, Nvidia’s rise over recent years has been meteoric, with revenue growing from USD28.57b in 2023 to USD148.51b in 2025. This was enough to move Nvidia from seventh to second position in the ranking table above and to dominate the tech landscape.
These results were fueled by the explosion in AI. As ChatGPT’s launch in 2022 triggered a surge in generative AI development, Nvidia was perfectly positioned to supply the computing power it needed. Their GPUs, especially the H100 and newer Blackwell architecture, are the backbone of AI model training and inference.
Additionally, their CUDA software platform created a developer ecosystem that’s nearly impossible for competitors to replicate. With over 5 million CUDA developers and 40,000 companies building on its platform, Nvidia’s tech is deeply embedded across industries [vi].
Nvidia’s chips are used by giants like Microsoft, Amazon, and Google to power cloud services and AI workloads.
Taiwan Semiconductor Manufacturing Co. Ltd. (TSMC)
TSMC’s drop from second to third place is due to Nvidia’s growth rather than any reduction in their own sales. In fact, their profits surged 61 % year-over-year in Q2 2025, driven by demand for chips under 7nm [vii].
As the world’s largest contract chip manufacturer, TSMC has benefited from the megatrend towards AI as it gains from producing advanced processors for clients including Apple – and Nvidia.
The company’s 2nm (N2) technology will enter mass production in late 2025, followed by A16 (1.6 nm) fabrication in late 2026 [viii]. A16 features backside power delivery networks (BSPDNs) for improved logic density and thermal management – ideal for AI and data center chips.
N2P fabrication is also expected; this enhances performance over standard N2 without adding the complexities associated with backside power delivery, making it well suited to client devices such as systems-on-chip (SoCs) for smartphones and entry-level PCs. N2X further boosts performance by supporting higher voltages, which may benefit a variety of applications, such as high-performance CPUs.
Further ahead, in 2028, TSMC plans to launch 1.4 nm technology, using second-gen gate-all-around (GAA) transistors.
Broadcom Inc.
GPUs dominate the conversation when it comes to AI infrastructure, but it's the interconnect fabrics that allow them to train and run multi-trillion-parameter models at scale [ix].
These interconnects span multiple domains, from die-to-die communications on the package itself, through the chips in a system, to the system-to-system networks that allow scaling to hundreds of thousands of accelerators.
Developing and integrating these interconnects is highly challenging. It's arguably the reason Nvidia is the powerhouse it is today. However, over the past few years, Broadcom has been developing technologies that range from scale-out Ethernet fabrics all the way down to the package itself.
Broadcom’s hardware implementation of complex interconnect systems is based on its high radix (high density, versatility, and performance) switching products. Companies like Meta, xAI, Oracle, and others deploy high numbers of GPUs, which call for similarly large numbers of switches for their interconnection. For example, a cluster of 128,000 accelerators might need 5,000 or more switches just for the compute fabric, and yet more may be required for storage, management, or API access.
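As a rough sanity check on those numbers, the following sketch estimates switch counts for a non-blocking, three-tier fat-tree (Clos) fabric. The topology, radix, and port-speed figures are illustrative assumptions, not a description of any particular Broadcom deployment.

```python
import math

def fat_tree_switch_count(num_accelerators: int, radix: int) -> dict:
    """Rough switch count for a non-blocking three-tier fat-tree (Clos) fabric.

    Illustrative only: `radix` is the number of usable ports per switch,
    e.g. a 51.2 Tbps switch exposing 128 x 400 GbE ports. Real deployments
    vary in topology, oversubscription, and rail count.
    """
    half = radix // 2
    hosts_per_pod = half * half                      # leaves/pod x downlinks/leaf
    pods = math.ceil(num_accelerators / hosts_per_pod)
    leaf = pods * half                               # radix/2 leaf switches per pod
    spine = pods * half                              # radix/2 pod-level spines
    core = math.ceil(pods * half * half / radix)     # pod uplinks / core switch radix
    return {"leaf": leaf, "spine": spine, "core": core, "total": leaf + spine + core}

print(fat_tree_switch_count(128_000, 128))
# {'leaf': 2048, 'spine': 2048, 'core': 1024, 'total': 5120}
```

With those assumptions, roughly 5,000 switches fall out of the arithmetic for the compute fabric alone, in line with the figure quoted above.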
To address this demand, Broadcom has introduced some extremely high radix switches, initially with its 51.2Tbps Tomahawk 5 chips in 2022 and, more recently, the 102.4Tbps Tomahawk 6 (TH6), available with either 1,024 100Gbps or 512 200Gbps SerDes (serializer/deserializer) lanes.
The company is also pioneering Scale-Up Ethernet (SUE) to rival Nvidia’s NVLink, allowing rack-scale AI systems to scale efficiently. And, alongside conventional Ethernet switching, their co-packaged optics (CPO) tech boosts bandwidth while slashing power consumption—3.5x more efficient than traditional pluggables. CPO takes the lasers, digital signal processors, and retimers normally found in pluggable transceivers and moves them onto the same package as the switch ASIC.
Unlike Nvidia, Broadcom deals in merchant silicon. Its chips and intellectual property are available to any buyer, although many of their implementations are not widely known. For example, Google's Tensor Processing Unit (TPU) probably made extensive use of Broadcom IP. Apple is also rumored to be developing server chips for AI using Broadcom designs.
Intel Corporation
The once-iconic semiconductor manufacturer’s revenue decline over the past few years has been driven by a mix of internal challenges and external pressures. For example, their foundry business lost USD13b in 2024 alone, as it ramped up next-gen chip production using extreme ultraviolet lithography (EUV) tools.
Additionally, while rivals like Nvidia and AMD capitalized on the generative AI surge, Intel lagged behind due to its limited presence in AI accelerators. Meanwhile, PC chip sales declined over 6 % in Q3 2024, as demand for traditional CPUs softened and AI-powered PCs failed to take off [x].
The company’s turnaround hinges on its upcoming 18A process node (to be used first in their Panther Lake CPU) and AI accelerators like Gaudi 3. Intel is focusing on on-device AI and agentic AI, smart-assistant software that acts autonomously on PCs and edge devices. They are also enhancing NPUs (neural processing units) in upcoming chips like Arrow Lake Refresh, aiming to support Copilot+ features.
Further ahead, with ongoing investment in agentic AI, edge computing, and AI PCs with real-time inference capabilities, Intel expects to introduce their Nova Lake CPU in 2026 – 7. This promises up to 50 cores, a 60 % IPC boost, and advanced AI features [xi].
The company is seeking to regain competitiveness in both foundry and product segments.
SK Hynix
SK Hynix may not be a household name, but it recently overtook Samsung as the world’s largest memory chip maker – an historic shift after four decades. This South Korean manufacturer now commands 70 % of the global high-bandwidth memory (HBM) market; HBM is a critical component for training and deploying next-generation AI models [xii].
For example, Nvidia’s Blackwell Ultra uses SK Hynix’s HBM3E chips, which deliver up to 2.4TB/s of bandwidth, a 50 % performance boost over previous generations. These chips are also essential for cloud giants like Amazon and Google to scale their AI capabilities.
High-bandwidth memory (HBM) is the backbone of modern AI computing. Unlike traditional DRAM, HBM stacks multiple layers of memory in a 3D configuration, enabling unprecedented data transfer speeds and energy efficiency. This makes it indispensable for training large language models and running complex AI workloads. SK Hynix's dominance in HBM stems from its early and aggressive investments in proprietary technologies, including its Mass Reflow-Molded Underfill (MR-MUF) packaging process. This innovation has solved long-standing yield and thermal challenges in HBM production, allowing SK Hynix to outpace rivals like Samsung and Micron.
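As a rough illustration of where HBM’s bandwidth comes from, the sketch below multiplies the two parameters that define a stack: interface width and per-pin data rate. The HBM3E figures used (a 1024-bit bus at roughly 9.6 Gb/s per pin) are representative assumptions rather than the spec of any particular SK Hynix part.

```python
def hbm_stack_bandwidth_gbs(bus_width_bits: int = 1024,
                            pin_rate_gbps: float = 9.6) -> float:
    """Peak bandwidth of a single HBM stack in GB/s.

    Assumes representative HBM3E figures: a 1024-bit interface running at
    ~9.6 Gb/s per pin. Actual products and speed grades vary.
    """
    return bus_width_bits * pin_rate_gbps / 8  # per-pin bit rate x width -> bytes/s

per_stack = hbm_stack_bandwidth_gbs()          # ~1229 GB/s per stack
print(f"one stack:  {per_stack:.0f} GB/s")
print(f"two stacks: {2 * per_stack / 1000:.2f} TB/s")  # ~2.46 TB/s, near the 2.4 TB/s cited above
```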
SK Hynix’s roadmap includes the HBM4 launch in 2025, which will feature customizable logic dies for application-specific integrated circuits (ASICs), catering to the unique needs of cloud providers and AI startups. This flexibility is critical as enterprises seek to optimize AI workloads for cost and performance.
Qualcomm Inc.
Qualcomm’s growth in recent years has been fueled by a strategic move towards on-device AI, automotive platforms, and edge computing. For example, their Snapdragon X Series powers AI PCs with real-time inference, enabling features like Copilot+ and local language models. Meanwhile, their NPUs (neural processing units) are optimized for low-latency, energy-efficient AI, making them ideal for smartphones, wearables, and industrial devices [xiii].
Snapdragon chipsets now support small language models directly on devices, reducing reliance on cloud infrastructure.
Qualcomm offers technologies optimized for various vertical markets. For automotive applications, their Snapdragon Ride Flex SoC integrates ADAS (advanced driver-assistance systems) and in-cabin experiences. And their AI Hub and Dragonwing Intelligent Video Suite support real-time analytics in logistics, manufacturing, and smart cities.
In other areas, Qualcomm’s Snapdragon XR platforms power over 100 immersive devices globally, including Meta’s Ray-Ban smart glasses, while Snapdragon 8 Elite powers flagship Android phones like Samsung’s Galaxy S25, driving demand for premium-tier mobile experiences.
Micron Technology
Micron Technology's Dynamic Random Access Memory (DRAM) revenues grew 51 % year over year in the third quarter of fiscal 2025, to USD7.1 billion. The DRAM segment, which accounted for 76 % of the company’s top line, is being propelled by growth in the data center, automotive, PC, and mobile end markets [xiv].
The company’s DRAM lineup is led by products including low-power server DRAM and high-bandwidth memory (HBM) chips, which are experiencing a massive surge in AI applications due to their higher capacity and low power consumption.
Micron highlighted robust momentum in the automotive end market, where advanced driver assistance systems and AI-powered in-vehicle infotainment systems are pushing memory and storage content higher. In the mobile end market, it is benefiting from growing demand for AI features, which is driving higher DRAM content in smartphones.
Micron’s technology roadmap is tightly focused on AI-optimized memory, high-capacity DRAM, and next-gen packaging. The aim is to scale memory for AI, from hyperscale data centers to edge devices. For example, MRDIMMs with 256GB+ capacity and 12800 MT/s speeds are expected by 2026 [xv].
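For context, a transfer rate in MT/s converts directly into peak module bandwidth. The quick sketch below assumes a conventional 64-bit DDR data path per module (excluding ECC), which may differ from actual MRDIMM configurations.

```python
def dimm_peak_bandwidth_gbs(transfers_mts: float, data_width_bits: int = 64) -> float:
    """Peak module bandwidth in GB/s = transfer rate x data-bus width in bytes.

    Assumes a conventional 64-bit DDR data path per module (ECC excluded);
    real MRDIMM configurations may differ.
    """
    return transfers_mts * (data_width_bits / 8) / 1000  # MT/s x bytes -> GB/s

print(dimm_peak_bandwidth_gbs(12_800))  # ~102.4 GB/s per module at 12800 MT/s
```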
ASML Holding NV
ASML’s growth over the past few years has been powered by its dominance in advanced lithography systems, which are essential for manufacturing the cutting-edge semiconductors enabling the AI boom. The company’s dominant position is sustained by its pioneering, highly proprietary technology [xvi].
Specifically, ASML is the sole supplier of EUV machines globally, used to print ultra-fine patterns on chips at 5nm and below. These systems use ultra-short-wavelength light to project circuit patterns onto silicon, enabling chips that are smaller, faster, and more efficient. EUV is an important technology for extending Moore’s Law, the long-standing industry trend of doubling transistor density approximately every two years, allowing chipmakers to continue pushing the limits of computing power and cost-effectiveness.
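Moore’s Law is easy to state as a formula. The short sketch below projects transistor density under the classic two-year doubling assumption; the starting density of 100 million transistors per mm² is purely illustrative.

```python
def projected_density(start_density: float, years: float,
                      doubling_period_years: float = 2.0) -> float:
    """Moore's-Law-style projection: density doubles every `doubling_period_years`."""
    return start_density * 2 ** (years / doubling_period_years)

# Illustrative: a node at 100 MTr/mm^2 today, projected six years out
print(projected_density(100, 6))  # 800.0 MTr/mm^2 (three doublings)
```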
ASML’s latest NXE:3800E and High-NA EUV systems enable even smaller nodes like 2nm and 1.6nm, critical for AI and HPC chips.
The company’s High-NA EUV systems use a 0.55 numerical aperture, improving resolution by 70 % over standard EUV. Their machines, which cost up to USD400 million each, are being adopted by Intel, TSMC, and Samsung for next-gen chip production. ASML’s tools are used in the production of Nvidia’s Blackwell GPUs, Apple’s M-series chips, and AMD’s MI300X accelerators.
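The quoted 70 % resolution gain follows from the Rayleigh criterion, which ties the smallest printable half-pitch to wavelength and numerical aperture. The sketch below uses EUV’s 13.5 nm wavelength and a nominal k1 factor of 0.3 as assumptions; real processes use different k1 values.

```python
def min_half_pitch_nm(na: float, wavelength_nm: float = 13.5, k1: float = 0.3) -> float:
    """Rayleigh criterion: smallest printable half-pitch = k1 * wavelength / NA.

    k1 = 0.3 is a commonly quoted practical value, used here as an assumption.
    """
    return k1 * wavelength_nm / na

std_euv = min_half_pitch_nm(na=0.33)   # ~12.3 nm half-pitch
high_na = min_half_pitch_nm(na=0.55)   # ~7.4 nm half-pitch
print(f"resolution gain: {std_euv / high_na:.2f}x")  # ~1.67x, i.e. roughly 70 % finer
```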
Applied Materials Inc.
Applied Materials has thrived over the past few years through continuous investment in the materials engineering innovations that power the semiconductor industry’s most advanced chips. It is the market leader in gate-all-around (GAA) transistor equipment, which is replacing FinFETs at leading-edge logic nodes like 3nm and 2nm. Its tools enable backside power delivery, a key innovation for AI chips that improves power efficiency and logic density [xvii].
The AI boom has driven demand for energy-efficient computing, and Applied Materials’ systems are critical for manufacturing chips like Nvidia’s Blackwell and AMD’s MI300X. Their materials-driven innovation is helping customers achieve a 10,000X improvement in computing performance-per-watt over 15 years.
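To put that figure in perspective, the one-liner below converts “10,000X over 15 years” into the implied compound annual improvement; this is simple arithmetic on the quoted claim, not a number from Applied Materials.

```python
# Compound annual rate implied by a 10,000x gain over 15 years
annual_gain = 10_000 ** (1 / 15)
print(f"{annual_gain:.2f}x per year")  # ~1.85x, i.e. roughly 85 % better perf-per-watt each year
```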
Applied Materials also has expertise in advanced packaging and hybrid bonding solutions. They have co-developed die-based hybrid bonding solutions with BE Semiconductor Industries, enabling faster interconnects and better yields. Their EPIC Center accelerates customer R&D cycles for packaging and transistor innovations [xviii].
Additionally, the company offers integrated equipment solutions. Their co-optimized and integrated platforms combine etch, deposition, and metrology tools to reduce time-to-market and improve yields.
These platforms are essential for DRAM and HBM production, where Applied Materials gained 10 points of market share over the last decade.
Overall, Applied Materials’ strategy is all about co-innovation, materials intensity, and energy-efficient computing.
We’re just at the start of the AI revolution
We have seen how AI has driven Nvidia’s explosive growth, as it became the first company ever to reach USD4 trillion in market value, a milestone reached just over two years after it passed USD1 trillion [xix].
However, the reviews above also show how AI growth has rippled out across the entire semiconductor manufacturing ecosystem. Beyond the GPUs and their development environments, companies across the supply chain are benefiting from the boom: suppliers of memory chips, CPUs, and on-chip AI, as well as of the technologies that enable smaller-geometry chips, such as interconnect fabrics, advanced lithography systems, advanced packaging and hybrid bonding, backside power delivery networks, and gate-all-around (GAA) transistor equipment.
Yet this is just the beginning for AI-driven technology. Beyond building AI infrastructures to replace traditional IT installations, the real transformation will be driven by agentic AI: autonomous AI systems that work like digital employees, handling entire workflows from start to finish.
While ChatGPT responds to prompts, an AI agent identifies problems, researches solutions, executes multistep plans and adapts its approach based on results, all while users focus on other priorities. It's the difference between a helpful assistant and a capable colleague.
The next logical step – and possibly the biggest opportunity – may be physical AI: embodied intelligence that thinks and acts, where robots and automated systems understand and manipulate the real world.
References
- [i] The world's largest semiconductor companies
- [ii] Top publicly traded semiconductor companies by revenue
- [iii] Top publicly traded semiconductor companies by revenue
- [iv] Samsung's semiconductor chip market share and revenue improved in Q3 2023 - SamMobile
- [v] Samsung Profit Soars 10-Fold As Chip Demand Rebounds Amid AI Boom
- [vi] Nvidia Just Hit $4 Trillion, But The Real AI Boom Hasn’t Started Yet
- [vii] TSMC profit surges 61 % to record high fueled by AI chip demand
- [viii] TSMC says 1.6nm node to be production ready in late 2026 — roadmap remains on track | Tom's Hardware
- [ix] How Broadcom is quietly invading AI infrastructure • The Register
- [x] Intel's biggest revenue decline in five quarters to hit amid broad layoffs and missed | Ctech
- [xi] Intel CEO reportedly admits 'it is too late for us' to catch AI leaders like Nvidia, but here's how it could still recover
- [xii] SK Hynix's Strategic Position in the AI Memory Boom and Its Implications for Long-Term Growth
- [xiii] Qualcomm Q1 FY 2025 Reflect Record QCT Revenue, AI Adoption - The Futurum Group
- [xiv] DRAM Demand Powers Micron Technology's Growth: Will the Momentum Last?
- [xv] Micron Plans HBM4E in 2028, 256GB DDR5-12800 RAM Sticks in 2026 | Tom's Hardware
- [xvi] Buy, Sell Or Hold ASML Stock?
- [xvii] RESEARCH NOTE: Applied Materials 2024 Results Showcase Continued Momentum in AI and Beyond - Moor Insights & Strategy
- [xviii] Applied Materials' R&D Investment is Climbing: Can it Deliver Results?
- [xix] Nvidia Just Hit $4 Trillion, But The Real AI Boom Hasn’t Started Yet