Artificial intelligence really is a paradigm-breaking, transformative technology. Right now, investors are so enthusiastic about the sector, especially the obvious leader Nvidia (NVDA), that we’re looking at a potential bubble–one that, if it breaks, will collapse with much gnashing of teeth and I-told-you-so “wisdom” casting doubt on the reality of the entire endeavor.
I think a bubble is indeed possible. Nvidia did trade at a trailing twelve-month price-to-earnings ratio of 196 on May 31, after all. But I think you do want to own the sector now–because the breaking of the bubble, if it does break, is, in my opinion, two quarters or more away. And you want to own the sector for the long run–say, 10 years or more–because it is such a game changer for so much of the economy.
But what to own?
I’ve put together a list of the 10 stocks that I think are the best way to participate in the AI gold rush.
In the first part of this post, I’m simply going to list the 10 stocks I recommend.
And then in the second part, I’m going to tell you why each of these stocks deserves a place on this list and in your portfolio.
So here’s Part 1, The List, of my “10 Stocks for the AI Gold Rush.”
I’ve divided the list into sections that reflect each company’s place in the AI universe. Remember, you can get more details on the “why” on my free site JubakPicks.com.
Let’s start with the Big Kahuna, Nvidia (NVDA), the chip company that’s at the core of all things AI at the moment.
And then I’ll add other chip stocks to this list:
Advanced Micro Devices (AMD)
Micron Technology (MU)
Two stocks from companies that will provide the machines and foundries to make the very complex chips required to do AI processing:
Applied Materials (AMAT)
Taiwan Semiconductor Manufacturing (TSM)
And then these two stocks from companies that straddle the line between chip stocks (they’re in the process of developing their own), hosting of AI servers to provide the massive amounts of processing that AI requires, and applications that then themselves use AI:
Microsoft (MSFT)
Alphabet (GOOG)
And then, finally, a selection of companies that will profit from using AI to improve existing products or invent entirely new ways to use massive amounts of data:
Illumina (ILMN)
CVS Health (CVS)
Emerson Electric (EMR)
Part 2: The “WHYs”
Nvidia (NVDA): If any stock deserves to be at the core of an AI portfolio, it’s Nvidia. Yeah, I know it’s incredibly expensive right now and I wouldn’t be surprised to see the price plunge sometime in 2024 when the company signals a bad quarter–you know, one where year-over-year growth is just 50% instead of the 100%+ expected for the rest of 2023–but the company is uniquely positioned to ride the key trend in AI. What’s that? Well, every advance in the applications of artificial intelligence–in something like ChatGPT or autonomous (or near autonomous) vehicle navigation or individual genomic screening for tailoring drugs to individual patients–fuels the hunger for more and faster processing of more data. (For example, in March Microsoft reported that it had to string together tens of thousands of Nvidia’s A100 GPUs in data centers in order to handle the computational workload in the cloud for OpenAI, ChatGPT’s developer.) Nvidia’s product roadmap for the quarters ahead–and probably for the years ahead–shows that, for the foreseeable future, this is the one company able to satisfy that hunger. (Not that there aren’t competitors visible on the horizon. But they are on the horizon. The most formidable, in my opinion, are Microsoft and Alphabet. More on this later.) For example, in March Nvidia launched two new chips, one focused on enhancing AI video performance (Adobe and Shutterstock have already said they will use the new Nvidia capabilities) and the other an upgrade to the company’s workhorse H100 chip. This last GPU chip, the H100 NVL, is designed specifically to improve the deployment of large language models like those used by ChatGPT. It performs 12 times faster when handling inferences–that is, how AI responds to real-life queries–than the prior generation of A100 chips used in data centers.
An overlooked announcement at the March Nvidia developers conference was a new program called DGX Cloud, hosted by Oracle and soon Microsoft Azure and Google Cloud, that would let companies “rent” AI processing power. The goal, the company said, is to make accessing an AI supercomputer as easy as opening a webpage, enabling companies to train their models without the need for on-premise infrastructure that’s costly to install and manage. “Provide your job, point to your data set, and you hit go–and all of the orchestration and everything underneath is taken care of,” said Manuvir Das, Nvidia’s vice president of enterprise computing. The DGX Cloud service will start at $36,999 per instance per month, with each “instance”–essentially the amount of computing power being rented–equal to eight H100 GPUs. Of course, what the company didn’t say is that recruiting Oracle, Microsoft, and Alphabet to host this service makes them partners–to a degree–instead of simply competitors. And that every company that uses DGX Cloud to run its AI works to make Nvidia the de facto standard for AI processing. Think the switching costs might be a significant deterrent to buying your GPUs from somebody other than Nvidia?
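To put that pricing in perspective, here’s a back-of-the-envelope sketch. The $36,999-per-instance figure and the eight-GPU instance size come from Nvidia’s announcement; the 30-day month is my simplifying assumption, so the hourly figure is illustrative, not an Nvidia quote.

```python
# Back-of-the-envelope cost of renting AI compute via DGX Cloud.
# $36,999/month buys one "instance" of eight H100 GPUs (per Nvidia's announcement).
instance_price_per_month = 36_999  # USD, from the announcement
gpus_per_instance = 8

per_gpu_per_month = instance_price_per_month / gpus_per_instance
per_gpu_per_hour = per_gpu_per_month / (30 * 24)  # assumes a 30-day month

print(f"Per H100 per month: ${per_gpu_per_month:,.2f}")
print(f"Per H100 per hour:  ${per_gpu_per_hour:,.2f}")
```

Roughly $4,625 per GPU per month, or a bit over $6 per GPU-hour–a useful yardstick for why “renting” beats building on-premise infrastructure for most companies.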
Advanced Micro Devices (AMD): Advanced Micro Devices is chasing Nvidia and I don’t see AMD catching Nvidia in the next generation or two of GPUs. But AMD is used to playing catch-up. The company chased Intel (INTC) for years before, I’d argue, essentially catching up and then surpassing that company’s technology. (It has helped that AMD has been able to deliver products, such as the 5-nanometer Genoa CPU chip, on time, while Intel has struggled to get smaller architectures out the door on schedule.) My recommendation of AMD isn’t based on it catching Nvidia anytime soon, although the company’s discrete GPUs have been taking modest market share from Nvidia in the gaming market, but on the fact that growth trends in artificial intelligence are so strong that even being No. 2 is a very lucrative position to occupy. And AMD has been making steady inroads into the cloud market, which suggests that it will be a solid competitor in the booming sector.
Micron Technology (MU): Memory chips get no–or very little–respect. They’re certainly not as flashy as the Nvidia GPUs that power artificial intelligence. But they are essential to the fast operation of AI. Nvidia’s AI chips inhale huge amounts of data in a single gulp, crunching numbers in one go, then spitting out the results all at once. But for this power advantage to be realized, they need the information to be fed into the computer quickly and without delay. That’s where memory chips come in. GPUs don’t read data directly from a hard drive–that’s too slow. The best choice would be to use temporary storage on the GPU chip itself. But there simply isn’t enough room. So, the second-best option is to use DRAM memory chips. And lots and lots of them. And the faster the AI processing GPU chip is, the more memory chips are required. Bloomberg estimates that for every high-end AI processor bought, as much as 1 terabyte of DRAM will be installed. That’s 30 times more DRAM than in a high-end laptop. The AI-driven demand for DRAM chips means that sales of DRAM chips for use in servers are set to outpace those installed in smartphones sometime this year, according to Taipei-based researcher TrendForce. Korea’s Samsung and SK Hynix and Idaho-based Micron together control 95% of the DRAM market. Micron shares have taken a beating lately from Chinese government restrictions on sales in China. I think that’s a buying opportunity.
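The Bloomberg figures above imply a striking ratio. A quick sketch, using only the two numbers cited in the text (1 terabyte per high-end AI processor, 30x a high-end laptop); the laptop figure is derived, not independently sourced:

```python
# Implied DRAM per device, from the estimates cited above.
ai_processor_dram_gb = 1 * 1024  # ~1 TB of DRAM per high-end AI processor (Bloomberg estimate)
ratio_vs_laptop = 30             # 30x the DRAM in a high-end laptop

laptop_dram_gb = ai_processor_dram_gb / ratio_vs_laptop
print(f"Implied high-end laptop DRAM: ~{laptop_dram_gb:.0f} GB")
```

That works out to roughly 34 GB per laptop–plausible for a high-end machine, which is a rough sanity check on the 30x multiplier.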
Applied Materials (AMAT): The logic here is very simple. AI chips aren’t a mature market. We’re still at the stage where chips have to get smaller, operate faster, and use energy more and more efficiently. In this technology, we’re seeing the same kind of arms race that has produced smaller and smaller, faster and faster, and increasingly energy-efficient chips for PCs and smartphones. And somebody has to make the machines that make those new chips. Applied Materials is the global leader in the market for chip-making machinery. (Across the board, I’d argue, in every segment except photolithography, where the company doesn’t compete.) This situation has created its own virtuous cycle where the company pulls in so much more revenue than competitors that it can afford a huge R&D budget–$2.7 billion a year–that enables it to stay ahead of the technology curve.
Taiwan Semiconductor Manufacturing (TSM): When it comes to making smaller, faster, more powerful chips, Taiwan Semiconductor is effectively the only game in town. (To be completely accurate, there are two games in town. But Samsung, the other player, sells its own chips and is not a viable alternative for fabless semiconductor companies looking for someone to manufacture their chips.) The last few generations of smaller, faster, and more powerful chips have pretty much eliminated the competition. The technology is hard and the capital costs are a huge barrier. A technology called FinFET used to make chips at 16/14 nanometers or smaller, eliminated many competitors. The next step is Gate-all-around, or GAA, which will take chips down to 2 nanometers. Taiwan Semiconductor is scheduled to implement GAA mass production in 2025. Samsung is on roughly the same schedule. Intel plans to adopt GAA in its chip-making. Morningstar estimates the R&D costs alone for GAA at $1 billion so you can see why so few chip makers are headed down this path. That leaves Taiwan Semiconductor as the biggest beneficiary of the next waves in AI chip design and manufacturing.
Microsoft (MSFT): Microsoft bridges a number of trends in AI. It is a major cloud computing services provider–#2 behind Amazon (AMZN). The company’s legacy software will get a sales boost from the addition of AI features. Microsoft has invested in OpenAI, the company behind ChatGPT, in an effort to add AI search, learning, and generative output to its Bing search engine. And the company is pushing ahead with plans to make its own chips for servers and cloud computing. As the recent partnership with Nvidia indicates, through its Azure platform, Microsoft is in a good position to pick up revenue from companies that want to add AI functionality but that don’t want to take on the expense of building out their own internal AI capability.
Alphabet (GOOG): I think recent market thinking that AI, and particularly Microsoft’s use of ChatGPT in its Bing search engine, is, on balance, a threat to Alphabet is just dead wrong. ChatGPT could enable Bing to pick up a few percentage points of market share–which would be a very profitable development for Microsoft–but Alphabet will clearly be able to create its own AI tool to add to Google search. What AI will do for Alphabet is to enable a company with access to huge amounts of personal data to slice and dice and apply that data to products that certainly include advertising but that are not likely to stop there. I can see Alphabet getting a piece of the AI action through speech recognition, video and image generation, and, of course, its Waymo autonomous vehicle effort. And then there’s the cloud business, where Alphabet, a strong #3, looks ready to turn the corner on profitability.
Illumina (ILMN): So what’s a company that makes machines for genomic sequencing doing on an AI list? Gee, think about it. Drug companies are moving toward therapies tailored to individual patients. All those folks need to be sequenced. Companies looking to find new drug candidates by looking at individual receptors and antagonists in the immune system need to find subtle patterns in a mountain of genetic data. Efforts to increase life spans and postpone the effects of chronic end-of-life diseases are based on studies trying to find genetic markers that correlate to sensitivity to these diseases. The embattled Grail acquisition is a key buy-or-not factor. The acquisition of the liquid cancer biopsy company is an incredible long-term opportunity for Illumina, but right now it is a huge drain on company cash–a $670 million operating loss in 2023–and it’s still not totally clear whether regulators will give the deal the go-ahead. (Somehow Illumina failed to get antitrust clearance before the acquisition.) And the company is probably looking at a few more years before the Food & Drug Administration approves Grail’s Galleri test. (That approval is key to getting reimbursement from insurers for the test.) This means Grail is either a gold mine or a huge hole in the ground absorbing earnings. I think Grail will eventually turn into a winner for Illumina–an early detection test for cancer seems a very attractive asset–but I can’t say I know for sure. Illumina is on this list for both its own AI prospects and as a placeholder for the explosion in AI efforts in the drug sector in general.
CVS Health (CVS): Here’s one of my general principles for approaching investing in AI: companies that have massive amounts of customer data and own businesses that can find ways to make money from finding patterns in that data are good candidates for big profits from AI. Which pretty much defines CVS Health. The data/marketing/revenue synergies between the company’s retail drug store business, its online pharmacy unit, and its Aetna insurance business seem, to me, to be the kind of opportunity where AI can be used to generate revenue and profit growth. Think about the ability not just to generate a reminder, as the company does now when a prescription is ready for a refill, but to use AI to find a pattern that offers that customer an in-store deal on the products that he or she buys most. Or to suggest a test that a patient’s doctor should recommend on the next visit (and that Aetna will reimburse), or an extension of the current messaging about bundled services–gym membership or home health tests–that the company has identified from patterns in prescriptions or health insurance claims. (You worried about privacy? We all should be, but I don’t see much effort on this actually coming from our elected governments.)
Emerson Electric (EMR): AI promises to redefine work–how we do tasks, who does a job, when we do a job, how we do a job most efficiently, how we get a service or product from here to there. Emerson Electric is the leader in process manufacturing technology on this side of the Atlantic. It makes hardware and software and sells services to companies that make things. That’s a $200 billion addressable market now. And the coming of AI applications means that Emerson will have more “efficiency” to sell and that the financial gains from those efficiencies will be real and visible to customers. Which is a formula for getting customers to buy. Nothing like being able to show that a purchase was worth it. Emerson isn’t the only way to play this AI opportunity. I’d take a look at Rockwell Automation (ROK) too. But you should consider a play like this on AI for your portfolio.
I’ve got to stop this list somewhere, but if I were to extend it beyond 10 stocks, I’d take a look at Luminar Technologies (LAZR), Mobileye Global (MBLY), Salesforce (CRM), and Twilio (TWLO).