2 Artificial Intelligence (AI) Stocks to Buy With $1,000 and Hold for Decades
Last week, Oracle Chairman Larry Ellison offered investors some fresh insights into the current state of artificial intelligence (AI). He said there was no slowdown in sight in business spending on AI development. Indeed, he thinks the industry will expand significantly for at least the next 10 years.
Oracle currently has 162 data centers online and under construction, but Ellison thinks that number could reach 2,000 over the long term.
If he's right, this might be a fantastic time to buy AI stocks, especially after the recent sell-off across the technology sector. Investors with $1,000 to put to work now might want to split it equally between shares of chip giant Micron Technology and cloud computing company DigitalOcean.
AI is trained and powered by data center servers that use powerful graphics processing units (GPUs), most of which have been supplied by Nvidia. Memory chips are complementary to GPUs because they store information in a ready state so it can be called upon instantly, which is particularly critical in data-intensive AI workloads.
Micron's HBM3e was selected by Nvidia for inclusion in its new H200 GPU. That high-bandwidth memory architecture offers several benefits over previous generations of chips, including a higher capacity and a smaller physical footprint. Plus, Micron's HBM3e is 30% more energy efficient than competing chips, and electricity consumption is a key consideration for data center operators when selecting hardware.
Micron recently told investors it had already sold all of the data center memory it would be able to deliver in 2024 and 2025, which isn't surprising, given that demand for Nvidia's GPUs is significantly outstripping supply. Some industry sources have said Micron is gearing up to manufacture a new 12-layer HBM3e solution, implying a 50% increase in capacity, which would be ideal for Nvidia's upcoming Blackwell-based GPUs like the B200. So investors should not expect demand for Micron's high-bandwidth memory to slow anytime soon.
However, Micron's opportunities extend beyond the data center, because some AI workloads are shifting to personal computers and mobile devices, thanks to increasingly powerful neural processing units (NPUs) from the likes of Apple and Advanced Micro Devices. Micron says AI-enabled smartphones, for example, require up to twice as much memory capacity as their predecessors. That will be a direct tailwind for the company's revenue.
In its fiscal 2024 third quarter, which ended May 30, Micron delivered an 85% increase in its data center revenue and a whopping 94% increase in its mobile segment revenue, thanks to AI demand. The company will report its final results for fiscal 2024 at the end of September, and analysts expect to see $25 billion in total revenue, a 61% jump from fiscal 2023. Wall Street's early forecast for fiscal 2025 points to $38.8 billion in revenue, representing further growth of 53%.
But it gets better. Analysts predict Micron could generate $9.41 per share in earnings in fiscal 2025, placing its stock at a forward price to earnings (P/E) ratio of just 9.6. That means Micron stock would have to triple to trade in line with Nvidia stock, which currently has a forward P/E ratio of 29.1. Considering how closely linked the two companies are, Micron looks like a great value at the current price for investors willing to hold on to the stock for the long term.
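For readers who want to check where the "triple" comes from, here is a quick back-of-the-envelope sketch using only the figures cited above. The implied share prices are rough estimates derived from those figures (forward P/E is simply share price divided by forward earnings per share), not market quotes.

```python
# Back-of-the-envelope check of the Micron vs. Nvidia forward P/E comparison.
# Inputs come from the article; the implied prices are rough estimates, not quotes.

micron_forward_eps = 9.41   # analysts' fiscal 2025 earnings-per-share estimate
micron_forward_pe = 9.6     # Micron's forward P/E cited above
nvidia_forward_pe = 29.1    # Nvidia's forward P/E cited above

# Forward P/E = share price / forward EPS, so price = forward P/E * forward EPS.
implied_micron_price = micron_forward_pe * micron_forward_eps        # roughly $90
price_at_nvidia_multiple = nvidia_forward_pe * micron_forward_eps    # roughly $274

rerating_multiple = price_at_nvidia_multiple / implied_micron_price  # about 3.0x
print(f"Implied Micron price today: ~${implied_micron_price:.0f}")
print(f"Price at Nvidia's multiple: ~${price_at_nvidia_multiple:.0f}")
print(f"That works out to roughly a {rerating_multiple:.1f}x move, i.e. a triple")
```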
Circling back to Ellison's comments, the race to build data centers is being spurred by rising demand for AI cloud services. Providers like Amazon Web Services and Microsoft Azure are leaders in that space, but they primarily target large organizations with deep pockets. DigitalOcean, on the other hand, has carved out a lucrative niche by serving small and mid-sized businesses (SMBs).
DigitalOcean's initial success stemmed from a portfolio of cloud services such as data storage, web hosting, video streaming, and software development tools. It offers cheap and transparent pricing, a simple dashboard with one-click deployment features, and personalized support, all of which are important for SMBs.
However, last year, DigitalOcean acquired Paperspace, which operates several data centers with a range of GPU options designed for AI developers. DigitalOcean recently announced it will allow customers to access fractional GPU capacity on demand, which has never been done before. In other words, SMBs can use between one and eight Nvidia H100 GPUs to power their AI workloads, a tiny scale that cloud leaders simply won't offer because it isn't cost-efficient for them.
We already know that Paperspace can be up to 70% cheaper than equivalent services from Microsoft Azure, for example, because it offers per-second billing with no lock-in contracts. Plus, the company has a lean cost structure thanks to its narrow portfolio of services, so customers aren't paying for things they may never use. Combined with DigitalOcean's new fractional GPU offering, those advantages mean even the smallest businesses can start deploying AI.
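To illustrate why per-second billing matters for small AI jobs, here is a simple, purely hypothetical comparison. The hourly rate below is a made-up placeholder, not actual Paperspace or Azure pricing, and the full-hour competitor is a generic stand-in rather than any specific provider.

```python
import math

# Hypothetical comparison of per-second vs. full-hour GPU billing.
# The $2.40/hour rate is a placeholder, not a real provider's price.
HOURLY_RATE = 2.40          # assumed cost of one GPU instance per hour
JOB_SECONDS = 20 * 60       # a short 20-minute experiment or fine-tuning run

# Per-second billing: pay only for the seconds actually used.
per_second_cost = HOURLY_RATE / 3600 * JOB_SECONDS            # $0.80

# Full-hour billing: the same job gets rounded up to a whole hour.
full_hour_cost = HOURLY_RATE * math.ceil(JOB_SECONDS / 3600)  # $2.40

print(f"Per-second billing: ${per_second_cost:.2f}")
print(f"Full-hour billing:  ${full_hour_cost:.2f}")
```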
DigitalOcean generated $192.5 million in revenue during the second quarter, a 13% increase from the prior-year period. However, its revenue attributable to AI soared by a whopping 200%, suggesting SMBs are latching on to the company's new services at a rapid pace.