Amazon’s AI business is growing over three times faster in its current stage of evolution than its cloud business did, CEO Andy Jassy said during the Q3 2024 earnings call. “And we felt like AWS grew pretty quickly,” Jassy added, referring to the Amazon Web Services (AWS) cloud-computing division. AWS segment sales increased 19 percent year-over-year to USD 27.5 billion, driven by surging demand for AI this year.
Amazon AI Business Outpaces Cloud Growth
Jassy highlighted that Amazon’s AI business, which is already growing at triple-digit percentages, is viewed as a stack with three macro layers, each representing a giant opportunity and progressing rapidly.
The CEO noted that it is much harder for companies to be successful and competitive in generative AI if their data is not in the cloud. He emphasised that the AWS team continues to make progress in delivering AI capabilities for customers and in building a substantial AI business. “In the last 18 months, AWS has released nearly twice as many machine learning and gen AI features as the other leading cloud providers combined,” he said.
Investments in Custom Silicon
While Amazon has a deep partnership with Nvidia, Jassy pointed out that customers are seeking better price performance for their AI workloads. To address this, Amazon has invested in its own custom silicon: Trainium for training and Inferentia for inference. “The second version of Trainium, Trainium2, is starting to ramp up in the next few weeks and will be very compelling for customers on price performance,” he said.
Amazon also announced that it has recently added Anthropic’s Claude 3.5 Sonnet model, Meta’s Llama 3.2 models, Mistral’s Large 2 models, and several Stability AI models to Amazon Bedrock. The company noted that it is integrating generative AI extensively across its businesses, with hundreds of consumer-facing applications already launched or in development.
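For readers curious about what using these Bedrock-hosted models looks like in practice, the snippet below is a minimal, illustrative sketch (not taken from Amazon’s announcement) of calling a Claude 3.5 Sonnet model through the Bedrock Converse API with the AWS SDK for Python (boto3); the region and model ID shown are assumptions and should be checked against the Bedrock documentation for your account.

```python
# Illustrative sketch: invoking a Bedrock-hosted model via the Converse API.
# Region and model ID are assumptions for demonstration purposes only.
import boto3

# Bedrock runtime client (region_name is an example; use your own region).
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # illustrative model ID
    messages=[
        {
            "role": "user",
            "content": [{"text": "Summarise the key factors to consider when buying a laptop."}],
        }
    ],
    inferenceConfig={"maxTokens": 256, "temperature": 0.2},
)

# The assistant's reply is returned as a list of content blocks.
print(response["output"]["message"]["content"][0]["text"])
```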
AI-Powered Shopping Assistant
“We’ve expanded Rufus, our generative AI-powered expert shopping assistant to the UK, India, Germany, France, Italy, Spain, and Canada,” Jassy stated. “In the US, we’ve added more personalisation, the ability to better narrow customer intent, and real-time pricing and deal information.”
AI Shopping Guides
As reported by TelecomTalk, Amazon has recently introduced AI Shopping Guides for consumers. The company says the feature simplifies product research by using generative AI to bring together the key factors to consider in a product category, making it easier for customers to find the right product for their needs.
For sellers, Amazon said it has launched Project Amelia, an AI assistant that offers tailored business insights to boost productivity and drive seller growth. Additionally, Amazon is continuing to rearchitect the brain of Alexa with a new set of foundation AI models. “We’re increasingly adding more AI into all of our devices,” Jassy noted, highlighting the newly announced Kindle Scribe.
Next-Generation Alexa
Responding to a question about how a next-generation Alexa might look when powered by AI agent capabilities, Jassy explained, “The next generation of these assistants and the Generative AI applications will be better at not just answering questions, summarising, indexing, and aggregating data, but also taking actions. And you can imagine us being pretty good at that with Alexa.”
“The note-taking experience is much more powerful with the new built-in AI-powered notebook, which enables you to quickly summarise pages of notes into concise bullets in a script font that can easily be shared,” he added.
Investments in AI Infrastructure
Amazon CFO Brian Olsavsky commented on the business’s growth and future potential. “The business continues to grow and we see an opportunity to expand our core cloud offering and our AI services. Customers increasingly recognise that to get the true benefit of generative AI, they also need to move to the cloud,” he said.
He added that the company expects to spend approximately USD 75 billion in capital expenditures (CapEx) in 2024, primarily to meet the growing demand for technology infrastructure, especially for AWS and AI services.
“This primarily relates to AWS as we invest to support demand for our AI services while also including technology infrastructure to support our North America and international segments,” the CFO added.
Capital Expenditure Driven by AWS and AI
Jassy suggested that CapEx could exceed USD 75 billion in 2025, driven largely by investments in AWS and generative AI infrastructure. “And the majority of it is for AWS, and specifically, the increased bumps here are really driven by Generative AI,” he said.
“The thing to remember about the AWS business is the cash life cycle is such that the faster we grow demand, the faster we have to invest capital in data centers and networking gear and hardware. And of course, in the hardware of AI, the accelerators or the chips are more expensive than the CPU hardware. And so we invest in all of that upfront in advance of when we can monetise it with customers using the resources,” he explained. However, he emphasised that many of these assets have long useful lives, such as data centers, which remain operational for 20 to 30 years.
AI, a Once-in-a-Lifetime Opportunity
Jassy expects that the company can achieve a very successful return on invested capital through its Generative AI initiatives, noting, “It (AI) is a really unusually large, maybe once-in-a-lifetime type of opportunity. And I think our customers, the business and our shareholders will feel good about this long-term that we’re aggressively pursuing it.” He added that, as the market matures, he expects to see very healthy margins in the generative AI space.
Olsavsky added that AI is going to play a significant role in Amazon’s robotics network, with numerous efforts directed toward that space. “We just hired a number of people from an incredibly strong robotics AI organisation. And I think that will be a very central part of what we do moving forward, too,” he said.
Cloud Demand and Capacity Constraints
Responding to a question about Cloud demand and capacity constraints, Jassy remarked, “I believe we have more demand that we could fulfill if we had even more capacity today. I think pretty much everyone today has less capacity than they have demand for, and it’s really primarily chips that are the area where companies could use more supply. And so we have – we’re growing at a very rapid rate and have grown a pretty big business here in the AI space and it’s early days, but I actually believe that the rate of growth there has a chance to improve over time as we have bigger and bigger capacity.”
Jassy’s comments mirrored those of Microsoft executives, who recently noted that AI demand “continues to exceed available capacity.”
Commenting on the Nvidia partnership, Jassy said, “We have a very deep partnership with Nvidia. We tend to be their lead partner on most of their new chips. We were the first to offer H200s in EC2 instances. And I expect us to have a partnership for a very long time that matters.”