AWS CEO Matt Garman on generative AI, open source, and shutting down services

It was surprising when Adam Selipsky stepped down as CEO of Amazon's cloud computing unit. What was perhaps even more surprising was that Matt Garman succeeded him. Garman joined Amazon as an intern in 2005 and became a full-time employee in 2006, working on early AWS products. Few people know the business better than Garman, whose last position before becoming CEO was senior VP of AWS sales, marketing, and global services.
Garman told me in an interview last week that he hasn’t made any major changes to the organization yet. “Not a ton has changed in the organization. The business is doing quite well, so there’s no need to make major changes to anything we’re focused on,” he said. He did, however, point out a few areas where he thinks the company needs to focus and where he sees opportunities for AWS.
Re-emphasizing startups and rapid innovation
One of those areas, somewhat surprisingly, is startups. “I think as we’ve evolved as an organization. … Early in the life of AWS, we focused a lot on how we really appeal to developers and startups, and we got a lot of early traction there,” he explained. “And then we started looking at how do we appeal to big enterprises, how do we appeal to governments, how do we appeal to regulated industries all around the world? And I think one of the things that I’ve just re-emphasized – it’s not really a change – but it’s also emphasizing that we can’t lose that focus on startups and developers. We have to do all of those things.”
Another area he wants the group to focus on is keeping up with the changes that are happening in the industry right now.
“I’ve been emphasizing with the team just how important it is for us to continue to be relentless about the leadership we have in terms of the set of services and capabilities and features and functions that we have today – and to continue to lean forward and build that roadmap of real innovation,” he said. “I think the reason customers use AWS today is because we have the best and most comprehensive set of services. The reason people lean into us today is because we continue to have, by far, the best security and performance in the industry, and we help them innovate and move faster. And we have to keep pushing on that roadmap of things we’ve done. It’s not really a change, per se, but it’s probably the thing I’ve emphasized the most: how important it is for us to maintain that level of innovation and maintain the speed at which we deliver.”
When I asked him if he thought the company hadn’t been innovating fast enough, he argued that it had. “I think the pace of innovation is only going to accelerate, and that just emphasizes that we have to accelerate our own pace of innovation as well. It’s not that we’re losing ground; it just emphasizes how much we have to keep accelerating, given the pace of technology right now.”
Generative AI at AWS
With the advent of generative AI and how quickly the technology is changing now, AWS also has to be “on the cutting edge of all of that,” he said.
Shortly after the launch of ChatGPT, many pundits questioned whether AWS was too late to introduce generative AI tools and had left the opportunity to competitors like Google Cloud and Microsoft Azure. But Garman thinks this was more perception than reality. He noted that AWS has long offered successful machine learning tools like SageMaker, even before generative AI became a buzzword. He also noted that the company has taken a more deliberate approach to generative AI than some of its competitors.
“We’d been looking at generative AI before it became a widely accepted thing, but I will say that when ChatGPT came out, there was kind of a discovery of a new area, of ways that this technology could be applied. And I think everybody was excited and got energized by it, right? … I think a bunch of people – our competitors – kind of raced to put chatbots on top of everything and show that they were at the forefront of generative AI,” he said.
I think a bunch of people – our competitors – kind of raced to put chatbots on top of everything and show that they were at the forefront of generative AI.
Instead, Garman said, the AWS team wanted to step back and look at how its customers, whether startups or enterprises, could best integrate this technology into their applications and use their own differentiated data to do so. “They’re going to want a platform that they can actually build on top of and think of it as a platform to build on, as opposed to an application that they just use. And so we took the time to build that platform,” he said.
For AWS, that platform is Bedrock, which offers access to a wide variety of open and proprietary models. Just doing that – and allowing users to combine different models – was a bit controversial at the time, he said. “But for us, we thought that’s probably where the world goes, and now it’s kind of a foregone conclusion that that’s where the world goes,” he said. He said he thinks everyone will want customized models and to bring their own data to them.
Bedrock, Garman said, “is growing like a weed right now.”
One problem around generative AI that he still wants to solve, however, is price. “A lot of that comes down to our custom silicon and other model changes to make the inference that you’re going to build into your applications [something] much more affordable.”
AWS’ next generation of custom Trainium chips, which the company debuted at its re:Invent conference in late 2023, will launch later this year, Garman said. “I’m really excited that we can really bend that cost curve and start delivering real value to customers.”
One area where AWS hasn’t necessarily tried to compete with the other technology giants is in building its own large language models. When I asked Garman about that, he noted that those are still something the company is “very focused on.” He thinks it’s important for AWS to have first-party models, while also continuing to lean into third-party models. But he also wants to make sure that AWS’s own models can add unique value and differentiate, either by using its own data or “in other areas where we see opportunity.”
Among those areas of opportunity is cost, but also agents, which everybody in the industry seems to be focused on right now. “Having models that are reliable, at a very high level of correctness, and able to go out and call APIs and go do things, that’s an area where I think there’s some innovation that can be done there,” Garman said. Agents, he says, will unlock a lot more utility from generative AI by automating processes on behalf of their users.
Q, an AI-powered assistant
At its recent re:Invent conference, AWS also introduced Q, its AI-powered assistant. Currently, there are two flavors of it: Q Developer and Q Business.
Q Developer integrates with many of the most popular development environments and, among other things, offers code completion and tooling for modernizing Java applications.
“We really think of Q Developer as a broader concept to really help with the entire developer lifecycle,” Garman said. “I think a lot of the early developer tools were very focused on coding, and we’re thinking more about how do we help with all the pain and hard stuff for developers to do?”
Inside Amazon, teams have used Q Developer to update 30,000 Java applications, saving $260 million and 4,500 developer years in the process, Garman said.
Q Business uses much of the same technology under the hood, but its focus is on aggregating internal company data from a wide variety of sources and making that searchable through a ChatGPT-like question-and-answer service. The company is “seeing real traction there,” Garman said.
Shutting down services
Although Garman notes that not much has changed under his leadership, one thing that has happened recently at AWS is that the company announced plans to shut down some of its services. It’s not something AWS has done often, but this summer, it announced plans to close services like its web-based Cloud9 IDE, its GitHub competitor CodeCommit, CloudSearch, and others.
“It’s a little bit of a cleanup exercise where we looked at a bunch of these services, where, frankly, we’ve either introduced a better service that people should move to, or we launched one that we just didn’t get right,” he explained. “We looked at it and said, ‘You know what? The partner ecosystem has a better solution out there and we’re just going to lean into that.’ You can’t invest in everything. We don’t like to do that. We take it seriously if companies are going to bet on us supporting things for the long term.”
AWS and the open source ecosystem
Another relationship that has long been difficult for AWS – or at least perceived to be difficult – is with the open source ecosystem. That’s changing, and in the past few weeks, AWS has contributed its OpenSearch code to the Linux Foundation and the newly formed OpenSearch Foundation.
We love open source. We depend on open source. I think we try to take advantage of the open source community and contribute significantly back to the open source community.
“I think our view is pretty straightforward,” Garman said when I asked him how he thinks about the relationship between AWS and open source going forward. “We love open source. We depend on open source. I think we try to take advantage of the open source community and contribute significantly back to the open source community. I think that’s the whole point of open source – benefiting from the community – so that’s something we take seriously.”
He noted that AWS has made significant investments in open source and has open sourced many of its projects.
“A lot of the controversy has come from companies that started open source projects and then decided to take them closed source, which I think is their right to do. But you know, that’s not really the spirit of open source. And so whenever we see people doing that – take Elastic as an example of that – OpenSearch [AWS’s ElasticSearch fork] has been quite popular. … If there’s a Linux [Foundation] project or an Apache project or wherever we can lean in, we want to lean in; we contribute to them. I think we’ve evolved and learned as an organization how to be a good steward in that community, and I hope that’s been recognized by others.”