Research — Dec 12, 2024

GenAI breakthroughs and bottlenecks

MediaTalk host Mike Reynolds and fellow S&P Global Market Intelligence reporter Iuri Struta discuss the future of technology and innovation, exploring key insights from the recent Web Summit in Lisbon, Portugal. The summit brought together over 70,000 attendees and covered the evolving landscape of AI, humanoid robots and the competitive dynamics of the tech industry. Struta shares his observations on the open-source versus closed-source debate in AI, the surge of startups challenging established players like Google, and the exciting developments in robotics. He also highlights the importance of data integration in maximizing AI's potential across sectors.

Featured experts:

Iuri Struta, S&P Global Market Intelligence reporter covering tech M&A and capital markets

Explore our full library of S&P Global Market Intelligence podcasts.

RELATED RESEARCH:

GenAI funding on track to set new record in 2024

Perplexity AI aims to move search past links to provide answers

An edited transcript follows below.

Mike Reynolds: Hi, I'm Mike Reynolds, a senior reporter covering the media industry with S&P Global Market Intelligence's tech, media and telecom news team. Welcome to "MediaTalk," a podcast hosted by S&P Global, wherein the news and research staff explore issues in the ever-evolving media landscape. Today, I'm joined by colleague Iuri Struta. Iuri is an S&P Global Market Intelligence reporter covering tech M&A and capital markets who's going to dive into some happenings in the world of AI, including humanoid robots. That's right, humanoid robots.

How are you doing, Iuri?

Iuri Struta: Doing great, Mike. Thanks for having me.

Reynolds: Iuri recently attended the Web Summit, an important conference in Lisbon that brings together AI, tech and aspirant web companies, other startups, and commerce executives. Kind of right up your coverage alley, huh, Iuri? Tell us about your experiences in Portugal.

Struta: Yeah, I had a great time — sunny, hot weather, unlike here in London. The conference is huge. There were more than 70,000 people attending, so I'd say there was definitely not enough time to explore everything.

Reynolds: Let's start with the open-source versus closed-source debate. Was sentiment among attendees leaning in one direction in Lisbon?

Struta: Yeah, I think this debate has been going on basically since the operating system was invented. You had Windows versus Linux: Linux open-source, Windows closed-source. Then it moved into mobile, with Android open-source and Apple closed-source, and now you have the same thing in AI. Previously, it was not a very big issue. Some people now feel that open-source AI can lead to bad outcomes or be used by malicious actors. On the other hand, open source leads to quicker development of AI and GenAI systems of every kind, because everyone who does not have the resources to build a big foundation model can get access to one and develop on top of it. And there appears to be more evidence suggesting that bad actors are actually using closed-source models more. Of course, at the startup-dominated Web Summit, most people were in favor of open source. But the interesting thing is, if you take it from the side of the foundation model providers, or the business side, open-sourcing your AI model is a way to compete in this highly competitive market. It's a way to differentiate. You probably have around, what, six or seven frontier foundation models? And two of them are open source, Mistral AI SAS and Meta Platforms Inc.'s Llama. This is how they are competing, basically. They're attracting more people to their systems.

Reynolds: Okay. But at the moment, there isn't a single foundation model that's leading the way, per se?

Struta: Yeah, that's right. I spoke with about two dozen startups in Lisbon, and of course there was no clear indication of which foundation model is most used. But let's say that startups might start with OpenAI LLC's model. They get access to its API because it's the easiest and quickest to adopt, and arguably one of the best and most competitive. But then, I think, as the company or the startup progresses, they could try to switch to something they can fine-tune. That can be a Llama or a Mistral. This is where it basically becomes interesting, because when a startup can fine-tune the model, it can target a specific use case or vertical and try to differentiate itself.

But the other thing is, many of them told me they relied on one or two foundation models, while the startups that are more mature and have more resources would typically provide access to all of them. They're basically saying, we let the customer decide which one to use. And some of them could have their own, fine-tuned on Llama or Mistral. In the business world, they would call themselves model agnostic. You have Jasper AI Inc., one of the first AI marketing tools to launch, and this is what they are: they use most foundation models and let the customer decide. You have Perplexity AI Inc. and other online search tools that do the same, allowing customers to choose between different foundation models.
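
To make the model-agnostic setup Struta describes a bit more concrete, here is a minimal sketch in Python. The provider classes, the registry and the `complete` interface are hypothetical stand-ins for illustration, not any vendor's actual SDK; the point is only that the customer-facing product can route the same request to whichever backing model the customer selects.

```python
# Minimal sketch of a "model-agnostic" product layer, as described above.
# Provider names and client classes here are hypothetical stand-ins, not real SDKs.
from abc import ABC, abstractmethod


class FoundationModelClient(ABC):
    """Common interface the product codes against, regardless of vendor."""

    @abstractmethod
    def complete(self, prompt: str) -> str: ...


class HostedProprietaryModel(FoundationModelClient):
    def complete(self, prompt: str) -> str:
        # In practice this would call a closed-source vendor's API.
        return f"[proprietary model answer to: {prompt}]"


class FineTunedOpenModel(FoundationModelClient):
    def __init__(self, base: str):
        self.base = base  # e.g., a Llama- or Mistral-family checkpoint

    def complete(self, prompt: str) -> str:
        # In practice this would run an in-house fine-tuned open-weights model.
        return f"[{self.base} fine-tune answer to: {prompt}]"


# The customer picks the backend; the product logic does not change.
REGISTRY: dict[str, FoundationModelClient] = {
    "proprietary": HostedProprietaryModel(),
    "open-finetune": FineTunedOpenModel(base="open-weights-base"),
}


def answer(customer_choice: str, question: str) -> str:
    return REGISTRY[customer_choice].complete(question)


if __name__ == "__main__":
    print(answer("open-finetune", "Summarize this quarter's pipeline."))
```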

Reynolds: I think that makes sense, right? You want to be able to work with what the customer wants. The customer's always right, as Stew Leonard and many others have said.

A lot of activity in the search realm, Iuri?

Struta: Yeah, I think competition in search is seeing a renaissance after years of being dominated by Google LLC, when essentially no one dared to venture into this market. So you have a bunch of startups now coming in with this AI wave, and they're having early success. Google, of course, still dominates, and it's still unclear whether the startups are going to take market share from Google. But you are seeing a broad feeling in the [venture capital] industry and the startup community that this search business is not as secure as it was. And I think even Google feels that as well. You have the AI startups that are attacking, and you also have the [US Justice Department] accusing them of running a monopoly. Google has even made changes to its search, adding features after seeing that startups like Perplexity, SUSI are having success with online search tools where, instead of 10 blue links, you get a text answer directly in your window.

But I think another area where search is very interesting, and a market that is in its infancy, is enterprise search. For many years, there were basically no good search solutions for the enterprise. Now a lot of them are cropping up, and it's AI that has basically made this possible. I met with two enterprise search startups in Lisbon, and both of them were already boasting large customers in their countries of origin, Italy and Germany. I think this is a market to watch, because as enterprise data grows exponentially, you obviously need solutions that get you to the right thing very quickly, and enterprise search tools offer that. You have Glean Technologies Inc.; they recently raised about $200 million, and they are one of the leaders in the space. Then you have Canada-based Coveo, and there are probably a lot more enterprise search startups. At a certain point, there is definitely going to be a wave of consolidation among them.
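
As a rough illustration of why AI has unlocked enterprise search, here is a toy Python sketch of the retrieval pattern these tools generally rely on: documents and queries are turned into vectors and ranked by similarity rather than exact keyword matching. The bag-of-words "embedding" below is a deliberately crude, hypothetical stand-in for a real embedding model, and the documents are invented; this is not any named vendor's implementation.

```python
# Toy sketch of similarity-based enterprise search: embed documents and queries,
# then rank by cosine similarity. A real system would use a learned embedding
# model; the bag-of-words embedding here is only a crude placeholder.
import math
from collections import Counter


def embed(text: str) -> Counter:
    # Hypothetical stand-in for calling an embedding model.
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


DOCUMENTS = [
    "Q3 sales pipeline review for the Germany region",
    "Employee handbook: vacation and leave policy",
    "Architecture notes for the data ingestion service",
]


def search(query: str, top_k: int = 2) -> list[str]:
    q = embed(query)
    ranked = sorted(DOCUMENTS, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:top_k]


if __name__ == "__main__":
    print(search("how much vacation do employees get"))
```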

Reynolds: That sounds like a place you're going to be keeping your eye on as we go into '25.

Struta: Absolutely, yeah.

Reynolds: Let's get to the robots. I take it this is not Rosie from "The Jetsons". She was animated metal with a lot of rivets. But what are we talking about here, Iuri?

Struta: I think with robotics, just as with search, there is a growing belief that AI is going to animate the industry. For instance, at the Lisbon conference, Max Tegmark, an AI researcher at MIT who has worked in this area for many years, was saying that what was holding robotics back — and I'm talking about humanoid robots.

Reynolds: Right.

Struta: Robots you can speak to. What was holding them back was their intelligence; it was not the actuators or the parts that make them move. Obviously, they still can't move the way a human moves, but that was not the main drawback. The main drawback was the intelligence in them. So with AI, you have that intelligence now, or intelligence has taken a leap, and if you combine that with robotics, you could have pretty spectacular results. But we're still very early, and everything is at the experimentation phase. It could develop pretty quickly, but it could also not; we just have to see. And I'm talking in terms of having robots helping you around the house. You have a lot of excitement there and a lot of money being poured in, with Figure AI raising, I think, about $600 million recently. Elon Musk's Tesla Inc. is making a big bet on robots. So, yeah, that is definitely another area to watch.

Reynolds: You saw some of the prototypes or what have you out on the floor there. Are any of these things ready to come to market in '25 or at least some in '25 or '26?

Struta: Yeah. So we saw a robot from Agility Robotics Inc. called Digit. This was a humanoid robot, which is what I mean when I say that. They showcased it for us: it was able to pick up clothes and put them in a basket after receiving an instruction. So you would tell it, "Take the red shirt and put it in the basket," and it would do that. Then it would take the white shirt or the white trousers and put them in the basket as well. Agility has been in the business for a long while, and its robots are designed for commercial purposes like warehouses. We're still not there yet in terms of having humanoid robots, as I said, helping with chores around the house, but I think what we're seeing now is promising, with a lot of startups in this area and even more mature companies getting in.
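
For readers curious what the instruction-following loop Struta describes might look like in code, here is a toy Python sketch, assuming a hypothetical parser and action type. Real humanoid robots use learned vision-language-action models and motion planners rather than string matching; this only illustrates the shape of the task: a natural-language command grounded into a structured pick-and-place action.

```python
# Toy sketch of instruction following: parse a natural-language command into a
# structured pick-and-place action. All names here are hypothetical; a real
# robot would use learned models, perception and motion planning instead.
from dataclasses import dataclass

KNOWN_ITEMS = {"red shirt", "white shirt", "white trousers"}
KNOWN_TARGETS = {"basket"}


@dataclass
class PickAndPlace:
    item: str
    target: str


def parse_instruction(text: str) -> PickAndPlace:
    lowered = text.lower()
    item = next((i for i in KNOWN_ITEMS if i in lowered), None)
    target = next((t for t in KNOWN_TARGETS if t in lowered), None)
    if item is None or target is None:
        raise ValueError(f"Could not ground instruction: {text!r}")
    return PickAndPlace(item=item, target=target)


def execute(action: PickAndPlace) -> None:
    # A real system would plan motions and drive actuators here.
    print(f"Picking up the {action.item} and placing it in the {action.target}.")


if __name__ == "__main__":
    execute(parse_instruction("Take the red shirt and put it in the basket"))
```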

Reynolds: All right, this is maybe an aside. Did you say Gidget?

Struta: Digit, yeah.

Reynolds: Oh, Digit. I just said Gidget, and I was gonna say the robot looks like Sally Field? Anyway, that's another matter. Let's move on there.

Now we're gonna go from robots to wrappers. And no, we're not talking Jay-Z or Eminem or Nicki Minaj, but rather AI wrappers. That's wrappers with a W. Please explain, Iuri.

Struta: Yeah, so I think this wrappers thing is mostly discussed in VC and investment banking circles. Every VC investing in AI will tell you they want to buy a company with true AI capabilities, not an AI wrapper, which is basically a company that uses a third-party model and adds some additional functionality — maybe some prompt engineering of some sort — and then targets a specific use case or vertical. By that token, most AI companies that do not have their own foundation model can be considered AI wrappers. So the difference between AI wrappers lies in what you're adding on top of the AI model that you're using: how unique it is and how easy it is to replicate. If it's easy to replicate, then you probably don't have a business, a VC will tell you. But if it's hard to replicate and it's useful for customers, then that is an AI wrapper that could work.

For instance, if you look at S&P Global, we have a lot of data that's unique to the company [S&P], unique to us. Data of that quality is not available elsewhere, and it's in an easy-to-read, machine-readable format, and so on. So you can add an AI layer and you've enhanced your product, and no one, or very few companies, can do that. In the startup world, obviously, it's very hard to come up with data that's truly unique unless you're teaming up with a third-party provider, like healthcare institutions that don't know tech very well but have a lot of data. You team up with them on one hand, you take in a foundation model from a third party, and you can have an interesting business there. But a startup could also differentiate by other means: speed of deployment, go-to-market strategy. There is a lot of talk now of companies launching as — and I guess this term is going to take off — an "AI native" company. When you are an AI-native company, you're more nimble, you're easier to integrate with other systems in the enterprise, and so on.
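
As a rough sketch of what "adding something hard to replicate on top of a third-party model" can look like, here is a minimal Python example. The model call, the proprietary data store and all names are hypothetical placeholders rather than any specific vendor's API; the value-add the passage describes sits in the tailored prompt and the proprietary context, not in the model itself.

```python
# Minimal sketch of an "AI wrapper" for a specific vertical: a thin layer that
# combines a third-party foundation model with proprietary data and a tailored
# prompt. The model client and the data store are hypothetical stubs.
from dataclasses import dataclass


@dataclass
class ProprietaryDataStore:
    """Stand-in for the hard-to-replicate asset: unique, machine-readable data."""
    records: dict[str, str]

    def lookup(self, entity: str) -> str:
        return self.records.get(entity, "no proprietary record found")


def call_foundation_model(prompt: str) -> str:
    # Hypothetical stub; in practice this would call a third-party model API.
    return f"[model answer grounded in prompt of {len(prompt)} chars]"


def vertical_assistant(question: str, entity: str, store: ProprietaryDataStore) -> str:
    # The wrapper's value-add: prompt engineering plus proprietary context.
    context = store.lookup(entity)
    prompt = (
        "You are an analyst assistant for a credit-research vertical.\n"
        f"Proprietary context on {entity}: {context}\n"
        f"Question: {question}\nAnswer concisely."
    )
    return call_foundation_model(prompt)


if __name__ == "__main__":
    store = ProprietaryDataStore(records={"Acme Corp": "leverage ratio 3.1x, stable outlook"})
    print(vertical_assistant("How leveraged is this issuer?", "Acme Corp", store))
```

Whether such a wrapper is a durable business, by Struta's test, comes down to how hard the data store and the workflow around it are to replicate.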

Reynolds: So AI everywhere, or almost everywhere. But as the world goes this way, there are still some bottlenecks, as you say. We've heard this often before — there are not enough chips at this moment to make everybody happy?

Struta: Yes, we don't have enough chips. We don't have enough AI systems. The key to AI chips is just two companies: NVIDIA Corp. and Taiwan Semiconductor Manufacturing Co. Ltd. NVIDIA does the design, and TSMC does the manufacturing. And here is where the main bottleneck is. TSMC, for example, has the capacity to make the chips themselves, as many as you like. The smartphone market is pretty slow, so they can repurpose some of that capacity to make AI chips, and this is what they're doing. So the bottleneck is basically not printing the circuits on the silicon wafer. What they do not have is capacity in advanced packaging, which is basically where you take multiple chips and connect them into a single package, then add memory on top and other things, networking chips, and sell them as a system, not as a chip. TSMC has said it will double advanced packaging capacity this year and double it again next year, but it still believes that's not going to be enough. What I would note here is that for TSMC, this could be a convenient bottleneck. We know the chip industry is prone to pretty damaging boom-bust cycles, so they may be trying to spread the manufacturing of these chips over multiple years, even though demand is insane. They don't want to expand capacity rapidly, only to find themselves in two years with a lot of idle factories, forced to sell capacity on the cheap. And the fact that they are essentially the only game in town when it comes to manufacturing helps them a lot with this convenient bottleneck, let's say.

Reynolds: On top of that, there's an energy shortage?

Struta: Yeah. Basically, an AI datacenter with NVIDIA Hopper systems, the previous generation, consumes 10 times more energy than a traditional datacenter running on CPUs, and Blackwell, the next generation of AI systems from NVIDIA, will consume something like two to three times more than the previous generation. With the speedy buildup, you're obviously going to have an energy shortage. Nobody was prepared for that. Energy consumption has basically been flat over the past many years; there's been no growth. I'm talking about the Western markets, the US and Europe. There is a nuclear energy revival now with AI demand, but our analysts believe the demand will mostly be met with natural gas over the next three to four years — I'm talking about the US — and nuclear is probably a 2030 solution at this point. Natural gas, if you look, there is plenty of it. So it's not a natural gas shortage per se; one of the issues is the transmission of energy. This is where the bottleneck lies. You can't just build new transmission lines quickly because of the bureaucracy you have to deal with: permitting issues, NIMBYs, all sorts of things. Even though our S&P Global Ratings analysts say that building the infrastructure to meet this new demand will cost $15 billion — which is not a lot considering that the hyperscalers are throwing $1 trillion at building datacenters over the next four years — it's the bureaucracy. It's very hard because you have to go and speak with people about where you're going to put new transmission lines, and people say, "Okay, fine, you do it," but not in my backyard, basically. So yeah, it takes a lot of time.
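
To put the multiples Struta cites in rough perspective, here is a small illustrative calculation in Python. The 10 MW baseline for a traditional CPU datacenter is an assumed figure chosen purely for illustration; only the 10x and roughly 2x to 3x multiples come from the discussion above.

```python
# Illustrative arithmetic only: the baseline is assumed; the 10x and 2x-3x
# multiples are the ones cited in the discussion above.
baseline_cpu_dc_mw = 10  # assumed draw of a traditional CPU datacenter, in MW
hopper_dc_mw = baseline_cpu_dc_mw * 10          # ~10x a CPU datacenter
blackwell_dc_mw = (hopper_dc_mw * 2, hopper_dc_mw * 3)  # ~2x-3x the Hopper generation

print(f"CPU datacenter: ~{baseline_cpu_dc_mw} MW (assumed)")
print(f"Hopper-based AI datacenter: ~{hopper_dc_mw} MW")
print(f"Blackwell-based AI datacenter: ~{blackwell_dc_mw[0]}-{blackwell_dc_mw[1]} MW")
```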

Reynolds: So those dynamics aside, Iuri, there are some experts in the field, like Dario Amodei of Anthropic and Ilya Sutskever of Safe Superintelligence, who believe that foundation models may have already peaked in terms of scaling, and that the rate of improvement in the next models may not be as high. If that's the case, does this risk slowing down spending across the value chain, across energy, software and chips?

Struta: Yeah, that's a really good question, but I don't think so. We haven't even completely integrated what we have now. We haven't reached the level where we know everything we can do with the current foundation models, so we're still experimenting and seeing their capabilities. And I think if you integrate these models into your enterprise, that is going to be much more useful than having just a chatbot, so I don't think spending will stop. I think we need a few years just to see where this goes.

Reynolds: At the end of the day here, it's almost always about the data. The phrase "data is the new oil" is taking on a different meaning in the AI realm. Do you believe that, Iuri?

Struta: Yes. Data is a key element that lets you differentiate in this world of commoditized AI models, and of course, it's a bottleneck. It's easy for any company to call an API and have a ChatGPT-style tool launched across the organization. But the real value in AI, I think, is in how you integrate the models into your systems and different organizational functions, like HR or finance or leadership, things like that. And that is easier said than done, because you need a lot of preparation before you launch an AI model, and you need to invest in infrastructure first: data capture, data management tools, preparation, physical infrastructure, and so on. This is what we are seeing now: the priority given to on-premises and cloud infrastructure is going up, while AI technologies themselves are going down in priority. They're still expanding, but a lot of organizations have tried to implement AI technologies and figured out, "Oh, we're not yet prepared for that." So they need to go back to basics and start investing more in cloud infrastructure, data management tools and on-premises infrastructure, finding new ways and technologies to capture the data, to transmit it, or to prepare it for future AI models.

Reynolds: Let's go back to Portugal. What were your big takeaways from the Web Summit? What did you learn there about GenAI, Iuri?

Struta: People are very excited about this new wave. There were a lot of GenAI startups, I think about 1,000 of them there, and that was one of the hottest topics being talked about. Other previously hot themes, like fintech, were not talked about very much. So yeah, that's interesting.

Reynolds: That concludes this episode of "MediaTalk." I wanted to thank our colleague at S&P Market Intelligence, Iuri Struta, for taking us through some of the latest in the worlds of AI, humanoid robots and tech investment. This is Mike Reynolds. Thanks to all of you for listening. We'll catch up soon on the next edition of "MediaTalk."

This article was published by S&P Global Market Intelligence and not by S&P Global Ratings, which is a separately managed division of S&P Global.