Interviewed Person: Lars Grammel
Hi everyone. I'm Lars. I work at Vercel on the AI SDK, and today I want to talk a little bit about AI SDK 5: how the overall landscape has changed, what the big breaking point was, why we developed AI SDK 5 and why it's such a big change. I'll go a little into the details of how AI SDK 5 is different from AI SDK 4, and then also look at upcoming features. So in the last two
years, LLMs changed and AI applications changed. If you think two years back, LLMs essentially just generated text. Then tool calls came in, and they got increasingly sophisticated: models started generating images, there was image input and audio input, and suddenly the models were full-on multimodal. MCPs came out, and we had thousands of tools. The applications, as a result, also became more sophisticated.
You want to generate components that you render in the UI. You don't just want a simple chat anymore; you want sub-agents and a lot of other things. And so for the AI SDK (as shown in the previous talk, by the way, I really love seeing things that are built on the AI SDK, and I think React Native AI is one of the most amazing things I've seen in terms of providers, so a big thank you to Callstack for doing that, I can't wait to play with it, this is amazing). So the AI SDK
abstracts providers, and if you add features every month, as we did with very incremental development, eventually you have a problem. We got stuck in our development because we kept adding features. To give you an example: DeepSeek reasoning came out, I think it was in January, so we added DeepSeek reasoning. We had to add reasoning to our model specification so the models can send
reasoning information. DeepSeek did it in a very simple way: they just send reasoning tokens and send you reasoning text. Nothing fancy, just what you would expect. A month later, Anthropic introduces reasoning. But they don't just introduce reasoning, they sign it. They have redacted parts that are also signed, and you have to send that information back to Anthropic exactly as you received it or they reject the request. And so we baked that in, because we wanted to move there relatively fast.
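The contrast between the two reasoning formats can be sketched roughly like this. These are illustrative types I'm making up for the example, not the actual AI SDK model specification:

```typescript
// Illustrative sketch of provider-specific reasoning shapes leaking
// into a shared abstraction (hypothetical types, not the real spec).

// DeepSeek-style reasoning: just text.
type PlainReasoning = { type: "reasoning"; text: string };

// Anthropic-style reasoning: carries a signature that must be echoed
// back verbatim on the next request, or the API rejects it.
type SignedReasoning = { type: "reasoning"; text: string; signature: string };

// Anthropic redacted reasoning: opaque data, nothing renderable,
// but it must still be round-tripped back to the provider.
type RedactedReasoning = { type: "redacted-reasoning"; data: string };

type ReasoningPart = PlainReasoning | SignedReasoning | RedactedReasoning;

// Even a consumer that only wants display text now has to know
// about every provider-specific variant:
function reasoningText(part: ReasoningPart): string {
  switch (part.type) {
    case "reasoning":
      return part.text;
    case "redacted-reasoning":
      return ""; // no renderable text, but the data cannot be dropped
  }
}
```

The point of the sketch is the leak: the `signature` and `redacted-reasoning` cases exist only because of one provider, yet every consumer of the abstraction has to handle them.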
And it worked, but it was ugly, because we essentially started bleeding details of the Anthropic API into our abstraction. And then a month later, OpenAI releases reasoning as well, in this case as part of their computer use agent. We looked at it, and it was close to impossible to support, because when you abstract things (and this is what the whole foundation of the AI SDK is about), when you abstract things
you ideally want several examples, three to five at a minimum. But in the AI space we always had to follow very quickly: a new feature comes out and everybody wants reasoning the next day. And so you get stuck. Sometime in late March or early April we realized we were stuck, not just on the specification that we had been adding onto for a year and a half, but also on the UI side. Initially it was just text that you render in a chat, then people wanted custom data, so we
gave them custom data parts, but then it got more and more complex. They had different tool calls, and it was all untyped on the frontend, and so you end up with a lot of spaghetti code on the frontend. Your apps get hard to maintain, and the harder an app is to maintain, the less sophisticated it can get, and so on and so forth. And so we decided we need something better. We need a bit of a break to enable you to build really cool AI applications for the web with all the