Channel
Interviewed Person
Theo Browne (t3dotgg)
Vercel just shipped. A lot. Queues, better pricing, captchas, and more. This is a big one... Thank you WorkOS for sponsoring! Check them out at: https://soydev.link/workos SOURCES https://vercel.com/changelog/vercel-queues-is-now-in-limited-beta https://vercel.com/blog/introducing-botid https://vercel.com/blog/introducing-active-cpu-pricing-for-fluid-compute Want to sponsor a video? Learn more here: https://soydev.link/sponsor-me Check out my Twitch, Twitter, Discord more at https://t3.gg S/O Ph4se0n3 for the awesome edit 🙏

Theo - t3.gg
Interviewed: Theo Browne (t3dotgg)
It happened. Vercel actually shipped. I feel like it's been a while since Vercel dropped a bunch of things at once, like many years by now. The last two Next.js Confs were relatively light in terms of new features, both for Next and for Vercel. That changed today. They dropped a ton of things I've been personally waiting on forever, from a new pricing model to queues to a way to run code that's sandboxed from your users to even handling captchas properly. It's kind of nuts how much stuff they dropped that I've needed for a while. It almost feels like they've been looking through my Twitter and hitting all the checkboxes for all the things that I've been bothered by recently.
That all said, Vercel has not paid me for a long time. In fact, quite the opposite. I spend a lot of money paying them nowadays, especially with the success of T3 Chat. So someone's got to cover today's bill, and if I'm going to be honest, it can't be them. So, quick word from today's sponsor, and then we'll dive right into what I think about everything Vercel just shipped. One of the biggest changes I'm seeing from AI is the willingness of big enterprise companies to adopt tools from small teams and companies. It's kind of crazy to see that small businesses like my own with T3 Chat are getting interest
from much, much bigger enterprises. There's one big thing that's hard to get right, though: auth. And no, AI is not going to solve this problem for you. Setting up auth in a way that these big businesses are willing to use, adopt, and integrate into their systems is something that I wouldn't wish on my worst enemy. And that's why I'm so pumped about today's sponsor, WorkOS. These guys made it way easier to get your application, and most importantly your authentication, ready for enterprise adoption. You can take my word for it, or you can look at the absurd number of companies that you're already using software from that have made the move
themselves, from Cursor to OpenAI to Vercel to Carta to Vanta to so many more. I always smile a bit when I open up the Cursor dashboard and see the WorkOS AuthKit sign-in. It's cool to know that they're using the same tools we use every single day. I've personally chatted with Guillermo about auth platforms like WorkOS, and he told me really early on that he regretted not adopting one earlier: "I think we could have done even more business if we had partnered with WorkOS earlier. It's been incredibly well-received." I couldn't agree more. They found an incredible balance of
making something enterprise-ready for these IT teams to integrate and something that's actually nice to use for us as full stack TypeScript developers. It's awesome that they found this balance, and we're seeing more and more people adopt it for that reason. If you're tired of thinking about auth and are ready to just ship, check out WorkOS today at soydev.link/workos. Let's dive in. Going to do a real quick overview of the things that they changed so that we can keep ourselves on track. The one that I'm personally most excited about is the active CPU billing. It's going to be a real fun one to talk about, and specifically why it matters. They
added the sandboxing for code runs, specifically for user-submitted code or, more importantly, AI-generated code; queues, finally; and then their captcha killer solution, which is actually looking very compelling. It seems like they learned all the right lessons from the things I've been complaining about with captchas. If you haven't watched my captcha video already, obviously I'm biased cuz I made it, but I think it's one of the best videos I ever did. So check it out if you haven't. Very practical, applicable information on how to set up captchas and rate limiting in your services properly. They also officially dropped the AI Gateway, which
is their alternative to something like OpenRouter. It actually looks pretty compelling. We'll dive into that too. So let's start with the active CPU billing, cuz I am so excited about this. Again, I'm biased because this is going to save me a ton of money. To understand this, we need to understand how servers are billed. If I have one server and, I don't know, let's say that this server costs us $10 a month. This server is just one
server. If it gets no users, or it gets a thousand users, or it gets a million users, we have our fixed rate. It's 10 bucks a month. But what happens if this server can only handle, I don't know, 500 users concurrently at its peak? Then you need to spin up more servers. Maybe you need to keep them live for the whole month, or maybe you're dynamically spinning them up and down. You end up in the wonderful Kubernetes hell that we've all seen and I've often made fun of, as you need to spin up more servers and remember to spin them down and wait for them to spin up and possibly lose traffic in that time. Not fun. And
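To put rough numbers on the fixed-server model Theo is describing, here's a minimal sketch. The $10-a-month server and the 500-concurrent-user ceiling come straight from the transcript; the function name and the traffic figures are made up for illustration.

```ts
// Illustrative only: with fixed monthly pricing you pay for peak
// capacity whether or not anyone is actually using the servers.
const COST_PER_SERVER_PER_MONTH = 10; // dollars, from the example above
const MAX_CONCURRENT_USERS_PER_SERVER = 500; // peak capacity, from the example above

function monthlyCostAtPeak(peakConcurrentUsers: number): number {
  // You provision for the worst case, so cost scales with peak traffic,
  // not with how busy the servers actually are.
  const serversNeeded = Math.max(
    1,
    Math.ceil(peakConcurrentUsers / MAX_CONCURRENT_USERS_PER_SERVER),
  );
  return serversNeeded * COST_PER_SERVER_PER_MONTH;
}

console.log(monthlyCostAtPeak(0)); // 10  -> one idle server still costs $10/month
console.log(monthlyCostAtPeak(1_000)); // 20  -> two servers
console.log(monthlyCostAtPeak(500_000)); // 10000 -> a thousand servers, billed all month

// Keeping those servers up all month, or hand-rolling the autoscaling
// (the "Kubernetes hell" above), is the pain that active CPU billing is
// meant to remove: billing for CPU time actually used instead of for
// provisioned capacity.
```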