Channel: Vercel
The Turbopack team explains how we are making every build incremental and fast. Get a demo today: https://vercel.com/contact/sales/demo
[Music] Okay, thank you everyone. Hello, my name is Luke Sandberg. I'm a software engineer at Vercel working on Turbopack. I've been at Vercel for about six months, which has given me just enough time to come up here on stage and tell you about all the great work I did not do. Prior to my time at Vercel, I was at Google, where I got to work on our internal web toolchains and do weird
things like build a TSX-to-Java-bytecode compiler and work on the Closure Compiler. So when I arrived at Vercel, it was kind of like stepping onto another planet: everything was different, and I was pretty surprised by all the things we did on the team and the goals we had. So today I'm going to share a few of the design choices we made in Turbopack and how I think they will let us continue to build on the fantastic performance we already
have. To help motivate that, this is our overall design goal. From this you can immediately infer that we probably made some hard choices. Like, what about cold builds? Those are important, but one of our ideas is that you shouldn't be experiencing them at all, and that's what this talk is going to focus on. In the keynote, you heard a little bit about how we leverage incrementality to improve bundling performance. And the
key idea we have for incrementality is caching. We want to make every single thing the bundler does cacheable, so that whenever you make a change, we only have to redo work related to that change. Or, to put it another way, the cost of your build should scale with the size or complexity of your change rather than the size or complexity of your application. This is how we can make sure that Turbopack will continue to give developers good performance no matter how many icon
libraries you import. So, to help understand and motivate that idea, let's imagine the world's simplest bundler, which maybe looks like this. Here's our baby bundler. This is maybe a little bit too much code to put on a slide, and it's going to get worse. Here we parse every entry point, follow their imports, and resolve their references recursively throughout the application to find everything you depend on. Then at the end, we simply collect everything each entry
point depends on and plop it into an output file. So, hooray, we have a baby bundler. Obviously this is naive, but if we think about it from an incremental perspective, no part of it is incremental. We will definitely parse certain files multiple times, depending on how many times you import them. That's terrible. We'll definitely resolve the react import hundreds or thousands of times. So, you know, ouch.
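The baby bundler described above might be sketched like this (a hypothetical toy, not Turbopack's actual code; an in-memory file map and a regex stand in for the filesystem and a real parser like SWC):

```typescript
// Hypothetical "baby bundler": parse each entry, follow imports recursively,
// then concatenate everything the entry depends on into one output string.
type FileMap = Record<string, string>;

// Stand-in for real parsing: collect `import "x"` specifiers with a regex.
function imports(source: string): string[] {
  return [...source.matchAll(/import "([^"]+)"/g)].map((m) => m[1]);
}

function bundle(entry: string, files: FileMap): string {
  const seen = new Set<string>();
  const output: string[] = [];
  // Depth-first walk over the import graph; dependencies are emitted first.
  // Note: nothing here is cached across runs — re-running `bundle` after any
  // change redoes ALL parsing and resolution from scratch.
  function visit(name: string): void {
    if (seen.has(name)) return;
    seen.add(name);
    for (const dep of imports(files[name] ?? "")) visit(dep);
    output.push(files[name] ?? "");
  }
  visit(entry);
  return output.join("\n");
}
```

The `seen` set avoids emitting a shared module twice within one run, but it does nothing across runs — which is exactly the non-incrementality being called out here.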
So if we want this to be at least a little bit more incremental, we need to find a way to avoid redundant work. So let's add a cache. You might imagine this is our parse function. It's pretty simple, and it's probably the workhorse of our bundler. We read the file contents and hand them off to SWC to give us an AST. So let's add a cache. Okay, so this is clearly a nice, simple
win. But, you know, I'm sure some of you have written caching code before, and maybe there are some problems here. Like, what if the file changes? This is clearly something we care about. And what if the file isn't really a file, but three symlinks in a trench coat? A lot of package managers will organize dependencies like that. And we're using the file name as a cache key. Is that enough? Like, you