I’m personally a huge fan of Modal, and have been using their serverless scale-to-zero GPUs for a while. We’ve seen some nice cost reductions from using them, while also being able to scale WAY UP when needed. All with minimal development effort.
Interesting to see a big provider entering this space. I originally switched to Modal because the big providers weren't offering this (e.g. AWS Lambda can't run on GPU instances). I assume all providers are going to start moving towards offering this?
I’m also a big fan.
Modal has the fastest cold-start I’ve seen for 10GB+ models.
Thanks for sharing! They even support running HIPAA-compliant workloads, which I didn't anticipate.
Modal documentation is also very good.
Modal is great. They even released a deep dive into the LP solver they use to allocate GPUs so quickly (and cheaply).
Coiled is another option worth looking at if you're a Python developer. Not nearly as fast on cold start as Modal, but similarly easy to use and great for spinning up GPU-backed VMs for bursty workloads. Everything runs in your cloud account. The built-in package sync is also pretty nice: it auto-installs CUDA drivers and Python dependencies from your local dev context.
(Disclaimer: I work with Coiled, but genuinely think it's a good option for GPU serverless-ish workflows.)