Hey all!
We’re Eli and Luke – co-founders of Beam.
TL;DR
Beam is an open-source alternative to Modal for running AI apps. Coca-Cola, Magellan AI, Shippabo, and Stratum use Beam for serverless inference, sandboxes, and background jobs. We're 100% open source and self-hostable.
https://www.youtube.com/watch?v=Itu_jCnyfJc
The Problem
If you're building an AI product, you need a fast serverless platform to run your code, whether that's GPU inference, background jobs, or sandboxes for agents.
Existing cloud platforms either lock you in (Modal) or aren't optimized for serverless AI workloads (Lambda, SageMaker, Kubernetes). Sandbox providers like e2b and Daytona exist, but they don't support the full range of AI compute workloads.
Our Solution
Beam is your all-in-one serverless cloud for AI applications. You can run inference endpoints, background jobs, sandboxes, and more – all with a simple decorator in your Python or TypeScript code. There's no YAML and no configuration required.
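To give a feel for the decorator-first model, here's a minimal self-contained sketch of how a plain function becomes a deployable endpoint. This is an illustration of the pattern, not Beam's actual SDK – the `endpoint` decorator and its parameters here are hypothetical stand-ins:

```python
# Illustrative sketch of a decorator-based serverless model.
# `endpoint` and its parameters are hypothetical stand-ins,
# not Beam's real API.

def endpoint(cpu=1, memory="1Gi", gpu=None):
    """Attach deployment config to a plain Python function."""
    def wrap(fn):
        fn.config = {"cpu": cpu, "memory": memory, "gpu": gpu}
        return fn
    return wrap

@endpoint(gpu="T4")
def predict(prompt: str) -> str:
    # In a real deployment, this body would run model inference.
    return f"echo: {prompt}"

# The function still runs locally like any other Python function,
# and its deployment config travels with it.
print(predict("hello"))
print(predict.config["gpu"])
```

The point of the pattern: your code stays ordinary Python, and the infrastructure requirements (CPU, memory, GPU) live next to the function they apply to instead of in a separate YAML file.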
This is possible thanks to a custom container runtime and distributed storage layer we built to run large container images with low cold-start times. We provide security and isolation with gVisor, and we build and launch new container images lightning-fast.
Some of the features we support include serverless inference endpoints, background jobs, sandboxes for agents, and full self-hosting.
The Team
We were college roommates and have been building products together ever since.
Before starting Beam, we frequently attended hackathons and built a serverless framework to quickly ship web apps to the cloud. When we started tinkering with AI, we realized that no serverless framework made it easy to deploy AI models to the cloud, so we decided to build one.
Our Ask
If this post resonated with you, we’d love to connect: founders@beam.cloud