Warehouses are filled with dull, repetitive tasks still done by people. Ultra makes robots to change that.
TL;DR: Ultra robots automate e-commerce order packaging and returns in fulfillment centers. Traditional automation isn't working for warehouses because it's costly, rigid, and often underutilized. Our robots are different: they’re easy to deploy, resilient to changing environments, and powered by AI that’s trained with examples.
Hi everyone - we’re Oliver, Chetan, Jon, and Max. After years of building and scaling manufacturing and robotics companies (Layer By Layer, S13, and Voodoo Manufacturing, W17), we’re now focused on bringing automation to the industries that need it most.
We believe we’re at a moment in history when robots can be made accessible and capable enough for mass adoption - and we need it more than ever.
Labor is harder to come by than it’s been in decades.
Imagine running a warehouse: you hire 20 temp workers for a shift, but only 15 show up in the morning, and just 5 return after lunch. That's the reality one of our partners told us they face.
In 2018, the U.S. entered its first labor shortage in over 60 years. Today, we are short over 1.3 million workers, causing companies to go understaffed or rely on high-churn temp labor.
And yet, traditional warehouse automation isn’t being adopted fast enough.
Ultra is building robots that can be dropped in at existing workstations and quickly trained to do repetitive tasks. They are more cost-effective than human labor and can operate around the clock, giving your team ✨superpowers✨.
We use reliable off-the-shelf hardware so we can focus on rapid deployment. The key to unlocking the next million robots is data scale, and the fastest way to get there is to put robots in real-world environments doing useful tasks.
Recent AI breakthroughs mean we can control robots with neural nets trained with examples, rather than with explicitly programmed routines.
Above is an example of a fully autonomous picking policy we trained on ~2hrs of data from teleoperating our research arms. The two RGB camera feeds are the input to the model, which outputs the joint+gripper positions at 10Hz.
Above: examples of emergent behavior the policy learned from the training data rather than being explicitly programmed, as traditional automation would require.
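To make the pipeline concrete: two RGB feeds go in, and joint-plus-gripper targets come out at 10Hz. Below is a minimal sketch of that control loop in plain Python. Everything here — the feature dimensions, the `LinearPolicy` stand-in for the learned network, the function names — is an illustrative assumption, not Ultra's actual stack.

```python
import random

# Illustrative dimensions (assumptions, not Ultra's real model):
IMG_FEATURES = 64            # stand-in for encoded features from one RGB camera
NUM_JOINTS = 6               # arm joint position targets
ACTION_DIM = NUM_JOINTS + 1  # joints + one gripper open/close command
CONTROL_HZ = 10              # output rate mentioned in the post

class LinearPolicy:
    """Toy stand-in for the learned policy: maps the concatenated
    features of two camera feeds to joint+gripper position targets.
    (The real policy would be a neural net trained on teleoperation
    examples, i.e. behavior cloning.)"""
    def __init__(self, in_dim, out_dim):
        self.weights = [[random.gauss(0.0, 0.01) for _ in range(in_dim)]
                        for _ in range(out_dim)]

    def __call__(self, features):
        # One linear layer: action_i = sum_j w_ij * feature_j
        return [sum(w * x for w, x in zip(row, features))
                for row in self.weights]

def control_step(policy, cam_a_features, cam_b_features):
    """One control tick (run at CONTROL_HZ in a real loop):
    concatenate both camera feature vectors, run the policy."""
    return policy(cam_a_features + cam_b_features)

# Example tick with dummy camera features:
policy = LinearPolicy(2 * IMG_FEATURES, ACTION_DIM)
cam_a = [0.5] * IMG_FEATURES
cam_b = [0.5] * IMG_FEATURES
action = control_step(policy, cam_a, cam_b)
assert len(action) == ACTION_DIM  # 6 joint targets + 1 gripper command
```

The key contrast with traditional automation: nothing in this loop encodes a pick routine. The behavior lives entirely in the weights, which are fit from demonstration examples rather than hand-programmed.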
We’d love to connect with e-commerce 3PLs and large brands that handle their own fulfillment. You can reach us at founders@ultra.tech. Thanks!