
TL;DR: We’ve developed the SOTA model for controlling robot arms, allowing our arms to learn new skills in hours instead of weeks. This enables rapid deployment and a unique business model where we let companies pay by the hour instead of requiring massive capex. We already have a biotech unicorn using our robots in production to package delicate chemical vials.
https://www.youtube.com/watch?v=WHzjweFnbkk&feature=youtu.be
Traditional robot arms often require months of custom engineering and planning to integrate. These bulky, rigid, and expensive systems take up substantial floor space and are hard-coded for a single workflow. Adapting to new tasks means weeks of downtime and reprogramming. As a result, companies often default to increasing headcount to keep up with demand.
Our first product – Nemo3 – is a bimanual robot that can handle objects within a 2.5-foot radius, supports payloads up to 4 lbs, and can be mounted flexibly in various environments. Unlike traditional robot arms, Nemo3 rapidly learns new tasks and adapts to changing conditions on the fly. Our AI decomposes 30 minutes of tele-operation data into skills and uses diffusion models to learn each skill. The result is fast and robust automation for our customers.
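To make the decomposition step concrete, here is a minimal, purely illustrative sketch of how a tele-operation trace might be split into skill segments. The event-based cut at gripper open/close transitions is an assumption for illustration, not Verne's actual method; in practice each resulting segment would then be used to train its own diffusion policy.

```python
# Toy stand-in for the skill-decomposition step: split a list of
# (pose, gripper_closed) frames into segments, cutting whenever the
# gripper state changes. Real pipelines use learned segmentation.

def segment_skills(trace):
    """Split tele-op frames into skill segments at gripper-state changes."""
    segments, current = [], []
    for i, (pose, gripper) in enumerate(trace):
        current.append((pose, gripper))
        # Cut here if the next frame flips the gripper state.
        if i + 1 < len(trace) and trace[i + 1][1] != gripper:
            segments.append(current)
            current = []
    if current:
        segments.append(current)
    return segments

demo = [((0.0, 0.0), False), ((0.1, 0.0), False),  # reach toward vial
        ((0.1, 0.0), True), ((0.1, 0.2), True),    # grasp and lift
        ((0.3, 0.2), True), ((0.3, 0.2), False)]   # carry, then release
skills = segment_skills(demo)
print(len(skills))  # -> 3 segments: reach, grasp-and-carry, release
```

Each segment is short and self-contained, which is what makes per-skill learning from only 30 minutes of demonstrations plausible.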
As one of our customers put it: “What surprised me most was how fast you set it up. After one weekend it was set up.” It took us just 4 days to deploy the robot at ABClonal, from unpacking it to automating the vial-packing task.
Our robots excel at dexterous manipulation tasks across a wide range of industries.
Do you know any manufacturers, distributors, or logistics companies burning time & money on repetitive manual tasks? Maybe they’ve tried other solutions and think it’s impossible to automate?
Reach us at: founders@vernerobotics.com
Book a time: https://cal.com/vernerobotics/30min
Neil is a leading robot learning researcher from Columbia and Stanford. At Columbia he was advised by Prof. Shuran Song and at Stanford he was advised by Prof. Jiajun Wu, as part of the Vision & Learning Lab led by Prof. Fei-Fei Li. Neil holds two patents on multi-modal perception from his time at Apple, where he helped build the Apple Vision Pro. Neil left his PhD at Berkeley to start Verne Robotics.
Aditya led Azure Copilot from private preview to GA launch on Microsoft’s product team. He graduated Phi Beta Kappa from Cornell University.
🤝 We met in high school tinkering with robots and now our mission is giving every company access to automation.