
Muscle Memory Cache for Agents
Work on Butter’s data layer, building a distributed storage and query engine that scales to thousands of RPS.
Requires extensive experience with systems programming, including compiled systems languages (Go, C++, Rust, Zig), kernel APIs & syscalls, file formats, consensus algorithms, and more. We’re building a new type of database, creating structure out of natural language, and we need your help!
Butter is building an LLM proxy that records and deterministically replays tool call trajectories. Our goal is to get LLMs out of the hot path for repetitive tasks, increasing speed, reducing variability, and eliminating token costs for the many cases that could have just been a script.
Why
We discovered this problem after experiencing it first-hand building computer-use agents. We realized that many process-automation tasks are deeply repetitive, simple data-transformation tasks that could be run as scripts. Critically, what pulled users toward agents was not replacing these scripts with agents, but using agents to discover the scripts and self-heal them when new edge cases are encountered.
We believe these cases extend beyond computer use to any agent tasked with performing repeat workflows. Learn a skill once, run it forever.
As an LLM proxy, we act as the LLM, spoofing responses to deterministically guide agents down cached paths, or cleanly falling back to actual LLMs on cache miss.
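The hit/miss flow above can be sketched as a small cache keyed by the request itself: replay a recorded response when one exists, otherwise fall through to a real LLM and record its answer. This is a minimal illustrative sketch, not Butter's actual implementation; all names (`TrajectoryCache`, `complete`, the hashing scheme) are assumptions for the example.

```python
import hashlib
import json


class TrajectoryCache:
    """Sketch of a record/replay cache sitting in front of an LLM.

    On a cache hit the stored response is replayed deterministically,
    so the agent is guided down the cached path without an LLM call.
    On a miss we fall back to the real LLM and record its response.
    """

    def __init__(self, llm):
        self._llm = llm    # callable: messages -> response text
        self._store = {}   # request hash -> recorded response

    def _key(self, messages):
        # Canonical JSON so logically identical requests hash the same.
        blob = json.dumps(messages, sort_keys=True).encode()
        return hashlib.sha256(blob).hexdigest()

    def complete(self, messages):
        key = self._key(messages)
        if key in self._store:            # cache hit: replay, no LLM call
            return self._store[key]
        response = self._llm(messages)    # cache miss: call the real LLM
        self._store[key] = response       # record for future replays
        return response
```

In a real proxy the cache key would also need to account for tool results and position within a trajectory, but the hit/replay-or-fallback/record shape is the same.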