We Wasted 4 Weeks on a $1,000/Month AI Agent
Then We Built Our Own in 89 Minutes
Every SaaS platform you use is shipping an AI agent add-on right now. Your support tool has one. Your CRM has one. Your project management platform probably just announced one last week. They all cost $500–$1,000/month. They’re all black boxes. And if your experience is anything like ours, they mostly don’t work.
We spent four weeks trialing the AI agent add-on from our support ticketing platform. Four weeks of tweaking settings we couldn’t fully control, watching it generate responses we couldn’t tune, running on a model we couldn’t identify and couldn’t swap. It wasn’t bad enough to reject on day one — it was bad in the slow, corrosive way that wastes your time: almost good enough, but never quite right, and no levers to pull to fix it.
Then last week, I sat down with Steve Krouse, the co-founder of Val Town, and we built our own AI support agent from scratch. It took 89 minutes.
Not 89 minutes of prep followed by a weekend of debugging. Eighty-nine minutes, start to finish, from first line of code to a live end-to-end test where a real support ticket came in, hit our Val Town webhook, got routed to Opus 4.5 in the Kilo Gateway, and posted an internal response on the ticket within seconds. We’d blocked out eight hours across two days. We used less than two.
Here’s how we did it — and why I think you could do the same.
Your Platform’s AI Add-On Is a Tax on Your Lack of Options
Here’s the deal with those $1,000/month AI agent add-ons: the vendor has every incentive to use the cheapest model possible. You’re paying a fixed monthly fee, so every dollar they save on inference is margin for them. You can’t see which model they’re using. You can’t adjust the prompts. You can’t change when or how the agent responds. And when it gets something wrong — which it will — you have no way to fix it.
That’s not a tool. That’s a subscription to someone else’s guesses about what you need.
We wanted model flexibility. We wanted prompt control. We wanted to see what was happening and why. And we wanted to swap models when something better or cheaper came along. None of that was available inside the black box.
So we built it ourselves.
The Stack: Val Town + Kilo Gateway
Two tools made this possible at the speed we achieved.
Val Town is a platform for instantly deploying TypeScript without worrying about hosting, infrastructure, or environment setup. You write TypeScript, hit save, and your code is running at a URL within 100 milliseconds. Secrets management, logs, a web editor with full LSP support, and Townie — Val Town’s own AI coding assistant — are all built in. Their own description puts it well: “Val Town is like Zapier for developers.”
The Kilo Code Gateway is an OpenAI-compatible inference API that routes to any of 500+ models — Opus, GPT-5.3, Kimi K2.5, Minimax M2.5, GLM 5 — all through a single endpoint. One API key, full visibility into usage and costs, and the ability to swap models without changing your integration.
If you’re a solopreneur or a small team, this combination is absurdly powerful. You don’t need a backend. You don’t need DevOps. You need an idea and an afternoon.
How We Built It (All 89 Minutes of It)
Steve started in Townie with a simple, methodical approach: one step at a time, rather than throwing everything at the AI and hoping for the best. “I find it really frustrating when vibe coding goes wrong,” he said. The discipline paid off.
Step 1: A webhook endpoint that receives new tickets. We created a new Val, gave it a custom subdomain, and had it accept POST requests from the ticketing platform. At this point it just logged what it received — enough to test the plumbing.
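A minimal version of that first Val might look like this. Val Town HTTP vals export a standard fetch-style handler; the payload shape here is a placeholder for whatever your ticketing platform actually sends:

```typescript
// Step 1 sketch: a webhook Val that just logs what it receives.
// The payload shape is whatever your ticketing platform sends -- at this
// stage we only care that requests arrive and get acknowledged.
export async function handleWebhook(req: Request): Promise<Response> {
  if (req.method !== "POST") {
    return new Response("Method not allowed", { status: 405 });
  }
  const payload = await req.json();
  console.log("Received webhook:", JSON.stringify(payload));
  return new Response("ok", { status: 200 });
}

// Val Town HTTP vals run the default export for every request.
export default handleWebhook;
```

Save it, and the handler is live at the Val's URL, ready for the ticketing platform to hit.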
Step 2: Connect the ticketing platform to the webhook. In our ticketing platform’s trigger settings, we created an automation: when a new issue comes in from a specific email, fire the webhook with just the issue ID. Keeping the payload minimal was deliberate — we wanted the agent to fetch the full issue context itself, not rely on whatever truncated data the webhook might carry.
Step 3: Add the Kilo Gateway for inference. The Val receives the issue ID, fetches the full thread via API, and sends that payload, plus our prompt, to the Kilo Gateway. Because the Gateway is OpenAI-compatible, this was literally swapping in a base URL and API key. Steve noted: “That part was super easy.”
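Because the Gateway speaks the OpenAI chat-completions dialect, the call is a plain fetch. The base URL and model slug below are placeholders; take the real values from your own Gateway dashboard:

```typescript
// Step 3 sketch: draft a reply via the Kilo Gateway.
// BASE_URL and the model slug are placeholders -- use the real values
// from your Gateway dashboard.
const BASE_URL = "https://gateway.example.com/v1";

// Building the request body is pure, so it's easy to test and to tweak.
export function buildDraftRequest(systemPrompt: string, ticketThread: string) {
  return {
    // Swapping models later is a one-line change here.
    model: "anthropic/claude-opus-4.5",
    messages: [
      { role: "system", content: systemPrompt },
      { role: "user", content: ticketThread },
    ],
  };
}

export async function draftReply(
  apiKey: string,
  systemPrompt: string,
  ticketThread: string,
): Promise<string> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(buildDraftRequest(systemPrompt, ticketThread)),
  });
  if (!res.ok) throw new Error(`Gateway error: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

The request body never mentions a vendor SDK, which is exactly why swapping models later costs one line.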
Step 4: Post an internal note back to the ticket. This was the trickiest part. The ticketing platform’s data model has issues, threads, and notes — and the relationships between them weren’t immediately obvious from the docs, a confusion compounded when Townie read the API docs for a different company with the same name as our ticketing platform. After some digging, we landed on the right flow: receive issue → fetch full thread → create a new internal thread → post the AI-generated note to that thread. Clean, auditable, and visible only to our team.
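Stripped of platform specifics, the flow we landed on reads like this. The TicketAPI type is a hypothetical stand-in for the real ticketing API; only the order of operations mirrors what we built:

```typescript
// Step 4 sketch: the receive -> fetch -> internal thread -> note pipeline.
// TicketAPI is a made-up interface standing in for the real ticketing
// platform's endpoints; the sequencing is the part that matters.
export type TicketAPI = {
  fetchThread(issueId: string): Promise<string>;
  createInternalThread(issueId: string): Promise<string>; // returns new thread id
  postNote(threadId: string, body: string): Promise<void>;
};

export async function handleIssue(
  api: TicketAPI,
  draft: (thread: string) => Promise<string>, // e.g. the Gateway call
  issueId: string,
): Promise<string> {
  const thread = await api.fetchThread(issueId); // full context, not webhook crumbs
  const note = await draft(thread); // AI-generated response
  const internalThreadId = await api.createInternalThread(issueId);
  await api.postNote(internalThreadId, note); // visible only to the team
  return internalThreadId;
}
```

Passing the API and the draft function as parameters also means the whole pipeline can be tested with fakes before it ever touches a real ticket.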
Along the way, we cloned the Val with the vt CLI, which works a lot like git, so we could work on it directly in VS Code with Kilo Code’s full agentic coding capabilities.
At the end, we ran a live test. A real ticket came in. The webhook fired. Claude generated a response. An internal note appeared on the ticket within seconds — and it was a totally reasonable response to the ticket.
This was what we were looking for.
The Math That Killed the Black Box
The platform’s AI add-on: $1,000/month. Black box. No model choice. No prompt control.
Our version: We’re currently running Claude Opus 4.5 through the Kilo Gateway. Each ticket response costs roughly 10–20 cents. At about 400 tickets a month, that’s $40–80/month — and we haven’t even started experimenting with cheaper models, which could cut that further.
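The arithmetic is simple enough to sanity-check inline (the per-ticket figures are our rough observed Opus 4.5 costs, not published pricing):

```typescript
// Back-of-envelope check on the numbers above.
const ticketsPerMonth = 400;
const costPerTicketUSD = { low: 0.10, high: 0.20 }; // rough observed Opus 4.5 cost

export const monthlyCostUSD = {
  low: ticketsPerMonth * costPerTicketUSD.low,   // 40
  high: ticketsPerMonth * costPerTicketUSD.high, // 80
};
```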
So: more than 10x cheaper, with full control over the model, the prompts, and the behavior. And if a better model drops next month, we swap it in with a one-line change.
Val Town’s free tier covers light usage; their paid plans start at $10/month. Total infrastructure cost is negligible.
The Real Point
Four weeks fighting a black box. Eighty-nine minutes building the thing we actually wanted.
That’s not a story about us being especially clever. It’s a story about the tools being ready. Val Town handles the infrastructure. The Kilo Gateway handles the inference. The only genuinely hard part was understanding the APIs of the tools we were already using — and for that, there’s Kilo Code.
If you’re a solopreneur paying $1,000/month for an AI add-on that frustrates you, or a developer looking at your platform’s agent and thinking “I could build something better” — you’re probably right. And it probably won’t take you eight hours.
It might take you 89 minutes.
Want to try the Kilo Gateway? Get started here
Want to see the Val Town setup we built? Try it here