Length: 8 min
Published: April 22, 2026

This article is for builders. If you're the CTO, principal engineer, or tech founder asking "what does it actually take to build an AI-native product end-to-end?", this is the playbook.
It's based on Plantory.ai, the in-house AI-native SaaS we built at DX Heroes — 5,000+ users, 8 locales, 15+ AI pipelines in production. For the why and the founder narrative, see Why We Built Plantory. For the polished case study with metrics, see Plantory.ai — the case study.
An "AI-powered" feature is easy — drop an LLM call in a route handler and you're done. AI-native is harder. It means AI is the substrate of the product and the operation around it, not a topping.
Practically, AI-native splits into five surfaces:
Most teams get one, maybe two, of these. All five together is where the leverage compounds.
Before the tools, three principles. These are the ones we'd keep regardless of model provider.
In Plantory, the garden model is canonical — a 2D canvas with zones, plants at real scale, climate zone, soil type, sun exposure, inventory, history. Every AI call receives this state, not a freeform prompt.
This is the single biggest reason the advisor feels useful. Prompts without grounding are a shrug; prompts with a full spatial model are targeted recommendations. Same models. Different experience.
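To make the grounding concrete, here's a minimal sketch of the pattern. The type and field names are illustrative, not Plantory's actual schema — the point is that canonical state, not a freeform prompt, is what every AI call receives:

```typescript
// Illustrative garden state -- field names are hypothetical, not Plantory's schema.
interface GardenState {
  climateZone: string;            // e.g. "USDA 7a"
  soilType: string;
  sunExposure: string;
  plants: { name: string; zone: string }[];
  history: string[];              // notable past events, e.g. crop failures
}

// Serialize the canonical model into the context prepended to every
// advisor call. The model never sees a bare user question alone.
function buildContext(garden: GardenState): string {
  const plants = garden.plants
    .map((p) => `- ${p.name} (zone: ${p.zone})`)
    .join("\n");
  return [
    `Climate zone: ${garden.climateZone}`,
    `Soil: ${garden.soilType}; sun: ${garden.sunExposure}`,
    `Plants:\n${plants}`,
    `History:\n${garden.history.map((h) => `- ${h}`).join("\n")}`,
  ].join("\n");
}
```

With this in place, "why did my tomatoes struggle?" becomes a question the model can actually answer, because the zone, soil, and last spring's failure are all in front of it.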
All persistent state lives behind a NestJS API. The Next.js web app is a presentation + AI streaming layer.
This matters because AI endpoints are inherently streaming, stateful, and failure-prone. You don't want your domain logic, auth, billing, and background jobs tangled up in route handlers. Put the AI where the latency sensitivity is (close to the user) and put the business rules where the durability matters (behind an API with contracts).
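A sketch of what that split looks like in a Next.js route handler, assuming a NestJS endpoint behind `API_URL` and the standard Web Streams API. `streamModel` is a stand-in for a real model-streaming SDK call, not an actual library function:

```typescript
// Format one model chunk as a Server-Sent Events frame.
// Kept as a pure helper so the streaming logic stays testable.
function sseFrame(data: string): string {
  return `data: ${JSON.stringify(data)}\n\n`;
}

// Hypothetical route handler: fetch canonical state from the NestJS API
// (where the business rules live), then stream tokens back to the user.
async function POST(req: Request): Promise<Response> {
  const { gardenId, question } = await req.json();
  const state = await fetch(
    `${process.env.API_URL}/gardens/${gardenId}`,
  ).then((r) => r.json());

  const stream = new ReadableStream({
    async start(controller) {
      const encoder = new TextEncoder();
      for await (const chunk of streamModel(state, question)) {
        controller.enqueue(encoder.encode(sseFrame(chunk)));
      }
      controller.close();
    },
  });
  return new Response(stream, {
    headers: { "Content-Type": "text/event-stream" },
  });
}

// Stand-in for a real model-streaming call.
async function* streamModel(state: unknown, question: string): AsyncGenerator<string> {
  yield "…";
}
```

The route handler owns latency and streaming; everything durable — auth, billing, the garden model itself — stays behind the API contract.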
Every AI integration earns its place by making something specifically better. A chat pane that answers garden questions is not an AI feature — it's a chatbot. A chat pane that sees the canvas, knows it's USDA zone 7a, knows the user has clay-heavy soil, and remembers that last spring the tomatoes failed — that's an AI feature.
If you can't articulate what context makes the AI work, you don't have an AI product.
Here's what runs in production at Plantory.
- Next.js route handlers for the streaming AI endpoints (/api/gardens/[id]/chat, /api/gardens/[id]/tasks/generate, /api/gardens/[id]/plants/[plantId]/analyze), NestJS for everything else
- A /plantory:paid-performance-review command that audits campaigns and outputs recommendations
- /plantory:blog-article and /plantory:newsletter skills in our Claude Code plugin. AI drafts the article, we review, and the MDX pipeline handles translation across all 8 locales with automatic linking and metadata.

This is the one most teams underestimate. We built an internal Claude Code plugin marketplace. The plantory plugin ships 20+ skills: spec-driven development (/plantory:spec-specify, /plantory:spec-plan, /plantory:spec-breakdown, /plantory:spec-implement), GitHub Project board orchestration (/plantory:board-seed, /plantory:board-ensure, /plantory:board-work), code review, refactor audits, quality gates, release pipelines, founder LinkedIn, and Facebook trust-building.
Every workflow we used to do ad-hoc with prompts is now a skill with a known shape, inputs, and outputs. This is the pattern we call Claude Cowork — AI agents as first-class teammates with defined jobs.
The result: specs get planned faster, code reviews are more consistent, GTM boards stay current, and new team members learn the system by running the skills rather than reading docs.
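The "known shape, inputs, and outputs" idea can be sketched as a plain interface. This is our own illustration of the pattern — not Claude Code's actual plugin format, which is Anthropic's to define:

```typescript
// Illustrative shape for a packaged workflow ("skill"): a named job
// with typed inputs, typed outputs, and a run step an agent can own.
interface Skill<I, O> {
  name: string;                 // e.g. "plantory:spec-breakdown"
  description: string;
  run: (input: I) => Promise<O>;
}

// A toy skill: break a spec title into implementable tasks.
// The breakdown logic here is a stub; in practice the run step
// delegates to an agent with a fixed prompt and output contract.
const specBreakdown: Skill<{ title: string }, { tasks: string[] }> = {
  name: "plantory:spec-breakdown",
  description: "Break a spec into implementable tasks",
  run: async ({ title }) => ({
    tasks: [`Design ${title}`, `Implement ${title}`, `Review ${title}`],
  }),
};
```

Once a workflow has this shape, it stops being a prompt someone remembers and becomes a tool anyone on the team can invoke.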
Three honest regrets, in case you're starting now.
Build the eval harness on day one. We got away with vibes for a while. When behavior drifted after a model update, catching it fast required tests we hadn't written yet. Start with a small eval set per AI endpoint. Grow it.
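A day-one harness doesn't need a framework. A minimal sketch, with `runEndpoint` standing in for a real call to one of your AI endpoints:

```typescript
// Minimal eval harness: each case pins an input to substrings the
// response must contain. Run it in CI after every model or prompt change.
interface EvalCase {
  input: string;
  mustContain: string[];
}

async function runEvals(
  cases: EvalCase[],
  runEndpoint: (input: string) => Promise<string>,
): Promise<{ passed: number; failed: string[] }> {
  const failed: string[] = [];
  for (const c of cases) {
    const out = await runEndpoint(c.input);
    if (!c.mustContain.every((s) => out.includes(s))) failed.push(c.input);
  }
  return { passed: cases.length - failed.length, failed };
}
```

Substring checks are crude, but five such cases per endpoint would have caught our post-update drift the same day it shipped.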
Budget caps before launches, not after. Our first programmatic ad test burned faster than expected because platform-level caps lagged the AI's throughput. Set the platform ceiling first. Let the AI tune within it.
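The ordering is the whole lesson: the platform cap is the invariant, the AI's proposal is advice. As a one-line guard, with hypothetical names:

```typescript
// The platform-level ceiling is set first, by a human, in the ad
// platform itself. AI-proposed spend is only ever clamped to it.
function effectiveDailyBudget(aiProposedCents: number, platformCapCents: number): number {
  return Math.max(0, Math.min(aiProposedCents, platformCapCents));
}
```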
Give the plugin marketplace a first-class home early. We treated ours as "scripts" until we saw the multiplier from packaging workflows as skills. The plugin marketplace pattern — not the individual skills — is the unlock.
If you're building an AI-native product and want a team that's actually done this, let's talk.