Pre-release validation for moonlighting micro SaaS

How to narrow your hypotheses before writing code in a moonlighting micro SaaS: a three-layer split, and where to set the threshold for shipping code.

Validation costs more when you moonlight

If you're independent full-time, burning 100 hours on a wrong hypothesis costs you a month. When you have 20 hours a week outside a day job, the same 100 hours is five weeks — half a quarter, gone the moment a hypothesis turns out wrong. So in moonlighting micro SaaS the question "how much do I narrow the hypothesis before writing code?" carries more weight than it does for full-time indie hackers.

I run gpdf, gsql, and renma outside my day job as a contract tech lead (the operational shape is in the nadai ecosystem post). The playbook below is what fell out of running two or three projects under a strict 20-hour budget.

The opposite failure is also real, though: be too cautious and a year goes by with nothing shipped. Validation isn't binary. It's a threshold problem: how far do you take it before you start writing code?

Split the hypothesis into three layers

"Validate the market" is too vague to pick a tool against. I split the hypothesis into three layers and apply different methods to each.

  • Layer 1: Does the problem exist? — Is anyone struggling with this?
  • Layer 2: Does my approach work as a solution? — Can what I can ship actually solve it?
  • Layer 3: Will they pay? — Will someone hand over $X/month?

The cost of validation rises as you go down. Layer 1 takes hours; Layer 3 can only be proven with real billing events. Knowing which layers you want confirmed before shipping makes the cut faster, even on a 20-hour week.

Layer 1: Does the problem exist — a few hours' work

Layer 1 is mostly prior-art scanning and search-query observation.

For gpdf, Layer 1 took about two to three hours. The Go PDF landscape: gofpdf was archived in 2021, its fork go-pdf/fpdf was archived in 2025, UniPDF is commercial, gopdf and Maroto sit on the gofpdf lineage. The space for a Pure Go, actively maintained, permissive-licensed library was effectively empty. That alone said "the problem exists." Recurring questions on Stack Overflow and r/golang were a confirming signal.

The trap here is spending too long. I once spent two weeks confirming "is there a problem here," when the answer had been visible after a few hours. The remaining week and a half was me looking for ammunition to feel brave enough to start, not validating anything.

Layer 2: Direction of the solution — ship a working prototype early

This is the hardest layer for moonlighters. The textbook play is to put up a landing page and watch the signups, but in my hands a landing page on its own has rarely told me anything useful.

I tried it once. Three weeks of a SaaS landing page, five email signups, zero who actually replied to a follow-up. The gap between "I'm interested" on a landing page and "I'll try this" is wider than I had assumed.

What worked instead was publishing a working prototype on GitHub early. I made gpdf public at the v0.x stage. With an honest README — what works, what doesn't, what's deliberately out of scope — issues and discussions started to come in: "I tried this and want X," "Y use case isn't covered." Issues against a prototype are a much higher-resolution signal for direction-of-solution than landing page signups, in my experience.

The README discipline matters more than I expected. A README that overpromises ("supports anything!") brings in users who file issues against features that don't exist, which feels like noise. A README that lists current limits invites issues against the right edges — the places where the product would need to grow to fit a real user. That second kind of issue is the thing you want at this layer. It tells you whether the direction you picked has someone on the other end.

The catch is that this is OSS-shaped. It doesn't transfer cleanly to a pure SaaS. Direction validation for a pure SaaS overlaps almost entirely with Layer 3 below, which is why my own strategy is "if it can ship as OSS, start as OSS." Even derived SaaS like gpdf-api gets sequenced as gpdf core first, derivative second.

Layer 3: Will they pay — direct interviews or a real billing button

You can't get "will they pay $X/month" from email signups or a landing page. People answer hypothetical questions kindly and say yes; show them an actual invoice and roughly 80% walk.

Only two things have ever worked for me.

  1. 1:1 interviews. A 30-minute conversation, ending with a direct ask: "Could you carve $X/month out of your current budget for this?"
  2. A real billing event. Put up a payment form and wait for someone to actually pay with a card. Charge for the beta if needed.

On a 20-hour week, stacking ten 1:1s takes more than a month, so I cap it at five and move to a real billing form. If three of five sound like they could find the budget, I go; if fewer, I stop. A blunt threshold, but waiting for n=10 costs more than the moonlighting budget can spare.
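The go/no-go rule above is blunt enough to write down. A toy Go sketch of it (the function name and shape are mine; only the five-interview cap and the three-of-five threshold come from the text):

```go
package main

import "fmt"

// shouldShip encodes the blunt rule from this post: run five 1:1
// interviews, and go only if at least three sound like they could
// carve the budget out. Illustrative sketch, not real tooling.
func shouldShip(budgetYes, interviews int) bool {
	return interviews >= 5 && budgetYes >= 3
}

func main() {
	fmt.Println(shouldShip(3, 5)) // true: three of five could find the budget
	fmt.Println(shouldShip(2, 5)) // false: stop here
}
```

The point of writing it down is that there is no fudge term: fewer than three credible yeses means stop, not "run five more interviews."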

I haven't reached Layer 3 with gpdf-api yet, so the procedure above is "what I plan to do," not "what I have done." I'll come back and rewrite this in six months, with the actual numbers.

The threshold for starting to write code

Falling out of the three layers, here's the threshold I work to as a moonlighter.

  • Once Layer 1 + 2 are confirmed, it's fine to start writing code.
  • Layer 3 gets validated in parallel with the build.
  • But the minimum feature set has to fit inside "a scope where the first billing event lands within 90 days."

Ninety days is my number, and the basis is thin. But scopes longer than 90 days, in my experience, balloon past Layer 3 validation, and the "building features no one wants" risk creeps back in through the side door. Twenty hours a week times twelve weeks is 240 hours. That's the upper bound I let myself spend before I demand real evidence the bet is wrong, and right now I hold the line on it pretty firmly.
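Put together, the threshold reads as a single predicate. A toy Go encoding (the struct and field names are made up for illustration; the numbers, 20 hours a week for twelve weeks, come straight from the text):

```go
package main

import "fmt"

// hypothesis is an illustrative struct, not real tooling; it just
// names the three inputs to the threshold in this section.
type hypothesis struct {
	problemConfirmed  bool // Layer 1: the problem exists
	approachConfirmed bool // Layer 2: the solution direction holds
	estimatedHours    int  // scope of the minimum feature set
}

// budgetHours: 20 h/week over the ~90-day (twelve-week) window.
const budgetHours = 20 * 12

// startCoding: Layers 1 and 2 must be confirmed, and the scope must
// fit the budget. Layer 3 runs in parallel with the build.
func startCoding(h hypothesis) bool {
	return h.problemConfirmed && h.approachConfirmed && h.estimatedHours <= budgetHours
}

func main() {
	fmt.Println(budgetHours)                               // 240
	fmt.Println(startCoding(hypothesis{true, true, 200}))  // true
	fmt.Println(startCoding(hypothesis{true, false, 100})) // false: Layer 2 unconfirmed
}
```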

A failure: 200 hours into a project I shelved before launch

A few years ago I shelved a small SaaS I had been building outside work. Mostly weekends, four months, around 200 hours total. I dropped it the week before the beta release, when it became obvious nobody was going to use it.

What went wrong is plain: I confirmed Layer 1 (yes there's a problem), then skipped Layers 2 and 3 and went straight to Layer 4 (build it). No landing page, no interviews. I had told myself "if it works, the response will come."

I handed the beta to five people. Three never opened it. Two said "useful, but not right now." I learned that day that "not right now" is the polite form of "no intent to pay." Two hundred hours bought me the lesson that there's a chasm between Layer 1 and Layers 2/3.

To be honest, I still can't draw a clean line through those 200 hours and say "this part was waste." Maybe the first 50 hours of prototyping taught me something. Maybe none of that code was reused. What stuck with me is the rule: don't write a product where Layer 2 or 3 was skipped. The threshold I use today follows the outline of that mistake.

Common questions

"If you were full-time, shouldn't you validate everything up front?"

Probably yes. Full-time means you can stack ten 1:1s before writing code. My threshold is tuned to what 20 hours a week can do, so don't take it as a full-time playbook. If you go independent, I think you'd do better stacking Layer 3 interviews to ten before starting.

"Isn't the LP test outdated?"

I said landing pages on their own don't work for me, but they aren't worthless. They help confirm Layer 1 and they funnel into interviews and billing-button tests. The thing that's outdated is the assumption that an LP alone gets you Layer 3.

"What about ideas that can't ship as OSS?"

My playbook is OSS-shaped, so it doesn't transfer to a pure-SaaS idea cleanly. Maybe the closest substitute is "give a small working slice away free, charge for the rest" — a prototype-shaped trick. I can't speak from experience on pure SaaS validation, since I always take the OSS-first route. Better to read someone who's done it.

Where this goes next

This is the May 2026 state. Layer 3 validation isn't on the books yet. If it falls apart over the next six months, I'll write the postmortem.

© 2026 nadai · Japan