The number that stopped me was 31.4%. I had set up five landing pages over a Friday evening, sent the same $50 of paid traffic to each one, and by Sunday afternoon four of them had done roughly what I expected — low clicks, a few signups, nothing conclusive. Then I checked the fifth page and spent about ten minutes going back through the setup looking for a mistake.

There was no mistake. One of the five ideas had a signup rate of 31.4% on cold traffic. The follow-up email got a 68% open rate, and the replies were not the usual "sounds interesting" variety. People were writing back with specific situations they were in, asking if I was available to work with them directly.

That was not the outcome I had planned for when I started the weekend.

I had been sitting on five different ideas for about six weeks, having the same circular debates with myself about which one was worth building. Each idea had a plausible case. Each one also had obvious holes. I was not getting anywhere. The weekend test was not an experiment so much as a way to stop the loop — build a minimal page for each idea, get it in front of real people, see which ones moved.

The format I used is the same one I have written about before in The Smoke Test Landing Page Playbook: a headline, three or four lines describing what the thing does, an email field, one button. No pricing, no case studies, no explainer video. Just the proposition and a way to respond. To decide what counted as a real result, I used the thresholds from my conversion rate post: above 5% on cold traffic is worth pursuing, below 2% is not, and anything in between is ambiguous. I spun the pages up through EarlyProof because it was faster than stitching together a page builder, an ad tracker, and an email form.
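If you like your decision rules unambiguous before the data comes in, the thresholds reduce to a few lines of code. This is just a sketch of that rule in Python; the function name is my own, and the 1.5% figure stands in for the three failed pages, which all sat at or below it:

```python
# Decision rule from the conversion rate post: on cold traffic,
# above 5% is worth pursuing, below 2% is not, the middle is ambiguous.

def classify(signup_rate_pct: float) -> str:
    """Bucket a cold-traffic signup rate given as a percentage."""
    if signup_rate_pct > 5.0:
        return "pursue"
    if signup_rate_pct < 2.0:
        return "drop"
    return "ambiguous"

# The weekend's reported rates; the three failures share one ceiling.
for name, rate in [("failed pages (best of three)", 1.5),
                   ("keyword tool", 9.2),
                   ("hiring checklist", 31.4)]:
    print(f"{name}: {rate}% -> {classify(rate)}")
```

The point of writing it down this bluntly is that you commit to the buckets before you see the numbers, so a 3% result cannot talk you into itself afterward.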

Three of the five never got above 1.5%. One reached 9.2%, which I would call a real result. And then the fifth.

The three that failed all had the same problem, which I did not notice until I was looking at them together on Sunday morning. Each headline described what the product does. "Automate your client updates." "Get notified about regulation changes before they affect your business." "Prioritize your roadmap based on support volume." All of those are things a person could read, nod at vaguely, and close the tab. Nothing there puts you inside a moment you have actually experienced. Nothing creates that half-second of recognition where you think: yes, that is exactly the thing I ran into last week.

The keyword tool was different. I had been annoyed for years that most keyword research tools are designed for SEO agencies managing dozens of client accounts. A solo writer or creator ends up paying for features they will never use, while the one thing they actually need, a plain answer to "what should I write next," gets buried under dashboards built for someone else's workflow. The page said that plainly, and the people who felt it signed up. The follow-up conversations were good. Real complaints about specific tools, and the specific ways those tools wasted time.

The checklist idea had none of that buildup. The concept was almost embarrassingly simple: a checklist for people who are about to hire a developer for the first time but have not really pressure-tested whether they are ready. Have you actually talked to ten people outside your own network about this problem? Do you know what the first version needs to do, not what you eventually want it to do? Have you tried doing any part of it without software first, just to confirm the problem is real? Questions that seem obvious when stated plainly, but that most first-time founders skip because nobody asks them.

That page got a 31.4% signup rate. The replies were not "this looks cool." They were "I needed this two months ago" and "I just signed a contract with a developer and I'm not sure I did any of this." Several people asked whether I did this work with people directly.

I had not been planning to. I had been planning to make a product.

The data changed my thinking. What people were responding to was not a checklist. It was the idea that someone who had done this before was willing to be specific about what goes wrong and when. That is not a software feature. The thing with real demand in those replies was me walking someone through the questions, not the questions formatted as a static page. I shelved the product idea and started taking on consulting work. The checklist became how people found me.

I went into that weekend expecting to pick an idea. Instead I figured out I had been designing the wrong type of thing entirely. A weekend of tests got there faster than six months of building would have.

If you want to run this process yourself without stitching together landing pages, tracking, and email capture separately, EarlyProof is what I use. The 5-Day Sprint framework is the next step if you already have one idea you want to go deep on.