You ran the experiment. Built the landing page, drove some traffic, waited a week.
Now you're staring at a spreadsheet: 340 visitors, 47 email signups, 12 people who clicked "Get early access," and one reply to your follow-up email from someone asking if it's available yet. And you genuinely don't know what to do with any of it.
Is 47 good? Should you build?
Every smoke test guide ends at "build a page and see what happens." Nobody writes the sequel. Here it is.
The data doesn't speak for itself
The problem isn't that smoke tests are hard to run. It's that the results are easy to misread.
Most founders take the raw signup count, compare it against some vague mental benchmark, feel good or bad about it, and move on. But a pageview and a direct message asking "when can I buy this?" are not the same data. They're not even close. Treating them as roughly equivalent versions of "someone showed interest" is how founders convince themselves they have validation when they have something much weaker.
A pageview is someone who found your page. That's it. You know they exist, you know the channel can reach them, and you know your headline didn't repel them in the first two seconds. You know almost nothing else. People scroll through Product Hunt, click links in newsletters, land from Reddit and leave in three seconds. That's all a pageview is.
Scroll depth and time-on-page are a step up. If someone read 80% of your page and spent 90 seconds on it, they considered your offer. They thought about it. When a lot of people do this but few sign up, the problem usually isn't that they don't have the pain. Your product just doesn't feel like the right answer to it. The fix is different from the one you'd need if nobody was reading at all.
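One way to make this distinction concrete is to compare the deep-read rate against the signup rate. A sketch, assuming you can count "deep readers" (say, visitors who scrolled ~80% of the page) from your analytics; the 40%, 10%, and 2% cut-offs here are illustrative assumptions, not benchmarks:

```python
def diagnose_engagement(visitors: int, deep_readers: int, signups: int) -> str:
    """Rough read on whether the page, the offer, or neither is the problem.

    deep_readers: visitors who scrolled ~80% of the page.
    The 40% / 10% / 2% cut-offs are illustrative, not established benchmarks.
    """
    if visitors == 0:
        return "no traffic yet"
    read_rate = deep_readers / visitors
    signup_rate = signups / visitors
    if read_rate >= 0.4 and signup_rate < 0.02:
        # Lots of consideration, almost no conversion: the offer is the issue.
        return "they considered it and passed: the offer, not the pain, is the problem"
    if read_rate < 0.1:
        # Nobody even reads the page: the headline or the channel is the issue.
        return "nobody is reading: headline or channel problem"
    return "mixed: segment by source before concluding anything"
```

The point of the two branches is exactly the distinction above: high read-rate with low signups calls for offer work, while low read-rate calls for headline or channel work.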
Email signups are where most founders stop and call it good. This is a mistake. A signup means the problem is real enough that they want to follow your progress. That's something, but it's not the same as willingness to pay. Think about how many newsletters and waitlists you've signed up for that you'd never spend money on. The same applies here. What matters is the conversion rate from cold traffic (not from your followers and friends, more on that in a minute), and whether those people do anything next.
A fake checkout click is different in kind from an email signup. Clicking a "Buy now" button or starting a checkout flow means someone briefly imagined paying. That's a cognitive act you can't easily fake. When fewer than 3% of your signups click into the purchase flow, something is breaking: the price is wrong, the offer is too vague, or you haven't built enough trust before asking. When 10–20% do, you're getting somewhere.
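Those cut-offs can be written down as a quick classifier. A sketch; the bands are the rough lines above, not industry standards:

```python
def checkout_signal(signups: int, checkout_clicks: int) -> str:
    """Classify the signup-to-checkout-click rate using the rough bands above."""
    if signups == 0:
        return "no signups yet"
    rate = checkout_clicks / signups
    if rate < 0.03:
        # Under 3%: price, vagueness, or trust is breaking somewhere.
        return "something is breaking: price, vagueness, or trust"
    if rate >= 0.10:
        # 10-20%+ of signups imagining a purchase is real intent.
        return "getting somewhere"
    return "weak signal: watch the next cohort"
```

For the spreadsheet from the opening, `checkout_signal(47, 12)` lands in the top band, since 12/47 is about 25%.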
And then there's the person who replied to your follow-up email asking when it launches without being prompted, or found your Twitter to ask about early access. One of these is worth more than two hundred passive signups. They had the problem badly enough to seek you out. The moment this happens, get on a call with that person. A real conversation, not an email thread. Ask about their situation: what they've already tried, what they'd pay, what would make them switch. That's the most valuable data you'll get from the whole experiment.
The number that's lying to you
Whether 47 signups is good depends entirely on who those 47 people are.
If you posted your page to your personal Twitter, your newsletter, a Slack community where people know you, or directly to friends, most of those signups are people being supportive. That's a nice quality in people, and it's useless for validation. You can have 200 signups from your network and learn nothing about whether the market wants this.
Product Hunt is the same trap in a different package. PH drives a lot of traffic from people browsing new products on a Tuesday afternoon, not people who have a specific problem and are actively looking for a solution. PH numbers are good for gauging whether your pitch is legible to a general tech audience. They're not good for gauging demand.
The only signups worth analyzing are from cold sources: paid ads targeting a specific persona, organic search, cold outreach, or a community where you're just another member. A 5% conversion from a cold audience is genuinely interesting. A 40% conversion from your followers is noise.
This is the most common reason smoke test results look good and mean nothing. Go back through your analytics, segment by source, and look only at the cold traffic numbers. Start tracking UTM parameters if you aren't already. It's the most important thing to instrument before your next experiment. (If you want this segmentation handled automatically, EarlyProof does it, but even a basic spreadsheet works if you're consistent about it.)
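If your signup records carry the landing URL, the split takes a few lines of standard-library Python. A sketch: the `WARM_SOURCES` values are hypothetical examples of your own tagged channels, and untagged traffic is lumped in with cold here, which you may not want:

```python
from collections import Counter
from urllib.parse import parse_qs, urlparse

# Hypothetical utm_source values that count as "warm" (your own audience).
WARM_SOURCES = {"personal-twitter", "newsletter", "friends-slack"}

def utm_source(landing_url: str) -> str:
    """Pull utm_source from a signup's landing URL; '(none)' if untagged."""
    return parse_qs(urlparse(landing_url).query).get("utm_source", ["(none)"])[0]

def segment_signups(landing_urls):
    """Count signups per source and split them into warm vs cold totals."""
    by_source = Counter(utm_source(u) for u in landing_urls)
    warm = sum(n for src, n in by_source.items() if src in WARM_SOURCES)
    return by_source, warm, sum(by_source.values()) - warm
```

One caveat on the design: `(none)` signups are really unknowns, not cold, so a large untagged bucket is itself a sign your tagging needs fixing before the cold numbers mean anything.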
How to make the call
There's no number that definitively means "build it." But there are practical lines.
Under 2% email conversion from cold traffic means the page is broken. Something is wrong with the headline, the audience, or the offer itself. Running more traffic won't improve the signal. Fix the page first.
Decent email conversion with no checkout clicks means people are interested in the problem but not sold on your product as the answer. The work is on the offer: what you're promising, at what price, for whom specifically. This is a different problem than "nobody cares about this."
The real question: do you have people who don't know you, who found your page through a cold channel, who did something beyond passively signing up? If yes (especially if anyone reached out unprompted), you have signal worth following. If no, run more experiments before building.
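Those lines can be collapsed into one decision sketch. It takes cold-traffic counts only (per the previous section), and the thresholds are this article's rough lines, not laws:

```python
def next_step(cold_visitors: int, cold_signups: int,
              checkout_clicks: int, unprompted_contacts: int) -> str:
    """Apply the rough decision lines above, using cold-traffic numbers only."""
    if cold_visitors == 0:
        return "no cold traffic yet: nothing here is signal"
    if cold_signups / cold_visitors < 0.02:
        # Under 2% from cold traffic: fix the page before buying more traffic.
        return "fix the page: headline, audience, or offer"
    if checkout_clicks == 0 and unprompted_contacts == 0:
        # Interest in the problem, but nothing beyond passive signups.
        return "interest in the problem, not the product: rework the offer, then re-test"
    return "signal worth following: get the checkout clickers on calls"
```

For example, `next_step(1000, 10, 0, 0)` hits the broken-page branch (1% conversion), while the spreadsheet from the opening, if its traffic were genuinely cold, would land in the last branch.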
When someone reaches out unprompted, call them this week. They're telling you they have the problem badly enough to seek you out. Every question they ask and every current workaround they describe is product roadmap information. Founders who build things people actually use tend to be the ones who made these calls and let the answers change their plans.
What the numbers can't tell you
Smoke test results are a snapshot of intent at a specific moment. They expire. The follow-up conversation is what closes the gap between "this looks interesting right now" and "yes, I will pay for this."
Most founders treat the data as a verdict. The better way to read it is as a shortlist: here are the people worth talking to. The conversion rates tell you whether you have enough signal to justify the conversations. The conversations tell you what to build.
When 47 signups is the right answer
Back to your spreadsheet.
340 visitors, 47 email signups, 12 checkout clicks, one unprompted reply asking when it launches.
That's a 13.8% email conversion (47 of 340), comfortably above the 2% floor, not below it. And 12 of 47 signups (25%) clicking into the purchase flow is above the 10–20% band that signals real intent. And that one reply? Call them today.
The work now isn't to run more traffic. It's to segment those 340 visitors by source and confirm both rates hold up on the cold slice, because numbers this strong often mean warm traffic snuck in. And then call every single person who clicked checkout and find out what they thought they were buying.
That's the experiment. Not "do I have enough signups." The data is the starting point, not the finish line.