Where the team was
The team was VC-backed. The product worked. Support teams at startups were using the shared inbox to handle email and Slack from one place. The pricing was competitive—$1,200/year versus Intercom's $9,600. The value proposition was clear.
But they couldn't prove which marketing channels actually drove revenue.
Google Analytics showed purchases coming from "organic homepage traffic." The outbound campaigns showed decent engagement—open rates, clicks, brand searches spiking. But when they checked the purchase attribution data, none of the emails from the outbound list matched the customer emails in the sales sheet.
The conclusion seemed obvious: outbound wasn't working. All the real customers were coming organically.
So in October, they paused the outbound campaigns.
Purchases crashed.
The misalignment
The attribution data created a paradox. Analytics tagged every purchase as "organic." But when outbound ran in September, purchases happened. When it stopped in October, they dried up. When it resumed in November, purchases returned.
The pattern was undeniable. Outbound correlated with revenue. But the data said otherwise.
"All purchases came from organic homepage traffic, none from outbound campaigns. If purchases were indeed coming from organic, we would not see massive month-to-month swings for the simple reason that organic website traffic remained stable from August to December."
The team knew something was wrong with the attribution model, but they didn't know what.
Meanwhile, the messaging was generic. "AI-powered shared inbox" positioned them in a crowded space with unclear differentiation. The website spoke to seven different industries, trying to be everything to everyone. The homepage hero section confused existing users from their consumer email product—74% of traffic—who didn't understand what "Shared Inbox" was or why they needed it.
And the conversion funnel was hemorrhaging leads. 62% of people who clicked "Start trial" never logged in. Of those who did log in, 91.4% abandoned during an eight-screen onboarding flow. Of those who made it through onboarding and purchased, 50-55% churned—2-3x worse than healthy B2B SaaS benchmarks of 15-25%.
The symptoms were everywhere. The root cause wasn't.
The research
The first hypothesis was attribution. Outbound campaigns weren't driving direct click-to-purchase conversions. They were driving brand awareness that appeared as "organic" traffic in analytics.
Pattern: Someone receives an outbound email. They don't click immediately. But the message plants a seed. Two days later, they remember the product name and Google it. They land on the homepage organically. They sign up. Analytics tags it as "organic." The outbound campaign gets zero credit.
This explained the paradox. It also explained why pausing outbound in October crashed purchases. The awareness stopped. The delayed organic searches stopped. Revenue stopped.
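That delayed pattern is why exact-email matching against the sales sheet found nothing: the recipient often signs up later with a different address at the same company. One hedged way to surface the indirect attribution is to match on company domain within a time window. This is an illustrative sketch, not the team's actual pipeline; record fields and the 14-day window are assumptions.

```python
from datetime import datetime, timedelta

ATTRIBUTION_WINDOW = timedelta(days=14)  # assumed window, tune to your sales cycle

def domain(email: str) -> str:
    return email.split("@", 1)[1].lower()

def attribute_indirect(sends, signups, window=ATTRIBUTION_WINDOW):
    """Re-tag 'organic' signups whose company domain received an outbound
    email within the window. Exact-email matching fails when the recipient
    signs up with a different address, so fall back to the domain."""
    sent_by_domain = {}
    for s in sends:
        d = domain(s["email"])
        # keep the earliest send per domain
        sent_by_domain[d] = min(s["sent_at"], sent_by_domain.get(d, s["sent_at"]))
    out = []
    for su in signups:
        sent_at = sent_by_domain.get(domain(su["email"]))
        if sent_at and timedelta(0) <= su["signed_up_at"] - sent_at <= window:
            out.append({**su, "channel": "outbound (indirect)"})
        else:
            out.append({**su, "channel": "organic"})
    return out

sends = [{"email": "ana@acme.io", "sent_at": datetime(2024, 9, 3)}]
signups = [
    {"email": "founder@acme.io", "signed_up_at": datetime(2024, 9, 5)},  # same company
    {"email": "zoe@zen.co", "signed_up_at": datetime(2024, 9, 6)},       # no outbound touch
]
for row in attribute_indirect(sends, signups):
    print(row["email"], "->", row["channel"])
```

A signup from an emailed company's domain within the window gets re-tagged as indirect outbound; everything else stays organic.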
But awareness campaigns only work if the messaging resonates. Generic positioning—"AI-powered shared inbox"—wasn't going to stick in someone's memory.
The second hypothesis was messaging. To test it, we needed to understand what actually drove purchase decisions.
50+ customer interviews with support leaders at startups revealed three consistent patterns.
First, cost was the primary driver. Not a factor. The primary driver. 40% of interview mentions. Teams were paying $9,600-$15,000/year for Zendesk or Intercom. Pricing increases were forcing them to evaluate alternatives. When asked what would make them switch, the answer was consistent: "If you're telling me this saves X amount of money because of XYZ, then there is a conversation."
Second, the pain wasn't AI accuracy or feature sophistication. It was simpler. Support teams were drowning in repetitive work, switching between multiple tools, and losing context across Slack, email, and CRM. The language was consistent: "Things get lost." "Time consuming." "Switching between tools." "It's really painful to use."
Third, the desire language revealed what actually mattered: "Save time and cost." "Everything in one place." "Just works." "Simple setup." "Don't have to think about it."
The positioning was upside down. The website led with AI capabilities and vague productivity promises. Customers wanted cost savings and tool consolidation.
The validation
We tested the messaging hypothesis with four outbound cohorts. Different subject lines, different positioning angles, same ICP.
Cohort A led with cost: "Your support tool costs $9,600/year. Your team still answers the same questions 500 times."
Results: 61% open rate (vs 40% target), 29% click rate, +127% branded search spike.
The cost-first messaging worked. People remembered it. They searched for it later. The indirect attribution model was validated.
Next test: website messaging. We rewrote the homepage to match the customer language from interviews. Cost-first hero section. ICP-specific pain points. Honest differentiation—"Email and Slack. That's it."—instead of trying to compete on enterprise features.
A/B test results: +34% average time on site, more pages viewed, better engagement across all metrics.
The positioning worked. The messaging resonated. But purchases still lagged.
The third test was audience fit. Maybe the right message was reaching the wrong people.
Dreamdata analysis: 31.2% of website visitors matched ICP criteria (company size 5-250 employees + industry or geography). 11.3% were strong ICP fits (all three criteria). Both numbers were healthy for a startup at this stage.
The audience was right. The message was right. But something was still breaking.
The final hypothesis was friction. Even with validated messaging and qualified traffic, 62% abandoned between clicking "Start trial" and logging in. Of those who did log in, 91.4% abandoned during onboarding.
The signup flow had eight screens. The first screen alone had seven required fields. The onboarding asked for information the product didn't need yet. Setup felt like work before value was visible.
We redesigned it: three screens, three fields, broken steps removed. Show value faster.
Expected impact: 8.6% → 36.5% activation rate.
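As a sanity check on those numbers, the funnel math compounds like this (the 36.5% target is the team's projection, not derived here; the assumption that login rate holds constant is mine):

```python
# Observed funnel: of everyone who clicks "Start trial"...
login_rate = 1 - 0.62          # 62% never log in
onboard_complete = 1 - 0.914   # 91.4% of logins abandon the 8-screen flow

end_to_end = login_rate * onboard_complete
print(f"{end_to_end:.1%} of trial clicks ever activate")  # ~3.3%

# If the redesign lifts onboarding completion from 8.6% to the projected
# 36.5% while the login rate holds, end-to-end activation scales with it:
projected = login_rate * 0.365
print(f"projected end-to-end activation: {projected:.1%}")  # ~13.9%
```

In other words, even a perfect onboarding fix is capped by the 62% pre-login leak, which is why both stages mattered.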
By December, the systematic testing had validated every hypothesis:
- Messaging works (61% OR, +127% branded search)
- Positioning works (+34% engagement lift)
- Audience fit is healthy (31% ICP match)
- Friction is the conversion blocker (62% + 91% abandonment)
What was delivered
Research & Validation
- 50+ customer discovery interviews (support leaders at 5-100 person startups)
- Voice of Customer language mapping (pain patterns, desire language, buying triggers)
- ICP validation through Dreamdata (31.2% match, 11.3% strong fit)
- Competitive positioning research (Zendesk, Intercom, Front pricing analysis)
- Attribution model diagnosis (dark social pattern identified)
Strategy & Architecture
- Customer discovery analysis (552 lines synthesizing 50+ interviews)
- Messaging strategy transformation (generic → cost-first positioning)
- 4-phase outbound campaign design with ROI validation gates
- Dual attribution tracking infrastructure (direct + indirect/branded search)
- Tone of voice guide (603 lines based on customer language patterns)
Conversion Optimization
- Homepage messaging rewrite (VoC-driven, A/B tested)
- Funnel leak diagnosis (62% signup abandonment, 91% onboarding abandonment)
- Onboarding flow redesign (8 screens → 3 screens, 7 fields → 3 fields)
- Email sequence redesign (abandoned trial recovery, activation nurture)
- ROI calculator lead magnet with auto-disqualification logic
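The ROI calculator's auto-disqualification logic can be sketched like this. The pricing comes from the $1,200 vs $9,600 comparison earlier; the thresholds and function shape are illustrative assumptions, not the shipped implementation.

```python
OUR_PRICE = 1_200  # per year, from the pricing comparison above

def roi_estimate(current_tool_cost: float, team_size: int):
    """Return estimated annual savings, or None if the lead is disqualified.

    Auto-disqualification: leads whose savings are negligible (or negative)
    are not worth a sales conversation. Thresholds here are illustrative.
    """
    if team_size < 2:  # a shared inbox needs a team
        return None
    savings = current_tool_cost - OUR_PRICE
    if savings < 500:  # not enough savings to motivate a switch
        return None
    return savings

print(roi_estimate(9_600, team_size=8))  # Intercom-priced team: 8400
print(roi_estimate(1_000, team_size=8))  # already cheaper: None
```

The point of the gate is the interview finding above: cost savings was the conversation starter, so a lead with no savings to show has no conversation to start.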
Go-to-Market Execution
- Outbound cohort testing (validated 61% OR, 29% CTR, +127% branded search)
- Website A/B test (+34% time on site, better engagement)
- Attribution tracking setup (Mixpanel events, UTM persistence, source tracking)
- Campaign performance framework (decision gates, ROI thresholds, segment testing)
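One way those decision gates might look, assuming each cohort reports open rate, click rate, and branded-search lift. The 40% open-rate floor comes from the target mentioned above; the other two floors are assumed for illustration.

```python
# Illustrative gate floors: open rate target from the campaign above,
# click and branded-search floors assumed.
GATES = {"open_rate": 0.40, "click_rate": 0.10, "branded_search_lift": 0.25}

def passes_gates(cohort: dict) -> bool:
    """A cohort graduates to the next campaign phase only if it clears
    every gate; missing metrics count as failures."""
    return all(cohort.get(metric, 0) >= floor for metric, floor in GATES.items())

# Cohort A's reported numbers: 61% OR, 29% CTR, +127% branded search
cohort_a = {"open_rate": 0.61, "click_rate": 0.29, "branded_search_lift": 1.27}
print(passes_gates(cohort_a))  # True: clears all three gates
```

Encoding the gates as data rather than prose is what makes "pause, iterate, or scale" a mechanical decision instead of a debate.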
Nothing was invented
The customers already knew what they needed. They said it in interviews: "Save time and cost." They wanted cheaper tools that consolidated their stack. They were drowning in Zendesk invoices and switching between five different apps.
The data already showed the pattern. Outbound drove purchases. The attribution was just invisible because traditional analytics don't track delayed conversions or branded searches triggered by campaigns.
The messaging already existed in customer language. "Things get lost." "It's really painful to use." "Getting very expensive." We didn't write new copy. We used their words.
The friction was obvious once measured. 62% abandonment between signup and login. 91% abandonment during onboarding. Eight screens asking for information before showing value. The fixes weren't creative. They were systematic.
The work wasn't inventing strategy. It was making the invisible visible—proving which channels actually drove revenue, extracting positioning from customer interviews, and removing friction the data revealed.
The foundation was already there. It just needed to be organized, validated, and executed.