Product-Led Sales Motion Pack
1) Context + Goal Snapshot
Product: B2B SaaS platform serving data teams (analytics, data engineering, or data infrastructure tool with a self-serve entry point).
Model: 14-day free trial; users can convert to paid self-serve at the end of the trial.
ICP / segment in scope: Data teams (data engineers, analytics engineers, data analysts, data team leads) at companies with 200-2,000 employees (mid-market).
User vs buyer: Users are individual data practitioners who sign up and evaluate the tool. Buyers are typically data team leads, engineering managers, or heads of data who approve paid plans and hold budget. There is a user/buyer mismatch -- the person who activates may not be the person who signs the contract.
Primary objective: Improve trial-to-paid conversion rate by layering a targeted, signal-driven sales assist on top of the existing self-serve trial funnel.
Baseline metrics (current):
- Activation rate: not provided (assumed ~30-40% of signups reach the activation event)
- Trial-to-paid conversion: not provided (assumed ~5-10%, typical for B2B SaaS with 14-day trial)
- Expansion rate: not in scope for this pilot
Target outcome (by when): Increase trial-to-paid conversion by 20-30% relative (e.g., from an 8% baseline to roughly 9.6-10.4%) within the 4-week pilot for the target segment, without increasing spam complaints or degrading retention/NPS.
Sales capacity + SLA: 2 SDRs; SLA target is first touch within 2 hours of PQL trigger during business hours (9 AM - 6 PM local time). Outside business hours, SLA is first touch within 2 hours of next business day start.
Tooling reality: Assumed -- product analytics (e.g., Mixpanel, Amplitude, or Segment), CRM (e.g., HubSpot or Salesforce), email automation capability, Slack for internal alerts. Specific tooling to be confirmed.
Constraints:
- Do not spam users. Outreach must be helpful and non-intrusive.
- Personalization must reference only user-facing, approved signals (no surveillance language).
- The self-serve conversion path must remain fully intact -- PLS is a layer, not a gate.
- Privacy/compliance: use only data the user has provided or generated through normal product usage. No third-party scraping.
Assumptions & unknowns:
| Assumption | Confidence | Validation needed |
|---|---|---|
| Activation event is well-defined (e.g., "ran first query," "connected first data source," or equivalent) | Medium | Confirm exact event name and definition with Product |
| User-to-account mapping is possible via email domain or workspace | Medium | Confirm identity resolution method with Engineering |
| Baseline trial-to-paid conversion is ~5-10% | Low | Pull actual conversion data from analytics/billing |
| SDRs have CRM access and can log dispositions | High | Confirm CRM workflow with RevOps |
| No regulatory constraints beyond standard B2B SaaS (no HIPAA, no FedRAMP in scope for pilot) | Medium | Confirm with legal/compliance |
Validation plan (next 1-2 weeks):
- Pull actual trial-to-paid conversion rate (overall and by company size segment) from analytics.
- Confirm activation event definition and that it is instrumented.
- Confirm user-to-account mapping method (email domain? workspace ID? CRM enrichment?).
- Confirm SDR tooling (CRM, email, Slack channel for alerts).
- Review with legal/compliance that usage-signal-driven outreach is approved.
Readiness verdict: READY (with noted assumptions). The company has a defined trial, known usage signals (activation, invites, integration setup, billing page views), an ICP, and sales capacity. Proceed with explicit assumptions; validate in parallel during Week 1 of the pilot.
2) PLS Funnel + Ownership Map
Funnel Design
The product-led sales layer intervenes at the "Qualified (PQL)" stage for high-signal trials, while all other trials continue on the standard self-serve path. Sales does not intercept the self-serve funnel -- it runs in parallel for qualified accounts only.
| Stage | Entry Criteria | Exit Criteria | Primary Owner | Supporting Owners | Intervention? | SLA | Notes / Guardrails |
|---|---|---|---|---|---|---|---|
| Acquisition / Signup | Visitor signs up for 14-day trial | Account created, welcome email sent | Marketing / Product | -- | No | -- | No sales touch at signup. Let users explore. |
| Activation / First Value | Account created | User completes activation event (e.g., first query run, first data source connected) | Product | CS (in-app guidance) | No | -- | Focus on product-led activation (tooltips, onboarding checklist). No sales outreach here. |
| Sustained Usage | Activation event completed | User demonstrates depth + breadth signals (multi-day usage, invites, integrations) | Product | CS | No | -- | Self-serve path continues. Product monitors signals passively. |
| Qualified (PQL) | PQL score threshold met (see Section 3) | SDR triages and accepts/rejects | Product / RevOps | Sales | Yes -- trigger | Alert within 15 min of qualification | Only accounts matching ICP + signal threshold are routed. Volume cap: max 5 new PQLs/SDR/day. |
| Sales Assist | SDR accepts PQL, initiates outreach | Meeting set OR disposition logged (nurture/disqualified/no response) | Sales (SDR) | Product / CS | Yes -- active | First touch within 2 hours | SDR sees usage context dashboard. One clear ask per outreach. Max 3-touch sequence per PQL. |
| Purchase / Upgrade | Meeting set or buyer engaged | Trial converts to paid plan | Sales (SDR hands to AE if deal > threshold) or Self-serve | Finance / CS | Yes (if deal >$X) | -- | Self-serve conversion remains available at all times. No gating. |
| Onboarding (Paid) | Paid plan activated | Customer reaches "time-to-first-value" on paid plan | CS / Product | Sales (warm handoff) | Maybe | -- | SDR does warm handoff to CS. CS owns onboarding. |
Guardrails (Protecting the Self-Serve Path)
- No sales gating: Users can always convert self-serve via in-app billing. Sales outreach is additive, never blocking.
- Volume cap: Each SDR handles a maximum of 5 new PQLs per day to ensure quality outreach and prevent overwhelm.
- Cooling period: If a user converts self-serve during the outreach sequence, the sequence stops automatically.
- Opt-out respected: If a user replies "not interested" or unsubscribes, they are removed from sales outreach for the remainder of this trial period.
- Small accounts excluded: Accounts with fewer than 200 employees (below ICP) are not routed to sales regardless of signals.
3) PQL Definition + Signal Spec
Qualified Unit: PQL (Product-Qualified Lead)
Why PQL (not PQA): The primary objective is trial-to-paid conversion, which is user-initiated in a 14-day window. The buyer is typically the user's manager, but the user is the champion and initial evaluator. Individual user behavior (depth + breadth + intent) is the strongest predictor of conversion likelihood. Account-level PQA scoring is appropriate for expansion but is out of scope for this pilot.
Definition (1-2 sentences): A PQL is a trial user at an ICP-fit company (200-2,000 employees) who has completed the activation event AND demonstrated at least two additional high-intent signals (invites, integration setup, or billing page view) within the first 10 days of their trial.
In-scope segment: Data teams at companies with 200-2,000 employees, currently on a 14-day free trial.
Required signals (must have ALL of the following):
- ICP fit: Company size is 200-2,000 employees (from enrichment or signup data).
- Activation event completed: User has completed the defined activation milestone (e.g., ran first query, connected first data source).
- At least 2 of 4 intent/depth signals met (see Signal Spec table below):
- Invited 1+ teammates
- Set up 1+ integration
- Viewed billing/pricing page 2+ times
- Used product on 3+ distinct days
Thresholds:
- Signals must occur within the first 10 days of the trial (leaving ~4 days for sales to engage before trial expires).
- Minimum 2 of 4 intent/depth signals required (in addition to activation + ICP fit).
Exclusions (NOT qualified if):
- Company has fewer than 200 employees or more than 2,000 employees
- User signed up with a personal email domain (gmail.com, outlook.com, etc.) and company cannot be resolved
- User has already converted to a paid plan (route to expansion, not conversion)
- User has previously been contacted by sales in this trial and dispositioned as "disqualified" or "no response" (do not re-route)
- Trial has fewer than 3 days remaining AND no billing page views (insufficient time for sales cycle)
- Account is a known competitor or internal test account
Notes on false positives / anti-gaming:
- Bot/test accounts: Exclude accounts with signup email matching known test patterns (e.g., test@, demo@, +test@).
- Single-session spikes: Require signals across 2+ distinct days to filter out one-time explorers.
- Billing page views alone are not sufficient: Must be combined with activation + at least one other signal. A billing page view without activation often indicates a price-checker, not a buyer.
- Weekly threshold review: During the pilot, review PQL volume and acceptance rate weekly. If acceptance rate drops below 50%, tighten thresholds.
Example qualified case:
Sarah, a data engineer at Acme Corp (800 employees), signed up for a trial on Day 1. By Day 5, she connected a Snowflake data source (activation), invited 2 teammates (breadth), and set up a dbt integration (integration). She has not viewed the billing page yet, but she meets activation + 2/4 intent signals. She is a PQL.
Example NOT qualified case:
Tom, a data analyst at a 50-person startup, signed up on Day 1 and ran a query (activation). He has not invited anyone, set up integrations, or viewed billing. He does not meet ICP fit (< 200 employees) and has only 0/4 intent signals. He is NOT a PQL -- he continues on the self-serve path.
4) Signal Spec + Scoring Model
Signal Catalog
| Signal | Type | How Measured (Event + Properties) | Threshold | Weight | Why It Indicates Value/Intent | False-Positive Risks | Data Source |
|---|---|---|---|---|---|---|---|
| Activation event completed | Aha | activation_complete event (e.g., first_query_run or first_source_connected) | 1+ occurrence | Required gate (no score without this) | User has experienced first value; without this, outreach is premature | Low -- well-defined event | Product analytics |
| ICP firmographic fit | Fit | Company size from enrichment (Clearbit, signup form, or email domain lookup) | 200-2,000 employees | Required gate | Mid-market companies are the target segment; outside this range, conversion economics differ | Medium -- enrichment data may be stale or missing | Enrichment tool / CRM |
| Teammate invites sent | Breadth | invite_sent event; count distinct invitees | >= 1 invite | 3 points | Inviting others signals organizational buy-in and multi-user intent | Low -- hard to game accidentally | Product analytics |
| Integration setup | Intent / Depth | integration_connected event; integration type property | >= 1 integration connected | 3 points | Integrations indicate serious evaluation (user is embedding the tool into their stack) | Low -- meaningful setup effort | Product analytics |
| Billing/pricing page views | Intent | page_view event where page = /billing or /pricing | >= 2 views on 2+ distinct days | 2 points | Repeated billing page visits signal purchase consideration | Medium -- could be price-checking without intent; requires 2+ views to filter | Product analytics |
| Multi-day usage | Depth | session_start event; count distinct active days | >= 3 distinct days | 2 points | Returning users are more likely to convert than one-time explorers | Low -- straightforward metric | Product analytics |
Scoring Rules
- Score window: Rolling 10-day window from signup date (Days 1-10 of trial).
- Required gates (must pass before scoring):
- Activation event completed = YES
- ICP firmographic fit = YES
- Point scoring (after gates pass):
- Teammate invites (>= 1): +3 points
- Integration setup (>= 1): +3 points
- Billing page views (>= 2 on 2+ days): +2 points
- Multi-day usage (>= 3 days): +2 points
- Maximum possible score: 10 points
- Minimum score to qualify as PQL: 4 points (i.e., at least 2 of 4 intent/depth signals)
- Priority tiers:
- High priority (7-10 points): 3-4 signals met. Route immediately. SDR contacts within 2 hours.
- Medium priority (4-6 points): 2 signals met. Route within same business day. SDR contacts within 4 hours.
- Below threshold (< 4 points): Do not route. User stays on self-serve path. Re-evaluate daily as new signals arrive.
- Decay/recency rule: Signals older than 10 days from signup are not counted. If a trial user has not met the threshold by Day 10, they are not routed (the remaining 4 days of the trial are too short for a sales cycle; they continue self-serve).
- Triage rule for ambiguous signals: If a PQL is at exactly 4 points but all signals occurred on a single day, hold for 24 hours. If no additional activity, downgrade to "monitor" (do not route). This filters one-time explorers.
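The gates, point weights, and priority tiers above can be sketched as a small scoring function. This is an illustrative sketch, not production code: the `TrialSignals` fields and the `score_pql` name are hypothetical stand-ins for whatever your analytics pipeline actually exposes.

```python
from dataclasses import dataclass

# Hypothetical per-user signal summary for Days 1-10 of the trial.
# Field names are illustrative; map them to your real analytics events.
@dataclass
class TrialSignals:
    company_size: int       # from enrichment or signup form
    activated: bool         # activation event completed
    invites_sent: int       # distinct invitees
    integrations: int       # integrations connected
    billing_views: int      # billing/pricing page views
    billing_view_days: int  # distinct days with a billing view
    active_days: int        # distinct days with a session

def score_pql(s: TrialSignals) -> dict:
    """Apply the Section 4 gates, weights, and priority tiers."""
    # Required gates: no score without activation + ICP fit (200-2,000 employees).
    if not s.activated or not (200 <= s.company_size <= 2000):
        return {"qualified": False, "score": 0, "tier": None}

    score = 0
    score += 3 if s.invites_sent >= 1 else 0   # breadth
    score += 3 if s.integrations >= 1 else 0   # intent/depth
    score += 2 if s.billing_views >= 2 and s.billing_view_days >= 2 else 0  # intent
    score += 2 if s.active_days >= 3 else 0    # depth

    if score >= 7:
        tier = "high"    # route immediately; 2-hour SLA
    elif score >= 4:
        tier = "medium"  # route same business day; 4-hour SLA
    else:
        tier = None      # below threshold: stays self-serve
    return {"qualified": tier is not None, "score": score, "tier": tier}
```

Running the Section 3 example through this sketch: Sarah (800 employees, activated, 2 invites, 1 integration) scores 6 and lands in the medium tier; Tom fails the ICP gate and is not scored at all.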
5) Routing + Workflow Spec
Alert Delivery
Where alerts go:
- Primary: Dedicated Slack channel `#pql-alerts` -- an automated message with PQL details is posted when a user crosses the threshold.
- Secondary: CRM task created automatically (assigned to the designated SDR) with all signal context.
- Escalation: If no SDR claims the PQL within 1 hour, an `@here` ping is sent in Slack. If no action within 2 hours, the alert escalates to the Sales Manager.
Assignment Logic
- Round-robin between the 2 SDRs, with load balancing (if one SDR has 5 active PQLs and the other has 2, the next PQL goes to the lighter-loaded SDR).
- Named account override: If an account is already in CRM with an assigned owner, route to that owner instead (prevents duplicate outreach).
- Capacity cap: Max 5 new PQLs per SDR per day. If the cap is hit, additional PQLs queue for the next business day (this is a pilot guardrail).
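The three assignment rules above compose into a short routing function. This is a sketch under stated assumptions: the SDR identifiers and the `crm_owners` lookup are hypothetical, and real implementations would live in a CRM workflow or RevOps automation tool.

```python
DAILY_CAP = 5  # pilot guardrail: max new PQLs per SDR per day

def assign_pql(account, sdr_load, crm_owners):
    """Return who should work this PQL, or None to queue it for tomorrow.

    sdr_load:   dict of {sdr_name: new_pqls_taken_today} for the 2 SDRs
    crm_owners: dict of {account_name: existing_owner} from the CRM
    """
    # Named-account override: route to the existing owner to
    # prevent duplicate outreach.
    if account in crm_owners:
        return crm_owners[account]

    # Load-balanced round-robin: pick the SDR with the fewest new PQLs today.
    sdr = min(sdr_load, key=sdr_load.get)
    if sdr_load[sdr] >= DAILY_CAP:
        return None  # both SDRs at cap: queue for the next business day
    sdr_load[sdr] += 1
    return sdr
```

For example, if one SDR has taken 5 new PQLs today and the other 2, the next unowned PQL goes to the lighter-loaded SDR; once both hit the cap, PQLs queue rather than overload the team.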
SLA
| Priority | Time-to-First-Touch Target | Escalation if Missed |
|---|---|---|
| High (7-10 points) | 2 hours (business hours) | Alert Sales Manager after 2 hours |
| Medium (4-6 points) | 4 hours (business hours) | Alert Sales Manager after 4 hours |
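The business-hours clock behind these SLAs (first touch within N hours during 9 AM - 6 PM; off-hours triggers due N hours after the next business day opens, per Section 1) can be sketched as a deadline calculator. This is an illustrative sketch assuming a fixed 9-6 local schedule with weekends skipped and no holiday calendar.

```python
from datetime import datetime, timedelta

OPEN, CLOSE = 9, 18  # business hours, local time (9 AM - 6 PM)

def first_touch_deadline(triggered, sla_hours):
    """Compute the first-touch deadline; the SLA clock runs only in business hours."""
    t = triggered
    # Roll an off-hours trigger forward to the next business-hours opening.
    if t.hour >= CLOSE:
        t = (t + timedelta(days=1)).replace(hour=OPEN, minute=0, second=0, microsecond=0)
    if t.hour < OPEN:
        t = t.replace(hour=OPEN, minute=0, second=0, microsecond=0)
    while t.weekday() >= 5:  # Saturday=5, Sunday=6
        t = (t + timedelta(days=1)).replace(hour=OPEN, minute=0, second=0, microsecond=0)

    # Spend the SLA budget across business hours, carrying the
    # remainder into the next business day when the office closes.
    remaining = timedelta(hours=sla_hours)
    while True:
        close_today = t.replace(hour=CLOSE, minute=0, second=0, microsecond=0)
        if t + remaining <= close_today:
            return t + remaining
        remaining -= close_today - t
        t = (t + timedelta(days=1)).replace(hour=OPEN, minute=0, second=0, microsecond=0)
        while t.weekday() >= 5:
            t = (t + timedelta(days=1)).replace(hour=OPEN, minute=0, second=0, microsecond=0)
```

Under these assumptions, a high-priority PQL firing at 5 PM Tuesday is due 10 AM Wednesday (one business hour Tuesday plus one Wednesday), and one firing Friday evening is due 11 AM Monday.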
Required Rep Context (What the SDR Sees)
When a PQL alert fires, the SDR receives:
- User info: Name, email, title (if available), company name, company size
- Trial info: Signup date, days remaining in trial, plan/tier
- Signal summary: Which signals were triggered and when (e.g., "Activated Day 2; invited 2 teammates Day 4; connected Snowflake integration Day 5")
- PQL score: Total points and priority tier
- Suggested talk track: 1-line recommendation based on strongest signal (e.g., "Lead with integration setup help -- user connected Snowflake, likely building a production workflow")
- Account history: Any prior interactions, support tickets, or previous trials
Required Actions
- Claim: SDR claims the PQL in Slack (`/claim @user`) and the CRM task is updated.
- First touch: SDR sends the first outreach (email or call) using the outreach kit templates within the SLA window.
- Log: SDR logs the touch in CRM with date, channel (email/call/LinkedIn), and a brief note.
- Disposition: After the sequence completes (max 3 touches over 5 business days), SDR sets a disposition.
Disposition Taxonomy
| Disposition | Definition | Next Action |
|---|---|---|
| Meeting Set | User/buyer agreed to a meeting or demo | SDR conducts meeting or hands to AE (if deal qualifies) |
| Converted (Self-Serve) | User converted to paid during the sequence | Stop sequence; warm handoff to CS for onboarding |
| Nurture | User engaged but not ready (e.g., "ask me next quarter") | Add to nurture sequence; re-evaluate if new signals appear |
| No Response | No reply after 3 touches | Close sequence; do not re-route unless new strong signals appear |
| Disqualified | Bad fit, wrong person, not evaluating seriously | Log reason; exclude from future PQL routing for this trial |
| Routed to CS | User needs technical help, not sales | Create CS ticket; SDR hands off |
Product-Sales Feedback Loop
Weekly review (30 minutes): Every Friday at 10 AM.
Attendees: Sales Manager, both SDRs, Product Manager (or RevOps lead).
Agenda:
- Volume check (5 min): How many PQLs fired this week? How many per SDR per day on average?
- Acceptance rate (5 min): What % of PQLs were accepted (Meeting Set + Converted) vs rejected (Disqualified + No Response)? Target: >= 50% acceptance rate.
- Signal review (10 min): Which signals are the strongest predictors? Any signals that are noisy (high false-positive rate)? Propose threshold adjustments.
- Outreach review (5 min): Which email templates are getting replies? Any patterns in objections? Update templates if needed.
- Action items (5 min): Threshold changes, new signals to add, template updates, capacity adjustments.
Escalation triggers:
- PQL volume > 10/SDR/day for 3+ consecutive days -> tighten thresholds or add capacity.
- Acceptance rate < 30% for 2 consecutive weeks -> re-evaluate signal definitions.
- SDR SLA compliance < 80% -> investigate capacity or process issues.
6) Usage-Triggered Outreach Kit
Guiding Principles
- Be helpful, not creepy. Reference the value the user is trying to unlock, not the data you observed.
- One clear ask per message. Do not stack CTAs.
- Low-friction next step. Offer a 10-minute call, a setup checklist, or office hours -- not a 60-minute demo.
- Max 3 touches. If no response after 3 touches over 5 business days, stop. Respect silence.
- Approved signals only. Reference integrations, team setup, and goals -- never mention login counts, session durations, or internal scoring.
Email 1 -- Initial Outreach (Day 0: PQL trigger day)
Use when: PQL threshold is met. Tailor the lead-in to the strongest signal.
Variant A: Integration Signal Strongest
Subject: Quick help connecting [Integration Name] to [Product]
Body:
Hi [First Name],
I work with data teams getting set up on [Product]. Looks like you're connecting [Integration Name] -- nice choice for [common use case, e.g., "syncing your warehouse data"].
A few teams your size have found it helpful to walk through the setup together. I can share a quick checklist and hop on a 10-minute call to make sure everything's wired up right.
Would [Day/Time Option A] or [Day/Time Option B] work?
Best, [SDR Name]
Variant B: Team Invite Signal Strongest
Subject: Getting your team up and running on [Product]
Body:
Hi [First Name],
I noticed you're bringing teammates into [Product] -- that's usually a sign things are clicking.
When teams start collaborating, a few quick setup steps (permissions, shared workspaces, naming conventions) can save a lot of headaches later. Happy to share what's worked for similar data teams and answer any questions.
Open to a quick 10-minute call this week?
Best, [SDR Name]
Variant C: Billing Page Signal Strongest
Subject: Questions about plans for [Product]?
Body:
Hi [First Name],
I help data teams figure out the right plan on [Product]. If you're evaluating options for your team, I can walk you through what similar companies your size typically choose and help you avoid overpaying.
Want to grab 10 minutes this week?
Best, [SDR Name]
Email 2 -- Follow-Up (Day 2-3: If no response to Email 1)
Subject: Re: [Original subject line]
Body:
Hi [First Name],
Following up -- wanted to make sure this didn't get buried.
I put together a short setup guide for data teams on [Product] that covers [relevant topic based on their signals: e.g., "integration best practices" / "team collaboration setup" / "choosing the right plan"].
Here's the link: [Link to relevant help doc or resource]
If you'd rather just chat, I'm around [Day/Time]. No pressure either way.
Best, [SDR Name]
Note: Email 2 leads with value (a resource) rather than another meeting request. This respects the user's time and provides help regardless of whether they reply.
Email 3 -- Final Touch (Day 5: If no response to Emails 1-2)
Subject: One last thing before your trial wraps up
Body:
Hi [First Name],
Your trial is winding down in [X days]. Totally fine if the timing isn't right -- [Product] will be here when you need it.
If you do want to keep going, I can help you:
- Lock in the right plan for your team size
- Transfer any work you've done in the trial
- Answer any last questions
Just reply here or grab a slot: [Calendly/scheduling link]
Best, [SDR Name]
Note: Email 3 is the last touch. If no response, disposition as "No Response" and stop. Do not send a 4th email.
Call Opener + Discovery Prompts (For Meetings / Live Conversations)
Opener:
"Thanks for taking the time, [First Name]. I've been working with data teams getting set up on [Product] and wanted to see what you're trying to accomplish and if there's anything I can help with. What's the main thing you're hoping to get out of the trial?"
Discovery prompts (choose 2-3 based on conversation flow):
- "What outcome are you trying to get to in the next 2-4 weeks with your data stack?"
- "Who else on your team needs to be involved for this to work long-term?"
- "What does the current process look like without [Product], and what's the pain point?"
- "If we could help you [unlock specific value based on their signals], what would a good next step look like on your end?"
- "Is there a timeline or event driving this evaluation (e.g., a migration, a new data project, budget cycle)?"
- "What would you need to see to feel confident recommending [Product] to your manager?"
Objection handling guidance:
| Objection | Response Approach |
|---|---|
| "We're just exploring, not ready to buy" | "Totally fair. Would it help if I shared what setup steps are worth doing now vs later, so you don't lose your work if you decide to come back?" |
| "I need to talk to my manager" | "Makes sense. Want me to put together a 1-page summary of what you've set up and the plan options? That might make the conversation easier." |
| "Too expensive" | "Let me understand what you need -- we might have a plan that fits better. What's your team size and main use case?" |
| "We're using [Competitor]" | "Got it. What made you try [Product] alongside [Competitor]? Curious what gap you're looking to fill." |
Follow-Up Rules
| Scenario | Action | Timing |
|---|---|---|
| Email 1 sent, no response | Send Email 2 | 2-3 business days after Email 1 |
| Email 2 sent, no response | Send Email 3 | 2 business days after Email 2 |
| Email 3 sent, no response | Disposition as "No Response" and stop | After 2 business days |
| User replies positively | Schedule meeting; stop email sequence | Immediately |
| User replies "not interested" | Disposition as "Nurture" or "Disqualified"; stop sequence | Immediately |
| User converts self-serve | Stop sequence; warm handoff to CS | Immediately |
| Meeting completed, next steps agreed | Log next steps in CRM; follow up per agreement | Per meeting outcome |
7) Instrumentation + Reporting Plan
Tracking Plan
| Event Name | Properties | Source | Status |
|---|---|---|---|
| `signup_complete` | user_id, email, company_name, company_size, signup_source | Product analytics | Assumed available |
| `activation_complete` | user_id, activation_type, timestamp | Product analytics | Assumed available (confirm exact event name) |
| `invite_sent` | user_id, invitee_email, timestamp | Product analytics | Assumed available |
| `integration_connected` | user_id, integration_type, timestamp | Product analytics | Assumed available |
| `page_view` (billing/pricing) | user_id, page_path, timestamp | Product analytics | Assumed available |
| `session_start` | user_id, timestamp | Product analytics | Assumed available |
| `pql_triggered` | user_id, pql_score, priority_tier, signals_met, timestamp | PQL scoring system (to build) | Needs instrumentation |
| `pql_claimed` | user_id, sdr_id, timestamp | CRM / Slack integration | Needs instrumentation |
| `outreach_sent` | user_id, sdr_id, touch_number, channel, timestamp | CRM | Needs instrumentation |
| `pql_dispositioned` | user_id, sdr_id, disposition, timestamp | CRM | Needs instrumentation |
| `trial_converted` | user_id, plan, mrr, conversion_source (self-serve vs sales-assisted), timestamp | Billing system | Assumed available (may need "source" tagging) |
Instrumentation Gaps to Close (Week 1 of Pilot)
- PQL scoring automation: Build or configure the scoring logic (can be a simple script, a Segment function, or a CRM workflow) that evaluates signals daily and fires `pql_triggered` events.
- Slack-to-CRM integration: Set up the `#pql-alerts` Slack channel with automated posting from the scoring system, and a `/claim` command or equivalent to log assignment.
- Conversion source tagging: Tag trial conversions as "self-serve" or "sales-assisted" in the billing system so you can measure PLS incremental lift.
- Disposition logging: Ensure CRM disposition fields are created and required before closing a PQL task.
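As one way to close the Slack alerting gap, the message carrying the Section 5 rep context could be assembled and posted through a standard Slack incoming webhook. The `pql` dictionary keys below are hypothetical stand-ins for your real CRM and analytics fields; only the webhook's JSON-with-`text` contract is standard Slack behavior.

```python
import json
import urllib.request

def build_pql_alert(pql):
    """Format the #pql-alerts message with the required rep context."""
    signals = "; ".join(pql["signals"])
    text = (
        f":rotating_light: New PQL ({pql['tier'].upper()}, score {pql['score']}/10)\n"
        f"*{pql['name']}* -- {pql['title']} at {pql['company']} "
        f"({pql['company_size']} employees)\n"
        f"Trial day {pql['trial_day']} of 14 ({14 - pql['trial_day']} days left)\n"
        f"Signals: {signals}\n"
        f"Suggested talk track: {pql['talk_track']}"
    )
    return {"text": text}

def post_to_slack(webhook_url, payload):
    # Slack incoming webhooks accept a JSON body with a "text" field.
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```

The same payload can be mirrored into a CRM task so the SDR sees identical context in both tools.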
Dashboard Spec
Dashboard 1: PQL Pipeline (updated daily)
| Metric | Definition | Target (Pilot) |
|---|---|---|
| PQLs triggered (total) | Count of `pql_triggered` events per day/week | 2-6 per day (manageable for 2 SDRs) |
| PQLs triggered by priority | High vs Medium tier breakdown | -- |
| PQL acceptance rate | (Meeting Set + Converted) / Total PQLs dispositioned | >= 50% |
| SLA compliance | % of PQLs with first touch within SLA | >= 80% |
| Avg. time-to-first-touch | Mean time from pql_triggered to outreach_sent (touch 1) | < 2 hours (high), < 4 hours (medium) |
Dashboard 2: Conversion Impact (updated weekly)
| Metric | Definition | Target (Pilot) |
|---|---|---|
| Trial-to-paid (PQL cohort) | Conversion rate for users who were PQL-routed | >= 15% (vs assumed 8% baseline) |
| Trial-to-paid (non-PQL cohort) | Conversion rate for users not routed (self-serve only) | Stable or improved (no degradation) |
| Sales-assisted conversion rate | PQLs that converted / PQLs contacted | >= 12% |
| Meeting set rate | Meetings set / PQLs contacted | >= 20% |
| Reply rate | Email replies / Emails sent | >= 15% |
| Avg. touches to conversion | Mean outreach touches before conversion (for converted PQLs) | Monitor (no target yet) |
| Revenue from PQL conversions | MRR from sales-assisted conversions | Monitor |
Dashboard 3: Health & Guardrails (updated weekly)
| Metric | Definition | Guardrail |
|---|---|---|
| Spam / unsubscribe complaints | Count of negative replies or unsubscribe requests from PQL outreach | < 2% of contacted PQLs |
| NPS / CSAT (PQL cohort vs control) | User satisfaction score for PQL-contacted vs non-contacted | No degradation (within 5 points) |
| SDR utilization | Active PQLs per SDR per day | 3-5 (not > 5) |
| False-positive rate | % of PQLs dispositioned as Disqualified | < 20% |
| Self-serve conversion (overall) | Overall self-serve conversion rate (not routed to sales) | No drop vs pre-pilot baseline |
8) Pilot + Scale Plan
4-Week Pilot Plan
Pilot segment: Trial signups from companies with 200-2,000 employees (mid-market ICP). All other segments continue on self-serve only (no change).
Duration: 4 weeks (28 days).
Estimated sample size: Assuming 50-100 new ICP-fit trial signups per week, ~200-400 total trial users in the pilot segment. Of these, an estimated 20-30% will meet PQL threshold (~40-120 PQLs over 4 weeks, or ~10-30 per week / 2-6 per day).
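Since this estimate drives SDR capacity planning, the arithmetic is worth pinning down in one place. All inputs below are the unvalidated assumptions stated above; integer math avoids rounding surprises.

```python
# Sanity-check the pilot volume estimate from the assumptions above.
weekly_signups = (50, 100)  # assumed ICP-fit trial signups/week (low, high)
weeks = 4
pql_rate_pct = (20, 30)     # assumed share of trials meeting PQL threshold

total_users = tuple(w * weeks for w in weekly_signups)  # trial users in segment
pqls = (total_users[0] * pql_rate_pct[0] // 100,        # low bound
        total_users[1] * pql_rate_pct[1] // 100)        # high bound
per_business_day = tuple(p // (weeks * 5) for p in pqls)  # 20 business days
print(total_users, pqls, per_business_day)  # (200, 400) (40, 120) (2, 6)
```

At the high end (6 PQLs/day) the two SDRs sit just inside the 5-new-PQLs/SDR/day cap, which is the point of the volume guardrail.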
Inclusion criteria: Trial user must match ICP (200-2,000 employees) AND meet PQL score threshold (>= 4 points) within 10 days of signup.
Week-by-Week Plan
Week 0 (Pre-Pilot: Setup)
Goal: Instrument, configure, and enable.
| Task | Owner | Done? |
|---|---|---|
| Validate activation event definition and instrumentation | Product / Engineering | [ ] |
| Confirm user-to-account mapping (email domain or workspace) | Engineering | [ ] |
| Build PQL scoring automation (daily evaluation of signals) | Engineering / RevOps | [ ] |
| Set up `#pql-alerts` Slack channel with automated posting | RevOps / Engineering | [ ] |
| Create CRM fields: PQL score, priority tier, disposition, conversion source | RevOps | [ ] |
| Configure round-robin assignment logic | RevOps | [ ] |
| Load outreach templates into email tool (3-touch sequence) | SDRs / RevOps | [ ] |
| Build Dashboard 1 (PQL Pipeline) | RevOps / Analytics | [ ] |
| SDR enablement session: review PQL definition, outreach kit, workflow, dispositions | Sales Manager | [ ] |
| Pull baseline trial-to-paid conversion rate (last 90 days, ICP segment) | Analytics | [ ] |
| Legal/compliance sign-off on outreach approach | Legal | [ ] |
Week 1 (Pilot Launch: Calibrate)
Goal: Launch PQL routing. Calibrate thresholds and workflow. Expect iteration.
- Mon: Go live with PQL scoring and routing for ICP segment.
- Daily: SDRs work PQLs per SLA. Log all dispositions in CRM.
- Wed (mid-week check, 15 min): Quick sync -- are PQLs firing? Are alerts reaching SDRs? Any obvious false positives? Fix any instrumentation bugs.
- Fri (weekly review, 30 min): Full review per feedback loop agenda. Adjust thresholds if PQL volume is too high/low. Review first outreach responses.
Key questions for Week 1:
- Are PQLs firing at the expected rate (2-6/day)?
- Are SDRs meeting the 2-hour SLA?
- Any immediate false-positive patterns to address?
Week 2 (Optimize: Tune Signals and Outreach)
Goal: Refine PQL thresholds based on Week 1 data. Optimize outreach templates based on reply patterns.
- Mon: Implement any threshold changes from Week 1 review.
- Daily: SDRs continue working PQLs. Begin tracking reply rates and meeting rates by email variant.
- Fri (weekly review, 30 min): Review acceptance rate, reply rate, meeting rate. Identify highest-performing outreach variant. Drop or adjust underperforming signals.
Key questions for Week 2:
- What is the PQL acceptance rate? (Target: >= 50%)
- Which email variant gets the best reply rate?
- Any signals that should be added or removed?
Week 3 (Measure: Build Evidence)
Goal: Accumulate enough conversion data to measure impact. Start comparing PQL cohort vs non-PQL cohort.
- Mon: Build Dashboard 2 (Conversion Impact) and Dashboard 3 (Health & Guardrails) if not already live.
- Daily: Business as usual. SDRs continue working PQLs.
- Fri (weekly review, 30 min): Review early conversion data. Check guardrails (spam complaints, NPS, self-serve conversion rate). Identify any scaling concerns.
Key questions for Week 3:
- Is trial-to-paid conversion higher for PQL-contacted users vs baseline?
- Any negative signals (complaints, NPS drop, self-serve conversion decline)?
- Are SDRs at sustainable capacity?
Week 4 (Evaluate: Decide Scale / Iterate / Stop)
Goal: Compile results. Make a go/no-go decision on scaling.
- Mon-Thu: Continue pilot. Final data collection.
- Fri (pilot retrospective, 60 min):
- Present full pilot results (all 3 dashboards).
- Score against success metrics (see below).
- Decide: Scale, Iterate (run another 4-week cycle with changes), or Pause (prerequisites not met).
- Document learnings, threshold changes, and recommendations.
Success Metrics
| Metric | Target | Minimum Viable |
|---|---|---|
| Trial-to-paid conversion (PQL cohort) | >= 15% | >= 12% (meaningful lift over baseline) |
| PQL acceptance rate | >= 50% | >= 40% |
| SLA compliance (first touch within target) | >= 80% | >= 70% |
| Meeting set rate | >= 20% | >= 15% |
| Reply rate (email) | >= 15% | >= 10% |
| Spam/unsubscribe complaints | < 2% | < 5% |
| Self-serve conversion rate (overall) | No decline | No decline > 1 percentage point |
| NPS/CSAT (PQL cohort) | No decline | No decline > 5 points |
Leading Indicators (Track Weekly)
- Time-to-first-touch (are SDRs hitting SLA?)
- PQL volume per day (is scoring calibrated?)
- Reply rate by email variant (which templates work?)
- Meeting set rate (is outreach converting to conversations?)
- Disposition distribution (what % accepted vs rejected?)
Guardrails
| Guardrail (Trigger) | Action if Breached | Owner |
|---|---|---|
| PQL volume > 10/SDR/day for 3+ days | Tighten score threshold or pause routing | RevOps |
| Acceptance rate < 30% for 2 weeks | Re-evaluate PQL definition and signals | Product + RevOps + Sales |
| Spam complaints > 5% | Pause outreach; review messaging | Sales Manager + Legal |
| Self-serve conversion drops > 2pp | Investigate whether sales outreach is cannibalizing self-serve | Product + Sales |
| SLA compliance < 60% or SDR burnout | Reduce PQL volume or add capacity | Sales Manager |
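The guardrail checks above can run as part of the weekly review rather than by eyeballing dashboards. A minimal sketch, assuming metric names exported from your analytics/CRM tooling (the multi-day conditions like "for 3+ days" are simplified here to point-in-time weekly values):

```python
# Assumed metric keys; wire these to whatever your analytics/CRM export produces.
# Each guardrail maps to a predicate that returns True when breached.
GUARDRAILS = {
    "pql_volume_per_sdr_per_day": lambda m: m["pql_per_sdr_day"] > 10,
    "acceptance_rate": lambda m: m["acceptance_rate"] < 0.30,
    "spam_complaint_rate": lambda m: m["spam_complaint_rate"] > 0.05,
    "self_serve_conversion_drop_pp": lambda m: m["self_serve_drop_pp"] > 2.0,
    "sla_compliance": lambda m: m["sla_compliance"] < 0.60,
}


def breached_guardrails(metrics: dict) -> list[str]:
    """Return the names of any guardrails breached this week."""
    return [name for name, breached in GUARDRAILS.items() if breached(metrics)]
```

A breach then triggers the owner and action from the table, rather than any automated change to routing.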
Scale Trigger
Proceed to scale if ALL of the following are true after the 4-week pilot:
- PQL cohort trial-to-paid conversion is >= 12% (meaningful lift over baseline).
- PQL acceptance rate is >= 40%.
- No guardrail breaches in Weeks 3-4.
- SDRs report the workload is sustainable.
- Self-serve conversion rate has not declined.
Scale Plan (Post-Pilot, If Approved)
| Phase | Timeline | Scope | Key Actions |
|---|---|---|---|
| Scale Phase 1 | Weeks 5-8 | Expand to full mid-market ICP (keep same signals/thresholds) | Add 1 SDR if volume requires; automate CRM task creation; build self-serve PQL dashboard for Sales Manager |
| Scale Phase 2 | Weeks 9-12 | Add adjacent segments (e.g., 100-200 employees or 2,000-5,000 employees) | Calibrate new segment thresholds; consider PQA scoring for expansion |
| Scale Phase 3 | Weeks 13+ | Full PLS motion | Integrate with marketing automation; build predictive PQL model; add AE handoff for deals > $X |
Rollback Plan
If the pilot fails to meet minimum viable metrics or guardrails are breached:
- Pause PQL routing (turn off alerts).
- Conduct a post-mortem (what went wrong: signals? outreach? capacity? ICP?).
- Fix the identified issues and re-run a 2-week mini-pilot before attempting full scale.
9) Risks / Open Questions / Next Steps
Risks
| Risk | Likelihood | Impact | Mitigation |
|---|---|---|---|
| Baseline conversion data is unavailable or unreliable | Medium | High -- can't measure lift | Pull conversion data in Week 0; if unavailable, use Week 1-2 non-PQL cohort as concurrent baseline |
| PQL volume is too low (< 1/day) | Medium | Medium -- pilot is inconclusive | Lower thresholds (e.g., require 1 of 4 signals instead of 2 of 4) or extend pilot to 6 weeks |
| PQL volume is too high (> 10/SDR/day) | Low | High -- SDRs are overwhelmed, quality drops | Tighten thresholds immediately; prioritize High-priority PQLs only |
| User-to-account mapping fails | Medium | High -- can't filter by ICP company size | Use email domain enrichment as primary; manual enrichment as fallback; exclude unmappable users from pilot |
| Outreach feels spammy to users | Low | High -- brand damage, complaints | Start with conservative templates; monitor complaints weekly; immediate pause if threshold breached |
| SDRs don't adopt the new workflow | Medium | High -- pilot fails due to execution, not strategy | Enablement session in Week 0; daily Slack check-ins in Week 1; Sales Manager accountability |
| Self-serve conversion cannibalization | Low | High -- net negative impact | Monitor self-serve conversion rate weekly; stop if it declines > 2pp |
Open Questions
- What is the exact activation event definition? Need to confirm with Product. Is it "first query run," "first data source connected," or something else?
- What is the current baseline trial-to-paid conversion rate for the ICP segment? Critical for measuring lift. Needs to be pulled from analytics before or during Week 0.
- What enrichment tool is available for company size data? Clearbit, ZoomInfo, or another provider? If none, how do we determine company size at signup?
- What CRM is in use, and what is the current SDR workflow? Need to confirm fields, automation capabilities, and any existing lead routing.
- Are there any compliance or legal reviews required before outreach? E.g., CAN-SPAM, GDPR considerations for EU trial users.
- What is the current volume of ICP-fit trial signups per week? Needed to estimate PQL volume and pilot sample size.
- Is there executive sponsorship for this pilot? Who is the decision-maker for the scale/no-scale decision at the end of Week 4?
Next Steps
| # | Action | Owner | Deadline |
|---|---|---|---|
| 1 | Pull baseline trial-to-paid conversion data (overall + ICP segment) | Analytics / Product | Week 0, Day 1 |
| 2 | Confirm activation event definition and instrumentation | Product / Engineering | Week 0, Day 2 |
| 3 | Confirm user-to-account mapping method and enrichment tool | Engineering / RevOps | Week 0, Day 2 |
| 4 | Build PQL scoring automation | Engineering / RevOps | Week 0, Day 3-4 |
| 5 | Set up Slack alerts + CRM fields + routing | RevOps | Week 0, Day 3-4 |
| 6 | Load outreach templates into email tool | SDRs / RevOps | Week 0, Day 4 |
| 7 | Conduct SDR enablement session | Sales Manager | Week 0, Day 5 |
| 8 | Legal/compliance review of outreach approach | Legal | Week 0, Day 5 |
| 9 | Launch pilot (go live with PQL routing) | RevOps / Sales Manager | Week 1, Day 1 |
| 10 | First weekly review | Sales Manager + Product + RevOps + SDRs | Week 1, Day 5 |
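Step 4 (PQL scoring automation) is the only build with real logic in it. A minimal sketch, where the signal names, the 2-of-4 rule, and the 3-active-day false-positive control are illustrative assumptions to be replaced with the definitions agreed in the PQL section of this pack:

```python
# Illustrative signal names only; substitute the approved signal list.
REQUIRED_SIGNALS = ["activation", "invited_teammate", "connected_integration", "viewed_billing"]


def score_pql(user: dict) -> dict:
    """Score a trial user against the (assumed) PQL definition.

    `user` carries boolean signal flags plus `active_days`, the count of
    distinct active days within the 10-day scoring window.
    """
    hits = [s for s in REQUIRED_SIGNALS if user.get(s)]
    multi_day = user.get("active_days", 0) >= 3  # false-positive control
    qualified = len(hits) >= 2 and multi_day
    return {"qualified": qualified, "signals_hit": hits, "multi_day": multi_day}
```

The output feeds the Slack alert and CRM task in step 5; if Week 1 volume runs hot or cold, the 2-of-4 and multi-day thresholds are the knobs to turn.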
Quality Checklist (Self-Assessment)
A) Scope + readiness
- Objective is explicit (trial-to-paid conversion) and tied to a target segment (mid-market data teams)
- Readiness gate is run (activation definition exists; usage signals identified; user-to-account mapping plan noted)
- Boundaries are clear (PLS is a layer on self-serve; what it will not do is documented)
- Low-touch PLG path remains intact (guardrails documented in Section 2)
B) Funnel + ownership map
- Funnel stages defined with entry/exit criteria
- Intervention points are explicit and justified
- RACI is clear (Product vs Sales vs CS vs RevOps)
- SLAs are defined and feasible given 2-SDR capacity
C) PQL/PQA definition quality
- Qualified unit (PQL) matches buyer/user reality (user-led trial evaluation)
- Definition includes required signals, thresholds, and exclusions
- False-positive controls included (multi-day requirement, bot filtering, weekly threshold review)
- Examples exist for qualified vs not-qualified
D) Signals + routing/SLA fit
- Signals map to intent and value potential (activation, invites, integrations, billing views, multi-day usage)
- Scoring window (10-day) and thresholds are explicit and testable
- Routing rules match capacity (5 PQL/SDR/day cap, round-robin, escalation)
- Disposition taxonomy exists and feeds back into tuning
E) Workflow + feedback loop
- Reps receive enough context (signal summary, suggested talk track, account history)
- Workflow includes: alert -> assignment -> first touch -> disposition -> next action
- Weekly tuning loop exists with defined agenda and owners
- Escalation path exists for missed SLAs and noisy alerts
F) Outreach kit quality + safety
- Templates are short, helpful, and not creepy
- Personalization uses only approved signals
- Each message has one clear ask and a low-friction next step
- No manipulative tactics, deception, or scraping
G) Measurement + pilot readiness
- Pilot is bounded (ICP segment, 4 weeks, estimated sample size)
- Leading + lagging metrics specified with targets
- Guardrails exist (spam complaints, retention, NPS, capacity)
- Scale/rollback triggers defined; rollout plan is actionable
H) Completeness
- Pack ends with Risks / Open questions / Next steps
- Assumptions & unknowns labeled with validation plan
- No requests for secrets or credentials