Every B2B company tracks churn, but far fewer actually analyze it.
There's a difference. Tracking churn means watching the number tick up or down each month and reacting. Analyzing it means understanding why customers leave, when they decide to leave, and what was happening in their journey weeks before they ever hit cancel.
If you're a CS or implementation leader, that distinction matters more than any retention playbook, because the answer to why your customers churn is almost always hiding in the data. You just need a framework to find it.
This guide walks through how to run a churn rate analysis that produces insights you can actually act on: the formulas, the segmentation methods, the behavioral signals, and the part most teams skip entirely — connecting what the data shows to where onboarding went wrong.
Churn rate analysis is the process of systematically examining when customers leave, how their behavior changed leading up to that decision, and which variables best predict future churn.
It's distinct from simply calculating a churn rate. A churn rate is a single number. Churn rate analysis is the structured investigation behind it — segmenting by cohort, by customer profile, by lifecycle stage, by behavior pattern — to answer questions that a monthly percentage can't answer on its own.
Most teams stop at the number. They calculate monthly churn, compare it to last quarter, feel either relieved or anxious, and move on. That's tracking, not analysis.
Why it fails: A churn rate in isolation doesn't tell you who is leaving, when in their lifecycle they're leaving, or what triggered it. It's like checking your blood pressure without knowing what you ate or how you slept. The metric exists, but the insight doesn't.
Effective churn rate analysis answers:
Answer those questions and you have something actionable. Skip them and you're just watching a number.
Before you can analyze churn, you need to measure it correctly. There are several variants, and using the wrong one distorts everything downstream.
Formula: (Customers lost during period ÷ Customers at start of period) × 100
This is the baseline: it tells you what share of your customer base left in a given window. Most teams calculate this monthly and annually.
Example: You start the month with 500 customers and end with 474. Customer churn rate = (26 ÷ 500) × 100 = 5.2%
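The arithmetic is simple enough to sanity-check in a few lines of Python — a minimal sketch, reusing the figures from the example above:

```python
def customer_churn_rate(customers_start: int, customers_end: int) -> float:
    """Share of the starting customer base lost during the period, as a %."""
    churned = customers_start - customers_end
    return churned / customers_start * 100

# 500 customers at the start of the month, 474 at the end
print(round(customer_churn_rate(500, 474), 1))  # 5.2
```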
Customer churn treats every account equally, which creates a blind spot: losing one $150K enterprise account looks identical to losing a $5K SMB customer. That's where revenue churn comes in.
Formula: (MRR lost from churned customers ÷ MRR at start of period) × 100
Revenue churn weights departures by their financial impact. For most B2B SaaS businesses, this is the metric that matters to leadership, investors, and board decks.
Example: You start with $500K MRR. That month you lose $18K in MRR from churned customers. Revenue churn rate = (18,000 ÷ 500,000) × 100 = 3.6%
Gross churn measures pure losses — no offsets.
Net churn subtracts expansion revenue (upsells, seat expansions) from the same period.
A company can have 5% gross revenue churn but achieve negative net churn if expansion revenue from existing customers exceeds losses. That's the holy grail for CS teams, and it's achievable only if you're retaining and growing the right accounts.
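Gross vs. net is easiest to see side by side. A sketch in Python — the $18K loss on $500K MRR reuses the revenue churn example above, while the $25K expansion figure is hypothetical, chosen to show net churn going negative:

```python
def gross_revenue_churn(mrr_start: float, mrr_lost: float) -> float:
    """Pure MRR losses as a share of starting MRR, no offsets."""
    return mrr_lost / mrr_start * 100

def net_revenue_churn(mrr_start: float, mrr_lost: float,
                      mrr_expansion: float) -> float:
    """Losses minus expansion revenue (upsells, seat growth) from the period."""
    return (mrr_lost - mrr_expansion) / mrr_start * 100

print(round(gross_revenue_churn(500_000, 18_000), 1))          # 3.6
# With $25K of expansion, net churn is negative: the base grew
print(round(net_revenue_churn(500_000, 18_000, 25_000), 1))    # -1.4
```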
If you track monthly churn, convert to annual with this formula:
Annual churn rate = 1 – (1 – Monthly churn rate)^12
Why it matters: a 5% monthly churn rate sounds manageable. Annualized, it means you lose nearly 46% of your customer base every year. The math compounds, and it changes how urgently you treat retention.
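The compounding is worth verifying yourself. A minimal sketch of the annualization formula:

```python
def annualize(monthly_churn_pct: float) -> float:
    """Convert a monthly churn rate (%) to its annual equivalent (%).

    Survival compounds: the fraction retained after 12 months is
    (1 - monthly_rate) ** 12, so annual churn is 1 minus that.
    """
    m = monthly_churn_pct / 100
    return (1 - (1 - m) ** 12) * 100

# A "manageable" 5% monthly rate loses nearly half the base in a year
print(round(annualize(5.0), 1))  # 46.0
```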
Before you pull a single report, align on your churn definition. What constitutes a churned customer in your business?
Common definitions include:
Document your definition, make it consistent across teams, and handle edge cases explicitly: trial users, seasonal accounts, temporary pauses, and partial downgrades.
Without a shared definition, your churn analysis will produce numbers that Finance, CS, and Product will all disagree on.
Overall churn rate is a blended average that hides more than it reveals. Segment your data before calculating so you can compare like with like.
Useful segmentation dimensions:
This last segmentation, onboarding completion, is one of the most revealing analyses you can run. In our experience, customers who don't complete structured onboarding churn at dramatically higher rates than those who do. The gap typically surfaces around month 3–6.
Rather than comparing different customer pools month-to-month, cohort analysis tracks the same group of customers over time. This produces far more reliable churn signals.
Create cohorts based on: the month customers first activated, or the month they completed onboarding. Then plot what percentage of each cohort remains active at 30, 60, 90, 180, and 365 days.
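A cohort retention table can be sketched with nothing but the standard library. The records below are hypothetical, checkpoints are expressed in months rather than days for simplicity, and a customer is counted active at month N if they churned strictly after it — a convention you'd adjust to match your own churn definition:

```python
from collections import defaultdict

# Hypothetical records: (customer_id, activation_month, churn_month or None)
customers = [
    ("a1", "2024-01", None),
    ("a2", "2024-01", "2024-03"),
    ("a3", "2024-01", "2024-07"),
    ("b1", "2024-02", None),
    ("b2", "2024-02", "2024-04"),
]

def months_between(start: str, end: str) -> int:
    sy, sm = map(int, start.split("-"))
    ey, em = map(int, end.split("-"))
    return (ey - sy) * 12 + (em - sm)

def cohort_retention(customers, checkpoints=(1, 3, 6, 12)):
    """% of each activation cohort still active N months after activation."""
    cohorts = defaultdict(list)
    for _, start, churn in customers:
        cohorts[start].append(churn)
    table = {}
    for month, churns in sorted(cohorts.items()):
        table[month] = {
            n: round(
                sum(1 for c in churns
                    if c is None or months_between(month, c) > n)
                / len(churns) * 100, 1)
            for n in checkpoints
        }
    return table

for cohort, row in cohort_retention(customers).items():
    print(cohort, row)
```

Because each row follows the same group of customers, a dip at a particular checkpoint points to a lifecycle stage, not to noise from new signups.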
What you're looking for:
Churn rarely happens suddenly. It shows up in usage data weeks before a customer ever hits cancel. Connect your product analytics to your churn analysis to find leading indicators.
Behavioral signals that consistently precede churn:
The most predictive signals aren't always the obvious ones. Payment behavior changes, feature breadth decline, and CSM contact frequency often outperform satisfaction scores as early churn predictors.
Plot when churn happens relative to the customer lifecycle. Is churn concentrated in the first 90 days? At the 6-month mark? Just before renewal?
Each timing pattern points to a different root cause:
Once you know when churn peaks in your customer journey, you can focus intervention resources on that specific window instead of spreading them evenly across the lifecycle.
Benchmarks give your annual churn analysis context. Here's where the industry stands in 2026, based on data from Recurly, Vitally, ChartMogul, and SaaS Capital:
| Segment | Monthly Churn | Annual Churn |
|---|---|---|
| Enterprise (>$1M ACV) | 0.5–1.5% | 5–8% |
| Mid-Market | 1.5–3% | 15–25% |
| SMB | 3–7% | 30–55% |
| B2B median | ~3.5% | ~35% |
| B2C / consumer | 6.5–8% | 55–65% |
A few important caveats:
"Good" churn is relative to your segment. A 2% monthly churn rate is catastrophic for enterprise SaaS but acceptable for high-velocity SMB. Benchmark against your direct peer group, not the overall SaaS average.
Voluntary vs. involuntary churn require different responses. Per the 2025 Recurly Churn Report, the B2B SaaS median splits into 2.6% voluntary (customer-initiated) and 0.8% involuntary (payment failures, billing issues). Up to 40% of total churn is involuntary — which means a significant portion of your churn is preventable through payment recovery processes alone, with zero product changes required.
Net Revenue Retention is the metric that matters most. Companies with NRR above 110% can grow from their existing customer base even while losing some accounts. If your NRR is below 100%, you're running on a treadmill — every new customer just replaces one you lost.
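NRR is conventionally computed as starting MRR plus expansion, minus churned and contracted MRR, over starting MRR. A sketch with hypothetical figures:

```python
def net_revenue_retention(mrr_start: float, expansion: float,
                          churned: float, contraction: float) -> float:
    """NRR %: what last period's MRR base is worth today, including growth."""
    return (mrr_start + expansion - churned - contraction) / mrr_start * 100

# $500K base, $80K expansion, $18K churned, $7K in downgrades (hypothetical)
print(round(net_revenue_retention(500_000, 80_000, 18_000, 7_000), 1))  # 111.0
```

At 111%, this base grows even with zero new logos; below 100, new sales are just backfilling losses.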
When you run a proper churn analysis, these are the patterns that consistently show up in the data:
1. The onboarding dropout cliff
A sharp drop in engagement between day 14 and day 45. Customers who don't complete their initial onboarding milestones are 2–3x more likely to churn in the first 90 days. This is the most actionable signal: it's early, it's identifiable, and it's fixable before the customer is mentally gone.
2. The feature adoption gap
Churned customers typically use 1–2 features; retained customers use 4–5. A narrowing feature breadth, especially a decline after initial enthusiasm, is a reliable churn precursor. Your analysis should reveal which specific features separate retained customers from churned ones. Those features define your onboarding priorities.
3. The silent account
Logins drop. No support tickets. No CSM engagement. This looks like a healthy, self-sufficient customer, but it's often the opposite. Customers who go quiet aren't satisfied; they've mentally checked out. Churn analysis of these accounts usually shows they never found their first meaningful value milestone.
4. The 30-day renewal window
Churn concentrations around renewal dates, especially in the 30–60 day pre-renewal window, signal that the relationship isn't strong enough to make renewal automatic. These customers needed a reason to stay and didn't find one. Often they were never properly re-engaged after initial onboarding.
5. The support-intensity spike
Three or more support tickets within a 30-day window, especially when issues go unresolved, predicts churn with high reliability. The pattern isn't just the frustration — it's that high support volume indicates the customer never internalized how to use the product independently.
Here's the insight most churn analyses miss entirely: the majority of churn signals trace back to something that happened, or didn't happen, in the first 90 days.
The decision to churn is rarely made at renewal. It's made much earlier, often in the first few weeks when a customer realizes they're not getting the value they expected. By the time CS is scrambling with a "save play," the customer has already mentally moved on. They're just waiting out the contract.
When you run your cohort analysis and map churn timing, look at:
What did churned customers' onboarding look like?
In almost every analysis, you'll find the same pattern: customers who experienced structured, milestone-driven onboarding, with clear accountability, visible progress, and CS engagement at the right moments, churn at significantly lower rates than those left to figure it out on their own.
This isn't correlation. The mechanism is clear:
Your churn data is a diagnostic for your onboarding process. The accounts that churned are telling you exactly where the onboarding experience failed to deliver. The job of churn analysis isn't just to count the losses, it's to understand what the losses reveal about the process that was supposed to prevent them.
Once your analysis surfaces the patterns, translate them into specific onboarding improvements to prevent churn:
If churn peaks in days 0–30: Your kickoff process isn't creating urgency or momentum. Look at whether customers have a concrete, visible plan from day one. Are milestones clear? Are there reminders and nudges when they go dark? Is the first value moment defined and actively driven?
If churn correlates with low feature adoption: Your onboarding isn't connecting features to customer outcomes. Customers don't churn because they don't know how to use a feature — they churn because they don't see why the feature matters to them. Onboarding that maps specific features to specific customer goals changes this.
If churn is concentrated in specific CSM books: You have a capacity or playbook problem. Some CSMs are successfully driving customers to value; others aren't. Churn analysis at the CSM level reveals this — and lets you identify the practices of your top performers and replicate them.
If churn timing clusters around the 3-month mark: Customers launched but didn't actually adopt. This is the "onboarding completed, customer not activated" problem. The fix is extending structured engagement past the go-live milestone, not just marking the project complete.
If churn is high for specific customer segments: Your onboarding isn't segmented to match. A 500-person enterprise customer and a 20-person SaaS startup don't need the same onboarding path, the same milestone sequence, or the same CSM touch model. If you're running one-size-fits-all onboarding, your churn data will reflect it.
OnRamp is built specifically to address this: structured, customer-specific onboarding plans with milestone tracking, automated nudges, and CS visibility into which accounts are falling behind — before they churn. If your churn analysis is surfacing onboarding gaps at scale, see how OnRamp addresses them.
One of the most important segmentations in churn analysis is also one of the most overlooked: separating voluntary from involuntary churn. They look the same in your churn rate. They require completely different responses.
Voluntary churn is customer-initiated. The customer made an active decision to leave — usually because of perceived low value, a better competitive alternative, budget cuts, or an unresolved product or relationship issue. This is the churn that your CS and onboarding process should prevent.
Involuntary churn happens without a customer decision: failed payments, expired credit cards, billing errors. Per 2025 benchmark data, this accounts for up to 40% of total SaaS churn. It's preventable with payment recovery tooling (dunning flows, card updater services, retry logic) and has nothing to do with product satisfaction.
If you're losing 5% monthly, and 40% of that is involuntary, you have a billing ops problem, not a product or CS problem. Fix the billing flow before you build a retention playbook.
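The decomposition above is worth making explicit — a minimal sketch, using the 5%/40% figures from the paragraph:

```python
def churn_decomposition(total_monthly_pct: float, involuntary_share: float):
    """Split total churn into involuntary and voluntary components (%)."""
    involuntary = total_monthly_pct * involuntary_share
    return involuntary, total_monthly_pct - involuntary

# 5% total monthly churn, of which 40% is payment-failure driven
inv, vol = churn_decomposition(5.0, 0.40)
print(inv, vol)  # 2.0 3.0
```

Two points of monthly churn addressable by billing ops alone — annualized via the compounding formula earlier, that is a very different problem than the three points your CS team owns.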
Churn analysis isn't a one-time project. The companies with the lowest churn rates run it as a recurring operational discipline:
Monthly: Calculate customer churn, revenue churn, and gross vs. net churn. Flag any cohort or segment moving outside its normal range. Review involuntary churn recovery rates.
Quarterly: Run a full cohort analysis. Identify behavioral patterns across churned accounts from the quarter. Compare churn timing across the customer journey. Surface the top 3 onboarding or CS process improvements the data suggests.
Annually: Run a year-over-year cohort comparison. Benchmark against industry data. Assess whether changes made in response to previous analyses are improving retention trends.
One additional practice worth building: an exit interview process. Quantitative churn analysis tells you what happened. Exit interviews tell you why, in the customer's own words. Even a 20% response rate on exit surveys will surface context that cohort analysis alone can't provide.
Churn rate is a single metric — the percentage of customers who left in a given period. Churn analysis is the structured investigation behind that number: segmenting by cohort, mapping behavioral signals, identifying timing patterns, and diagnosing root causes.
For enterprise B2B, 0.5–1.5% monthly (roughly 5–8% annually) is the benchmark. For mid-market, 1.5–3% monthly. For SMB, 3–7% monthly is common. The most important benchmark isn't the industry average — it's your own trend over time and your net revenue retention.
Most companies track churn monthly. A full cohort-level churn analysis should run quarterly. Annual analyses should include year-over-year cohort comparisons and industry benchmarking.
Product usage data (logins, feature usage, session data), subscription/billing records, CRM data (CSM activity, health scores), support ticket volume, and onboarding completion data. Combining these sources gives you behavioral context that billing data alone can't provide.
Cohort analysis tracks a defined group of customers (e.g., everyone who signed up in January 2024) over time to see what percentage remains active at each interval. It's more accurate than period-over-period churn comparisons because it tracks the same group, eliminating distortions from customer base growth or shrinkage.
Gross churn measures pure customer or revenue losses in a period. Net churn subtracts expansion revenue (upsells, upgrades) from existing customers. A company can have positive gross churn but negative net churn — meaning it's making more from existing customers even while losing some.
Onboarding is the most significant predictor of first-year churn. Customers who complete structured onboarding and reach early value milestones churn at substantially lower rates than those who don't. Churn analysis typically reveals that a disproportionate share of churned accounts failed to complete onboarding or never reached their first meaningful product outcome.