Jake McMahon
Led by Jake McMahon · 8+ years B2B SaaS · Behavioural Psychology & Big Data

Cohort analysis for SaaS teams.

Cohort analysis should show how different groups behave after a signup, a launch, or a change, so you can see whether behaviour is improving, stable, or slipping.

This page is for teams trying to answer:

Which cohorts improve
Where retention diverges
Which change mattered

Plain English first. Time-based analysis second.

Cohort Analysis, Broken Down

01 — Grouping
Which users or accounts belong in the same cohort

02 — Time
Which window matters and how long to watch it

03 — Comparisons
Which cohorts are better, worse, or changing after a release

04 — Action
What the team changes next because the signal is clear
Cohort insight unlock — Acquisition month vs behaviour

Splitting cohorts by acquisition month plus first feature adopted surfaces retention patterns that neither dimension reveals alone.
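As a minimal sketch of that two-dimensional split — with entirely hypothetical data, and pandas assumed as the analysis tool — the idea is to pivot on both dimensions at once:

```python
import pandas as pd

# Hypothetical user table: acquisition month, first feature adopted,
# and whether the user was still active in month 3.
users = pd.DataFrame({
    "user_id": [1, 2, 3, 4, 5, 6, 7, 8],
    "acq_month": ["2025-01"] * 4 + ["2025-02"] * 4,
    "first_feature": ["reports", "reports", "exports", "exports"] * 2,
    "active_m3": [1, 1, 0, 0, 1, 0, 0, 0],
})

# One dimension at a time: acquisition month alone blends the feature effect away.
by_month = users.groupby("acq_month")["active_m3"].mean()

# Two dimensions together: month x first feature surfaces the pattern.
by_both = users.pivot_table(
    index="acq_month", columns="first_feature",
    values="active_m3", aggfunc="mean",
)
```

In this invented sample, `by_month` shows only a gradual decline between months, while `by_both` shows that the gap sits almost entirely in one first-feature segment.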

Most misread cohort pattern — Early flatness

A retention curve that flattens quickly looks like failure on a DAU basis but often indicates a healthy recurring-use product — the interpretation depends on the expected cadence.

Cohort analysis frequency — Monthly minimum

Reviewing retention cohorts less often than monthly means missing the inflection points that signal a product or pricing change is working — or isn't.

Why cohort analysis produces noise

"Our cohort chart is beautiful but tells us nothing actionable"

"We have cohort retention going back 18 months. The chart looks great in the board deck. But when a director asks 'what should we do differently to improve this,' there's no answer in the chart. It's descriptive data without a diagnostic layer."

Head of Analytics — B2B SaaS, $30M ARR

"Cohorts by acquisition month don't show us what's different about them"

"We can see that the October 2024 cohort retains at 72% and the February 2025 cohort retains at 59%. But we don't know if that's a product change, a channel mix shift, a pricing change, or a seasonal effect. The cohort doesn't explain itself."

VP Product — SaaS, Series B

"The activation milestone isn't in the cohort, so it's all noise"

"We're cohort-ing by signup date but not by whether they reached activation. Users who activated and users who never did are mixed in the same cohort. The retention curve is averaging across two completely different populations."

Growth PM — PLG SaaS, $18M ARR

"We built cohorts in PostHog but they're measuring the wrong event"

"We're measuring 'logged in' as the retention event. But our product is used for a specific monthly workflow. Of course retention looks terrible on a weekly basis. We need the retention event to match how customers actually use the product, not how the analytics tool defaults to measuring it."

Product Lead — Vertical SaaS, $12M ARR

Cohort analysis is not a pretty chart.

Cohort analysis is the practice of comparing groups over time to see whether change actually sticks. The point is not to make more lines. The point is to make better decisions with less guessing.

A useful cohort analysis setup helps your team answer a small set of questions clearly. Which cohort retained better? Did the new release improve behaviour? Which segment changed after the campaign? Are newer users behaving differently from older ones?

When the setup is working, cohort analysis gives product, growth, and leadership the same view of whether change is holding. When it is not working, the team gets average lines, noisy comparisons, and no clear answer.

Most setups answer activity questions, not cohort questions.

The tools are usually there. The gap is between what the team tracks and what the team actually needs to know.

The team tracks averages, not cohort behavior.

Plenty of setups collapse everyone into one trend line. Far fewer are built around meaningful cohorts that show real change.

Dashboards exist, but nobody changes anything because of them.

That usually means the views are descriptive but not decision-ready. The team can observe movement, but not which cohort or change matters.

The cohort key is unstable.

If the grouping changes every month, the chart stops being a diagnostic and starts being decoration.

The setup explains the past, but not the next decision.

Cohort analysis is most valuable when it shortens the time between "something changed" and "the team knows what to do next."

Three signs the setup is actually useful.

01 — Clear Definitions

The team agrees on the cohort definition.

Signup month, account type, campaign source, and segment logic are defined in plain language. Product, growth, and leadership are not using different meanings for the same group.

02 — Trusted Instrumentation

The underlying grouping logic is stable enough to trust.

Filters, timestamps, and cohort windows stay consistent. New instrumentation makes the analysis sharper instead of noisier.

03 — Decision-Ready Views

The dashboards point to a next action.

The team can look at a cohort view and know whether to investigate onboarding, lifecycle changes, or product shifts next.

Start with the group, not the chart.

Most cohort debt starts because grouping was added metric by metric, not question by question.

ProductQuant approaches cohort analysis from the business questions backward. First define the group the team needs to compare. Then map the time windows and filters that answer those questions. Then build the views and QA process that keep the setup usable as the product changes.

That means cohort rules, dashboards, and tooling all serve the same goal: fewer arguments, clearer priorities, and better decisions.

01 — Define

Start with the comparison question

Which cohort, which window, and which outcome. Name what the team actually needs to understand.

02 — Map

Design the grouping layer

Choose the filters and time windows that answer the question without turning the analysis into clutter.

03 — View

Build the right analysis layer

Cohort views, trend charts, dashboards, or segment views should point to a concrete next action, not a reporting ritual.

04 — Run

Keep it usable over time

Ownership, QA, naming discipline, and decision reviews stop the setup from drifting as the product and market evolve.

A cleaner setup means each new cohort is easier to compare than the last one.

Go deeper from here.

These are the most relevant ProductQuant assets if you want implementation detail, retention context, or a clearer cohort foundation.

Client work

Healthcare SaaS — Retention Analysis
13 dashboards, including 4 retention-specific cohort views

Cohort System Built: Retention Layer Connected to Activation

Built a retention cohort dashboard that separated accounts by activation milestone, surfacing the behavioural gap between 51% and 84% retention segments and giving CS a clear intervention brief.

Read the case study →
B2B SaaS — Cohort Design
90 days — the minimum tracking window to see meaningful cohort patterns

Cohort Analysis Redesign: From Generic to Decision-Ready

Rebuilt a cohort setup that was grouping by signup date without activation segmentation. Separated the population into activated vs non-activated cohorts — revealing that the “retention problem” was almost entirely a non-activation problem.

See the analytics audit →

Pick the step that matches the gap.

This page is educational first. If you want help turning the ideas into a working setup, these are the most relevant ProductQuant paths.

Jake McMahon — cohort analysis consultant

Who does this work

Jake McMahon

Founder, ProductQuant · MSc Big Data & Business Analytics · BSc Behavioural Psychology · 8+ years B2B SaaS

Jake has built cohort analysis systems for B2B SaaS products where the standard setup was producing beautiful charts that nobody could translate into decisions. The work starts from defining the right retention event for the specific product — not defaulting to DAU or session frequency — and builds the cohort dimensions that separate meaningful patterns from noise.

Cohort analysis · Retention cohorts · Activation milestone tracking · Product usage patterns · PostHog cohorts · Segment analysis · B2B SaaS · Analytics design

Common questions

Cohort analysis: what it is and what it should produce

Questions about your specific situation? Book a call →

What is cohort analysis?
Cohort analysis is the practice of grouping users or accounts by a shared characteristic — usually signup date — and tracking their behaviour over time. It shows retention curves, expansion patterns, and whether product improvements are actually working. A cohort that retains better after a product change tells you the change worked in a way that a single-point-in-time metric never can.
How do you read a cohort retention table?
Rows are cohorts (typically by signup month). Columns are periods since signup (week 1, month 1, month 3). Values are the percentage still active. A flattening curve indicates a stable retained base. A curve that keeps declining means there is no engagement floor — the product has not created durable habit for any segment.
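The mechanics of that table can be sketched in a few lines of pandas, using a tiny hypothetical activity log (the data and column names are invented for illustration):

```python
import pandas as pd

# Hypothetical activity log: one row per (user, period-since-signup) in
# which the user was active. Users 1-2 signed up in Jan, users 3-4 in Feb.
activity = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3, 3, 4],
    "cohort":  ["2025-01"] * 5 + ["2025-02"] * 3,
    "period":  [0, 1, 2, 0, 1, 0, 1, 0],
})

# Rows: cohorts. Columns: periods since signup. Values: distinct active users.
counts = activity.pivot_table(
    index="cohort", columns="period", values="user_id", aggfunc="nunique",
).fillna(0)

# Divide each row by its period-0 size to get "% still active".
retention = counts.div(counts[0], axis=0)
```

Reading across a row of `retention` gives the curve for one cohort; a row that stops declining is the flattening described above.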
What cohorts matter most for B2B SaaS?
Four cohort types produce the most actionable signals: (1) signup cohort by month for retention; (2) acquisition channel cohort for LTV comparison; (3) pricing tier cohort for expansion analysis; (4) activation milestone cohort for churn prediction. The activation milestone cohort is often the most diagnostic — it separates accounts that reached value from those that never did.
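A minimal sketch, with made-up numbers, of why the activation milestone cohort is so diagnostic — the blended average hides two very different populations:

```python
import pandas as pd

# Hypothetical cohort: half the users reached the activation milestone.
users = pd.DataFrame({
    "activated": [True] * 4 + [False] * 4,
    "active_m3": [1, 1, 1, 0, 0, 0, 0, 1],
})

blended = users["active_m3"].mean()  # one averaged retention point
by_activation = users.groupby("activated")["active_m3"].mean()
# The blended figure describes neither sub-population: activated users
# retain far better, non-activated users far worse.
```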
How do you use cohort analysis to improve product decisions?
Compare cohorts from before and after a product change. If a post-change cohort retains better at month 3, the change worked. This requires patience — 90 days of data minimum — but it is the most reliable signal available for validating whether a product or onboarding change actually improved retention.
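The before/after comparison can be sketched like this — the release date and the per-cohort month-3 retention figures are assumptions for illustration:

```python
import pandas as pd

# Hypothetical month-3 retention per signup cohort, straddling a release.
change_date = pd.Timestamp("2025-03-01")  # assumed release date

cohorts = pd.DataFrame({
    "signup_month": pd.to_datetime(["2025-01", "2025-02", "2025-04", "2025-05"]),
    "retained_m3": [0.55, 0.58, 0.66, 0.69],
})

# Label cohorts as pre- or post-change and compare average month-3 retention.
cohorts["post_change"] = cohorts["signup_month"] >= change_date
comparison = cohorts.groupby("post_change")["retained_m3"].mean()
```

In practice the comparison only becomes trustworthy once the post-change cohorts have had the full 90-day window to mature.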
What is the difference between user-level and account-level cohort analysis?
User-level analysis tracks individual people. Account-level analysis tracks company or team accounts. For B2B SaaS, account-level is usually more meaningful because renewal and churn are account decisions, not individual user decisions. A single power user can inflate user-level retention even when the account is at risk.
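A small illustration, with invented data, of how one power user inflates user-level retention while an account-level rollup stays honest (the 50% activity threshold here is an assumption, not a standard):

```python
import pandas as pd

# Hypothetical month-3 activity: account B's lone power user (u3) is
# still active while the rest of the account has gone quiet.
events = pd.DataFrame({
    "account": ["A", "A", "B", "B", "B"],
    "user": ["u1", "u2", "u3", "u4", "u5"],
    "active_m3": [1, 1, 1, 0, 0],
})

user_retention = events["active_m3"].mean()

# Account-level: call an account retained if at least half its users
# are still active (illustrative threshold only).
retained_accounts = events.groupby("account")["active_m3"].mean() >= 0.5
account_retention = retained_accounts.mean()
```

User-level retention looks healthier than the account-level view, even though account B — a renewal decision — is at risk.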
How long should you track a cohort?
Minimum 90 days for activation and early retention patterns. 12 months for a full retention and expansion picture. Subscription products need at least one full renewal cycle to see renewal behaviour — anything shorter is measuring early engagement, not commitment.

Cohort analysis should show whether change stuck.

If your team has charts but still cannot tell whether the change worked, start with the audit or the program.