Retention is the most important metric for product-led growth. It doesn't matter how many users you acquire if they don't stick around. Cohort analysis in PostHog helps you understand when users drop off, why they stay, and what behaviors predict long-term engagement.

What Is Cohort Analysis?

A cohort is a group of users who share a common characteristic, usually when they signed up. Cohort analysis tracks how these groups behave over time.

Instead of asking "How many DAUs do we have?" (which mixes new and old users), cohort analysis asks "Of the users who signed up in January, how many are still active in March?"

This separation is crucial because:

  • New users behave differently than established users
  • Product changes affect different cohorts differently
  • Retention trends are hidden in aggregate metrics
  • Growth can mask declining retention if you're acquiring users faster than you're losing them

Types of Cohorts in PostHog

Time-Based Cohorts

Group users by when they first appeared (signup date, first event). This is the most common approach for retention analysis.

  • Daily cohorts: Users who signed up on specific dates (best for high-frequency products)
  • Weekly cohorts: Users who signed up in Week 1, Week 2, etc.
  • Monthly cohorts: January signups, February signups, etc.
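PostHog buckets these for you, but the underlying idea is simple enough to sketch. A minimal illustration in Python, assuming you have per-user signup dates (e.g. from a warehouse export); the data here is made up:

```python
from collections import defaultdict
from datetime import date, timedelta

def week_start(d):
    """Monday of the week containing d."""
    return d - timedelta(days=d.weekday())

def weekly_cohorts(signups):
    """Group user IDs by the Monday of their signup week.

    signups: {user_id: signup_date}
    """
    cohorts = defaultdict(list)
    for user_id, signed_up in signups.items():
        cohorts[week_start(signed_up)].append(user_id)
    return dict(cohorts)

signups = {
    "alice": date(2024, 1, 2),  # Tuesday, week of Jan 1
    "bob":   date(2024, 1, 5),  # Friday, same week
    "carol": date(2024, 1, 9),  # the following week
}
print(weekly_cohorts(signups))
# {datetime.date(2024, 1, 1): ['alice', 'bob'], datetime.date(2024, 1, 8): ['carol']}
```

Daily or monthly bucketing works the same way; only the key function changes.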

Behavioral Cohorts

Group users by actions they've taken:

  • Users who completed onboarding
  • Users who invited a teammate
  • Users who used feature X
  • Users who upgraded to paid
  • Users who connected an integration

Property-Based Cohorts

Group users by attributes:

  • Enterprise vs. SMB customers
  • Users from organic vs. paid acquisition
  • Mobile vs. desktop users
  • Geographic regions
  • Industry or company size

Setting Up Retention Analysis

Step 1: Define Your Retention Event

What does "retained" mean for your product? Common definitions:

  • Any activity: User performed any event
  • Core action: User performed the key value action (sent a message, created a report, etc.)
  • Session: User started a new session

Choose the event that indicates real engagement, not just logging in. The retention event should reflect actual value delivery—for example, "created a report" is typically better than "viewed dashboard" because it shows the user accomplished something meaningful.

Step 2: Choose Your Timeframe

This depends on your product's natural usage frequency:

  • Daily retention: Social apps, games, communication tools
  • Weekly retention: Productivity tools, most B2B SaaS
  • Monthly retention: Financial apps, low-frequency tools, HR software

Consider your product's "job to be done"—if users need your product weekly to accomplish their goals, measure weekly retention.

Step 3: Build in PostHog

  1. Go to Retention in the sidebar
  2. Set your cohort criteria (first seen, first event, etc.)
  3. Set your retention event
  4. Choose daily, weekly, or monthly buckets
  5. Apply any filters (plan type, acquisition source, etc.)

Reading the Retention Table

PostHog shows retention as a triangle table:

  • Rows: Each cohort (e.g., "Week of Jan 1")
  • Columns: Time periods after first event (Week 0, Week 1, Week 2...)
  • Cells: Percentage of cohort still active
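If you ever need to reproduce this table outside PostHog (say, on a warehouse export), the computation is straightforward. A sketch with assumed input shapes, not PostHog's internals; week 0 reads 100% only because the signup day itself counts as activity:

```python
from datetime import timedelta

def retention_table(signups, activity, num_weeks=3):
    """Triangle table: {cohort_week_start: [pct_week0, pct_week1, ...]}.

    signups: {user_id: signup_date}
    activity: {user_id: set of dates the user was active}
    The signup day should appear in activity, so week 0 is 100% by definition.
    """
    cohorts = {}
    for user, signed_up in signups.items():
        monday = signed_up - timedelta(days=signed_up.weekday())
        cohorts.setdefault(monday, []).append(user)

    table = {}
    for cohort_start, users in sorted(cohorts.items()):
        row = []
        for week in range(num_weeks):
            lo = cohort_start + timedelta(weeks=week)
            hi = lo + timedelta(weeks=1)
            active = sum(
                any(lo <= d < hi for d in activity.get(u, ()))
                for u in users
            )
            row.append(round(100 * active / len(users)))
        table[cohort_start] = row
    return table
```

Each row shrinks as cohorts age (a January cohort has more observed weeks than a March cohort), which is what gives the table its triangle shape.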

A typical SaaS retention curve (these vary significantly by product type):

  • Week 0: 100% (by definition)
  • Week 1: 40-60% (early drop-off)
  • Week 4: 20-30% (stabilizing)
  • Week 12: 15-25% (long-term retention)

Note: These benchmarks vary widely. According to industry research, a Month-1 retention rate above 35% is considered acceptable for SaaS, with top-quartile B2B companies achieving significantly higher rates as they scale.

Key Retention Metrics

D1, D7, D30 Retention

Classic metrics for consumer products:

  • D1: % returning on day 1 (immediate value delivery)
  • D7: % returning by day 7 (habit formation)
  • D30: % returning by day 30 (sustained engagement)
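"On day N" and "by day N" are subtly different measurements, and the list above uses both conventions. A minimal sketch of the distinction (input shapes are assumed, not PostHog's API):

```python
from datetime import timedelta

def dn_retention(signups, activity, n, mode="on"):
    """Share of users retained at day N.

    mode="on": active exactly on day N after signup (stricter).
    mode="by": active on any of days 1..N (more forgiving).
    signups: {user_id: signup_date}; activity: {user_id: set of active dates}
    """
    hits = 0
    for user, start in signups.items():
        days = activity.get(user, set())
        if mode == "on":
            hits += (start + timedelta(days=n)) in days
        else:
            hits += any(start + timedelta(days=d) in days for d in range(1, n + 1))
    return hits / len(signups)
```

Pick one convention and report it consistently; a "by day 7" number will always be at least as high as an "on day 7" number for the same data.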

Week-over-Week Retention

Better for B2B products:

  • W1: % active in week 1
  • W4: % active in week 4 (one month)
  • W12: % active in week 12 (three months)

Rolling Retention vs. Bracket Retention

Rolling retention: the user was active during period N or any later period

Bracket retention: the user was active specifically during period N

Rolling retention always produces numbers at least as high as bracket retention, and suits products with irregular usage. Bracket retention is stricter and better for products that expect regular engagement.
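The difference is easiest to see in code. A small sketch with assumed input shapes:

```python
from datetime import date, timedelta

def bracket_retained(signup, active_dates, week):
    """Active at some point during week N specifically."""
    lo = signup + timedelta(weeks=week)
    return any(lo <= d < lo + timedelta(weeks=1) for d in active_dates)

def rolling_retained(signup, active_dates, week):
    """Active during week N or any later week."""
    lo = signup + timedelta(weeks=week)
    return any(d >= lo for d in active_dates)

# A user who disappears for two weeks, then comes back:
signup = date(2024, 1, 1)
active = {date(2024, 1, 1), date(2024, 2, 1)}
print(bracket_retained(signup, active, 2))  # False: no activity in week 2
print(rolling_retained(signup, active, 2))  # True: active again later
```

The returning user counts as retained at week 2 under the rolling definition but not under the bracket definition, which is exactly why rolling numbers run higher.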

Revenue Retention Metrics

For subscription businesses, also track:

  • Gross Revenue Retention (GRR): Revenue retained excluding expansion (target: 85-95%)
  • Net Revenue Retention (NRR): Revenue retained including upsells (target: 100%+)
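The arithmetic behind both metrics, as a sketch (the dollar figures are made up for illustration):

```python
def revenue_retention(starting_mrr, churned, contraction, expansion):
    """GRR excludes expansion revenue; NRR includes it.

    All inputs are MRR amounts for the same customer base over one period.
    """
    grr = (starting_mrr - churned - contraction) / starting_mrr
    nrr = (starting_mrr - churned - contraction + expansion) / starting_mrr
    return grr, nrr

# $100k starting MRR: $5k fully churned, $3k in downgrades, $15k in upsells
grr, nrr = revenue_retention(100_000, 5_000, 3_000, 15_000)
print(f"GRR {grr:.0%}, NRR {nrr:.0%}")  # GRR 92%, NRR 107%
```

Note that GRR can never exceed 100%, while an NRR above 100% means expansion is outpacing churn: the existing customer base grows revenue even with zero new customers.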

Finding Retention Drivers: The "Aha Moment"

The magic of cohort analysis is comparing cohorts to find what drives retention. Growth teams often identify "magic numbers" or activation metrics that correlate with long-term engagement.

Famous Activation Metrics

These examples illustrate how successful companies identified key early behaviors:

  • Slack: Teams that sent 2,000 messages showed 93% long-term retention
  • Facebook: Users who added 7 friends in 10 days were far more likely to become engaged long-term (this became the growth team's "single sole focus")
  • Twitter: Users who followed at least 30 accounts became "active forever"
  • Dropbox: Users who placed at least one file in a Dropbox folder

Important caveat: These numbers aren't magical thresholds—they're memorable guidelines that help align entire teams around a clear objective. The specific numbers matter less than understanding why these behaviors predict retention. For Facebook, having 7 friends means your News Feed has fresh content each time you log in. For Slack, 2,000 messages means the team has adopted the product into their workflow.

Compare Behavioral Cohorts

Create two cohorts:

  1. Users who completed onboarding
  2. Users who did NOT complete onboarding

Compare their retention curves. If onboarding completers have 2x better retention, you know where to focus.
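A sketch of that comparison (input shapes are assumed, and `onboarded` is a hypothetical set of users who completed onboarding):

```python
from datetime import timedelta

def weekly_curve(users, signups, activity, num_weeks=4):
    """% of `users` active in each week after their own signup date.

    signups: {user_id: signup_date}; activity: {user_id: set of active dates}
    """
    curve = []
    for week in range(num_weeks):
        active = sum(
            any(
                signups[u] + timedelta(weeks=week)
                <= d
                < signups[u] + timedelta(weeks=week + 1)
                for d in activity.get(u, ())
            )
            for u in users
        )
        curve.append(round(100 * active / len(users)))
    return curve

# Hypothetical behavioral split:
# completers = [u for u in signups if u in onboarded]
# skippers   = [u for u in signups if u not in onboarded]
# Compare weekly_curve(completers, ...) against weekly_curve(skippers, ...)
```

In PostHog itself you would do this by saving the two behavioral cohorts and applying each as a filter on the same retention insight.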

The Activation Analysis Process

Find your "aha moment" by testing which early behaviors predict long-term retention:

  1. Define "retained" (e.g., active in week 8)
  2. Look at what retained users did in week 1
  3. Compare to what churned users did
  4. Identify the differentiating behaviors
  5. Validate causation: Run experiments to confirm the behavior actually causes retention, not just correlates with it
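Steps 2-4 amount to comparing event frequencies between the two groups. A sketch with assumed input shapes (the event names in the test data are illustrative):

```python
def behavior_gaps(retained, churned, week1_events):
    """For each event, the share of retained vs. churned users who performed
    it in week 1, sorted by the size of the gap (biggest differentiator first).

    week1_events: {user_id: set of event names performed in week 1}
    """
    def share(users, event):
        return sum(event in week1_events.get(u, set()) for u in users) / len(users)

    events = set().union(*week1_events.values())
    gaps = {e: (share(retained, e), share(churned, e)) for e in events}
    return sorted(gaps.items(), key=lambda kv: kv[1][0] - kv[1][1], reverse=True)
```

The top of the sorted list is your candidate activation behavior; the experiment in step 5 is what turns a candidate into a real lever.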

Correlation vs. Causation

A critical mistake is assuming correlation equals causation. Just because users who send 100 messages retain better doesn't mean forcing users to send 100 messages will improve retention. The behavior might just be a signal of already-engaged users.

To validate causation:

  • A/B test interventions that push users toward the behavior
  • Measure if the intervention actually improves retention
  • Look for a logical causal explanation (does the behavior directly create value?)
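A two-proportion z-test is a common way to read such an experiment. A stdlib-only sketch under a normal approximation (the sample numbers are illustrative):

```python
from math import erf, sqrt

def two_proportion_z(retained_a, n_a, retained_b, n_b):
    """Two-sided z-test for a difference in retention rates between groups."""
    p_a, p_b = retained_a / n_a, retained_b / n_b
    pooled = (retained_a + retained_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; p-value is the two-tailed probability.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Control: 400/1000 retained. Variant with an onboarding nudge: 460/1000.
z, p = two_proportion_z(400, 1000, 460, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at the usual 0.05 level
```

PostHog's experimentation feature reports significance for you; the sketch just shows the kind of test running underneath, and why small cohorts rarely clear the bar.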

Segment by Acquisition Source

Compare retention by how users found you:

  • Organic search vs. paid ads
  • Product Hunt launch vs. steady-state
  • Referral vs. direct
  • Content marketing vs. sales outreach

This reveals which channels bring high-quality users, not just volume.

Acting on Retention Data

Improving Early Retention (D1-D7)

Focus on first-run experience:

  • Reduce friction in onboarding
  • Deliver value immediately (time-to-value should be as short as possible)
  • Guide users to activation moment
  • Send timely re-engagement emails
  • Use progressive onboarding instead of overwhelming users

Improving Mid-term Retention (W2-W4)

Focus on habit formation:

  • Build features that create routines
  • Add social/team elements
  • Implement notifications for key triggers
  • Introduce secondary features after primary adoption
  • Create "habit loops" with triggers, actions, and rewards

Improving Long-term Retention (W8+)

Focus on expanding value:

  • Ensure product grows with user needs
  • Build integrations with their workflow
  • Create switching costs (data history, team adoption, customization)
  • Regular feature releases and communication
  • Proactive customer success outreach

Common Retention Analysis Mistakes

  1. Measuring too early: Looking at D1 retention when your product has a weekly use case
  2. Ignoring cohort size: Small cohorts produce noisy, unreliable data
  3. Conflating user and revenue retention: You can retain users while losing revenue (or vice versa)
  4. Not segmenting: Aggregate retention hides important segment-level trends
  5. Optimizing for correlation: Pushing users toward behaviors without validating causation
  6. Single "aha moment" fallacy: Different user segments may have different paths to value

PostHog Retention Best Practices

  1. Save cohorts for reuse: Create saved cohorts for key segments you analyze repeatedly.
  2. Dashboard your retention: Add retention charts to a team dashboard for ongoing monitoring.
  3. Set alerts: Get notified if retention for a cohort drops below a threshold.
  4. Combine with funnels: Use funnel analysis to understand WHERE users drop off.
  5. Use session recordings: Watch recordings of churned users to understand WHY they left.
  6. Export for deeper analysis: For complex cohort analysis, export to your warehouse.
  7. Track multiple retention definitions: Monitor both "any activity" and "core action" retention.

Building a Retention-Focused Culture

Retention analysis is most powerful when it informs company-wide decisions:

  • Product: Prioritize features that improve activation and retention
  • Marketing: Double down on channels that bring high-retention users
  • Sales: Focus on customer profiles that retain best
  • Customer Success: Intervene early with at-risk cohorts

Retention is the foundation of sustainable growth. Every percentage point of retention improvement compounds over time. Use cohort analysis to find the levers, validate causation through experiments, and then pull them relentlessly.