Event-Level Analysis: Task Completion Activation
Full activation metric analysis for 250 users with signup, task_completion, and invite events. Identifies 'Complete 3+ tasks in 7 days' as primary metric with 0.91 correlation.
Input
What you provide to the skill
I have event-level CSV data with signup, task_completion, and invite events for 250 users over 7 days with retention data. Please analyze it.
Output
What the skill generates for you
Activation Metric Discovery Results
Total Users: 250
Overall Retention (30d): 50.0%
Analysis Window: First 7 days
Event Correlation with Retention
| Event | Users Who Did | Retention Rate | Correlation | Lift |
|---|---|---|---|---|
| task_completion (3x) | 113/250 | 100.0% | 0.91 | +91.2% |
| invite | 90/250 | 100.0% | 0.89 | +78.1% |
| task_completion (2x) | 127/250 | 98.4% | 0.87 | +98.4% |
| task_completion (5x) | 58/250 | 100.0% | 0.76 | +65.1% |
| task_completion | 176/250 | 71.0% | 0.71 | +71.0% |
| signup | 250/250 | 50.0% | 0.00 | baseline |
Recommended Activation Metric
Primary: Complete 3+ tasks within first 7 days
Why this metric:
- Highest predictive power (91% correlation) - Users who complete 3+ tasks have 100% retention vs under 10% for those who don’t
- Clear, actionable threshold - 3 tasks is a meaningful milestone that indicates value realization
- Within user control - Users can complete tasks immediately without external dependencies
- Leading indicator - Happens early enough (within first 7 days) to guide onboarding intervention
- Optimal balance - Affects 45% of user base (113/250), making it neither too easy nor too exclusive
Lift Analysis: Users completing 3+ tasks show a +91.2 percentage point increase in retention compared to those who don’t reach this threshold.
Secondary Activation Signal
Invite a team member (correlation: 0.89)
- Strength: Perfect retention predictor (100% of users who invite retain)
- Consideration: Only 36% of users (90/250) send invites, suggesting this may be harder to influence
- Network effects: Inviting indicates commitment and creates social lock-in
- Recommendation: Use as secondary signal or upgrade trigger rather than initial activation gate
- Caution: May require users to have teammates available - not universally applicable
Activation Rate Benchmark
Current State:
- Users completing 3+ tasks in first 7 days: 45.2% (113/250)
- Users sending invites in first 7 days: 36.0% (90/250)
- Users with ANY task completion: 70.4% (176/250)
Target Goals:
- Increase 3+ task completion rate to 60%+ through onboarding optimization
- Guide users to complete first 3 tasks within first session
- Reduce time-to-third-task as key metric
High-Risk Segment:
- 74 users (29.6%) completed ZERO tasks → 0% retention
- This is your highest-priority intervention segment
Framework Assessment
| Criteria | task_completion (3x) | invite |
|---|---|---|
| Correlation strength | Very Strong (0.91) | Very Strong (0.89) |
| Timing (first 7 days) | Yes - average day 2-3 | Yes - average day 3-4 |
| Within user control | High - immediate action | Medium - requires teammates |
| Leading indicator | Yes - early signal | Yes - commitment signal |
| Easy to measure | Yes - event tracking | Yes - event tracking |
| Universally applicable | Yes - all users | No - only team scenarios |
| Actionable threshold | Clear (3 tasks) | Binary (0 or 1+) |
| RECOMMENDATION | PRIMARY METRIC | SECONDARY SIGNAL |
Composite Activation Score
For more nuanced user segmentation, combine both signals:
Activation Score Formula:
Score = (completed_3_tasks ? 0.7 : 0) + (sent_invite ? 0.3 : 0)
| Score | Status | User Count | Retention | Recommended Action |
|---|---|---|---|---|
| 1.0 | Fully Activated | 90 | 100% | Champion nurturing |
| 0.7 | Task Activated | 23 | 100% | Nudge toward invites |
| 0.3 | Invite Only | 0 | N/A | Guide to tasks |
| 0.0 | Not Activated | 137 | 9.5% | Urgent intervention |
Key Insight: Zero users achieved “invite-only” activation (0.3 score), confirming that task completion precedes social engagement.
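The scoring formula above maps directly to a small helper. A sketch, assuming per-user task counts and an invite flag are available (the function name is ours; the 3-task threshold and 0.7/0.3 weights come from the analysis):

```python
# Composite activation score from the formula above.
def activation_score(tasks_completed: int, sent_invite: bool) -> float:
    score = 0.0
    if tasks_completed >= 3:  # primary signal: 3+ tasks in first 7 days
        score += 0.7
    if sent_invite:           # secondary signal: invited a teammate
        score += 0.3
    return round(score, 1)    # avoid 0.7 + 0.3 floating-point drift
```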
Caution: Correlation vs. Causation
These correlations identify association, not causation. To validate whether completing 3+ tasks actually CAUSES retention (vs. being a proxy for user intent):
Validation Roadmap
Phase 1: A/B Test Design (2 weeks)
- Group A (Control): Standard onboarding experience
- Group B (Treatment): Optimized flow emphasizing quick completion of 3 tasks
- Checklist UI showing progress (0/3, 1/3, 2/3, 3/3)
- Contextual prompts after task 1 and 2
- Celebration/reward at task 3
Phase 2: Measure (4 weeks)
- Track time-to-third-task for both groups
- Measure 30-day retention difference
- Statistical significance testing
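For the significance test in Phase 2, a standard two-proportion z-test on 30-day retention is one option. A self-contained sketch using only the standard library (group sizes and retained counts are illustrative inputs, not results):

```python
# Two-sided z-test for a difference in retention proportions between
# the control (A) and treatment (B) onboarding groups.
from math import sqrt, erf

def two_proportion_z_test(retained_a, n_a, retained_b, n_b):
    p_a, p_b = retained_a / n_a, retained_b / n_b
    pooled = (retained_a + retained_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf: Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

With realistic group sizes (125 per arm at a 50/50 split of new signups), detecting a lift of a few percentage points may take several weekly cohorts.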
Phase 3: Iterate (ongoing)
- If Group B shows significant retention lift: Deploy to 100%, optimize further
- If no difference: Re-evaluate metric or explore WHY users complete 3+ tasks (underlying motivation)
Alternative Hypotheses to Test
- Task quality matters more than quantity: Do 3 high-value tasks beat 5 low-value ones?
- Speed matters: Does completing 3 tasks in day 1 predict better than over 7 days?
- Task diversity: Do 3 different task types matter more than 3 of same type?
User-Level Activation Breakdown
Activated Users (113 users, 100% retention):
- Completed 3-10 tasks in first week
- 80% also sent invites (strong signal combination)
- Exhibit product engagement patterns worth studying for onboarding
At-Risk Users (63 users, ~19% retention):
- Completed 1-2 tasks but stopped
- May have encountered friction or unclear value
- Priority for win-back campaigns
Not Activated (74 users, 0% retention):
- Signed up but completed no tasks
- Likely abandoned immediately after signup
- Requires onboarding flow analysis (friction audit)
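The three segments above follow from a single count per user. A minimal sketch (segment names are ours; thresholds come from this analysis):

```python
# Maps first-week task count to the user segments described above.
def segment(tasks_completed: int) -> str:
    if tasks_completed >= 3:
        return "activated"      # 100% observed retention in this dataset
    if tasks_completed >= 1:
        return "at_risk"        # started but stalled at 1-2 tasks
    return "not_activated"      # never completed a task
```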
Implementation Recommendations
Immediate Actions (This Sprint)
- Add activation tracking dashboard
  - Monitor daily: % of new signups reaching 3+ tasks
  - Segment by signup source for channel insights
- Implement progress indicators
  - Show users “2/3 tasks to unlock full value”
  - Gamify with progress bar and celebration
- Trigger interventions
  - Email/notification after 24hrs if <2 tasks completed
  - In-app tooltip after task 2: “One more task to go!”
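The 24-hour intervention rule could be checked in a daily job along these lines; the field names and the surrounding data model are assumptions for illustration:

```python
# Flags users who are 24h+ past signup with fewer than 2 completed tasks,
# per the intervention rule above. Input shapes are hypothetical.
from datetime import datetime, timedelta

def needs_nudge(signed_up_at: datetime, tasks_completed: int,
                now: datetime) -> bool:
    stalled = tasks_completed < 2
    past_24h = now - signed_up_at >= timedelta(hours=24)
    return stalled and past_24h
```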
Product Optimizations (Next Quarter)
- Onboarding redesign
  - Make first 3 tasks extremely obvious and easy
  - Reduce time-to-first-task (measure current baseline)
  - Pre-populate example tasks in new accounts
- Task quality audit
  - Which specific tasks correlate most with retention?
  - Are some tasks confusing or broken?
  - Consider “starter tasks” specifically designed for activation
- Invite flow optimization
  - Test: Does prompting for invites AFTER 3 tasks work better?
  - Build invite as “next step” after activation
Metrics to Track Weekly
- New signups
- % reaching 1 task (measure drop-off)
- % reaching 3 tasks (PRIMARY METRIC)
- Time to 3rd task (speed metric)
- 7-day activation rate trend
- 30-day retention (cohorted by activation status)
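The weekly funnel metrics can be rolled up from raw events. A sketch assuming `(user_id, event, day)` tuples for one signup cohort (the row shape is hypothetical):

```python
# Weekly funnel rollup for the metrics above.
def weekly_funnel(events):
    signups = {u for u, e, _ in events if e == "signup"}
    task_counts = {}
    for u, e, _ in events:
        if e == "task_completion":
            task_counts[u] = task_counts.get(u, 0) + 1
    one_task = sum(1 for u in signups if task_counts.get(u, 0) >= 1)
    three_tasks = sum(1 for u in signups if task_counts.get(u, 0) >= 3)
    return {
        "signups": len(signups),
        "pct_1_task": 100 * one_task / len(signups),    # drop-off check
        "pct_3_tasks": 100 * three_tasks / len(signups),  # PRIMARY METRIC
    }
```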
Statistical Notes
Sample Size: 250 users (adequate for initial analysis, recommend 500+ for refined thresholds)
Confidence:
- Task completion (3x) correlation is highly significant (p < 0.001)
- Invite correlation is highly significant (p < 0.001)
- Results are robust and actionable
Limitations:
- 7-day window may miss late activators (consider 14-day analysis)
- Retention measured at 30 days only (track 60d, 90d for lifetime value)
- No segmentation by user type, signup source, or geography (consider cohort analysis)
- Does not account for product changes during study period
Summary
Your product’s “aha moment” occurs when users complete 3 tasks in their first 7 days. This behavior predicts 100% retention vs 9.5% for users who don’t reach this threshold.
Priority 1: Redesign onboarding to guide every new user to complete 3 tasks quickly
Priority 2: Build intervention systems for users stalling at 1-2 tasks
Priority 3: Validate causation via A/B testing before major product bets
The path is clear: Get users to 3 tasks, and they’ll stay.
About This Skill
Analyze user event data to identify which behaviors predict retention. Combines statistical correlation analysis with a structured framework for evaluating activation metric candidates.