Impact Report

Mixpanel’s Impact report measures the effects of product or marketing launches on your key metrics. Impact calculates the user adoption of the launch, the impact of the launch on an important event, and the differences between users that adopt the launch and those that do not. 

Impact controls for many external factors to increase the reliability of its results, and it presents statistical confidence to help inform further decisions.

Impact is currently in open BETA and is subject to change.

Access Report

To access Impact, click on Analysis in the top navigation, then select Impact.


Build a Query

To build an Impact query, first select a launch event. This is the event that you are measuring as the cause of change.

Add any additional filters to narrow the launch event parameters by selecting the filter button. 


Select a metric event. You are measuring the impact of the launch event on this metric event. Add additional filters to narrow the metric event parameters.


Select the time range. The time range is a fixed period of time that determines the adopters and non-adopters of the launch event. For example, if the time range is set at 30 days, then a user has 30 days to perform both the launch event and the metric event to be considered an adopter.

The chart will not necessarily change when you change the time range, since the time range determines who counts as an adopter rather than the window the chart displays.

You can also apply a global filter from the same location. 


Report Calculation Details

The report presents the results in plain English, in an Impact table that includes the Impact values, and in a chart that displays the impact of the launch over time. 

Adopters are users that have done the launch event. Non-adopters are users that have not done the launch event. Both adopters and non-adopters must perform the metric event to be included in the report.  

The Adoption Rate is the number of users that perform the launch event divided by the total user count:

\[\%\,adoption = { (adopters) \over (adopters + non\,adopters)}\]
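As a sketch of this calculation (the counts below are hypothetical, not taken from a real project):

```python
# Hypothetical user counts within the selected time range; Mixpanel derives
# these from your project's event data.
adopters = 1200      # users who performed both the launch event and the metric event
non_adopters = 2800  # users who performed only the metric event

adoption_rate = adopters / (adopters + non_adopters)
print(f"{adoption_rate:.0%}")  # → 30%
```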

Impact Chart

The Impact Chart shows how the rate of metric event occurrence changes over time. The y-axis of the chart is the average number of the metric event count and the x-axis is time spanning 30 days. 

Unlike other Mixpanel charts, the Impact Chart displays time in relative time, not calendar time.  The chart centers around the first day that the launch event is available, or “day zero”. The chart displays the 15 days before and after day zero. 


Every user in the report can have a different day zero.  For users in the adopter group, day zero is the day that they perform their first launch event.  For users in the non-adopter group, day zero is the day the first adopter performed the launch event (which is most likely the launch day of the feature).

Each data point on the chart is calculated as follows:

\[metric\,event\,rate = { (metric\,event\,count) \over (total\,user\,count)}\]

Each point on the line is the average number of times users in the group did the metric event on that relative day.

For example, suppose 10 adopters did the launch event for the first time five days ago, and today they did the metric event a total of 30 times. The +5 days point on the adopter line would read 3. The math is as follows:

  • 30 = the number of metric events performed by the adopters on day +5
  • 10 = the number of users who performed the launch event for the first time 5 days earlier
  • 3 = 30 ÷ 10
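The worked example above can be reproduced in a short sketch (all counts and relative days below are hypothetical):

```python
from collections import Counter

# Hypothetical metric-event totals keyed by relative day for the adopter
# group; day zero is each user's first launch event.
metric_events_by_day = Counter({-2: 12, -1: 9, 0: 14, 5: 30})
group_user_count = 10  # adopters aligned on their own day zero

# Each chart point: metric events on that relative day / users in the group.
rates = {day: count / group_user_count
         for day, count in metric_events_by_day.items()}

print(rates[5])  # → 3.0, matching the worked example above
```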

You can see how frequently users in each group perform the metric event, both before and after the launch.

Impact Table

The Impact Table summarizes the results of the Impact chart.  It displays the average rate at which users in the adopter and non-adopter groups performed the metric event over the course of the 15 days before and after launch.


Each of these four averages is also reflected on the chart as a horizontal dotted line.


The Impact Table compares users that performed the launch event and those that did not. For both groups of users, Mixpanel calculates the average number of times per day that users performed the metric event before the launch, after the launch, and the difference between the two (the delta). 

The average number of times that users performed the metric event before the launch is calculated by:

\[group\,average\,value = { (group\,metric\,event\,count \div group\,user\,count) \over (day\,count)}\]

The average rate is calculated both before the first day of the launch event and after it, and the report also displays the difference between those rates.
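As a sketch of this calculation, using hypothetical daily totals for one group (abbreviated to 5 days per window instead of the report's 15):

```python
group_user_count = 10  # users in the group (hypothetical)

# Hypothetical total metric-event counts per day for this group.
pre_launch_daily_counts = [8, 10, 9, 11, 10]
post_launch_daily_counts = [14, 15, 16, 13, 17]

def group_average(daily_counts, user_count):
    """Average per-user metric events per day over the window."""
    return (sum(daily_counts) / user_count) / len(daily_counts)

pre_average = group_average(pre_launch_daily_counts, group_user_count)    # ≈ 0.96
post_average = group_average(post_launch_daily_counts, group_user_count)  # 1.5
delta = post_average - pre_average                                        # ≈ 0.54
```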


Lastly, the Impact is calculated by comparing the results of the adopters and non-adopters. These results are presented as an overall delta with a corresponding confidence score. 

The overall delta is calculated by subtracting the pre-launch delta from the post-launch delta.

The confidence score is a measurement of statistical significance: how likely it is that the overall delta is statistically significant.
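For example, using hypothetical Impact-table averages (per-user metric events per day):

```python
# Hypothetical group averages from the Impact table.
adopter_pre, adopter_post = 0.96, 1.50
non_adopter_pre, non_adopter_post = 0.40, 0.45

pre_launch_delta = adopter_pre - non_adopter_pre     # gap between groups before launch
post_launch_delta = adopter_post - non_adopter_post  # gap between groups after launch

# Overall delta: the post-launch gap minus the pre-launch gap.
overall_delta = post_launch_delta - pre_launch_delta
print(round(overall_delta, 2))  # → 0.49
```

The confidence score itself comes from Mixpanel's significance test and is not reproduced here.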

Interpret the Results

In general, when the overall delta is positive, and the confidence score is 95% or more, it indicates a successful launch.  The most successful launches will see the rate at which adopters perform the metric event increase post-launch, while the same rate for the non-adopters remains relatively constant.  

In the Impact chart, look for the average gap in usage between the adopters and non-adopters to increase after the launch.  Typically, it is preferable for the increase in the gap to be mostly driven by an increase in the rate at which adopters perform the metric event, rather than a decrease in the rate at which non-adopters perform it. 

