
Feature Adoption


At a glance: Empowering the Product team to track and assess the performance of in-game features and placements, enabling data-driven optimization.

Using Feature Adoption

  • Evaluate the performance of a newly launched feature without the need for an A/B test

  • Compare performance across specific metrics or placements within an A/B test

  • Monitor the adoption of all placements to gather actionable insights for live operations

  • Deep dive at the user level to better understand user behaviours

  1. Go to Big Data > Feature Adoption

  2. Select the date range for First Open, and optional filters for Day X period, A/B Test, Country, and/or App Version

  3. Next, configure your Metrics

Metrics configuration (field: definition):

  • Name: any name, for internal reporting reference

  • Event Name: events available in Firebase

  • Condition, Param Key: label or identifier assigned to a configurable value

  • Condition, Param Value: specific data assigned to the parameter

  • Operator: to calculate the metric based on COUNT or SUM

  • Aggregate: to calculate the metric based on SUM or AVERAGE
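
For intuition, here is a minimal Python sketch of how these fields combine: the Event Name and the Condition (Param Key / Param Value) filter the raw events, the Operator turns the matching events into one number per user, and the Aggregate rolls those per-user numbers up. The event list and the "swipe" example below are hypothetical; the actual report runs directly on your Firebase event data.

```python
# Hypothetical event rows; real data comes from your Firebase events.
events = [
    {"user_id": "u1", "event_name": "swipe", "params": {"screen": "Today", "value": 3}},
    {"user_id": "u1", "event_name": "swipe", "params": {"screen": "Stats", "value": 5}},
    {"user_id": "u2", "event_name": "swipe", "params": {"screen": "Today", "value": 2}},
]

def evaluate_metric(events, event_name, param_key=None, param_value=None,
                    operator="COUNT", aggregate="SUM"):
    """Filter by Event Name + Condition, apply the Operator per user, then the Aggregate."""
    per_user = {}
    for e in events:
        if e["event_name"] != event_name:
            continue  # Event Name filter
        if param_key is not None and e["params"].get(param_key) != param_value:
            continue  # Condition filter (Param Key must equal Param Value)
        uid = e["user_id"]
        # Operator: COUNT tallies matching events; SUM adds a numeric param value.
        amount = 1 if operator == "COUNT" else e["params"].get("value", 0)
        per_user[uid] = per_user.get(uid, 0) + amount
    values = list(per_user.values())
    if not values:
        return 0
    # Aggregate: SUM totals across users; AVERAGE divides by the number of users.
    return sum(values) if aggregate == "SUM" else sum(values) / len(values)

# Example: count swipes on the Today screen, summed across users -> 2
print(evaluate_metric(events, "swipe", param_key="screen", param_value="Today"))
```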

4. You can add Custom Metrics by creating your own calculation formulas for the specific measurements you need

Some examples of Custom Metrics (the first is sketched below):

  • % of users with >20 swipes

  • % of uninstalls for a Cohort
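
As a rough illustration of the first example, the sketch below computes the share of a cohort with more than 20 swipe events. The cohort, event name, and counts are invented; in the report the custom metric is defined with a formula rather than code.

```python
from collections import Counter

# Hypothetical cohort and swipe log (one entry per swipe event, keyed by user_id).
cohort_users = {"u1", "u2", "u3"}
swipe_log = ["u1"] * 25 + ["u2"] * 8

swipes_per_user = Counter(swipe_log)
heavy_swipers = [u for u in cohort_users if swipes_per_user.get(u, 0) > 20]

pct = 100 * len(heavy_swipers) / len(cohort_users)
print(f"% of users with >20 swipes: {pct:.1f}%")  # 33.3% for this toy data
```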

5. Once you have set your metrics, you can Run Report.

6. Expand on dimensions (First Open dates, Country, App Version, User IDs) to dig deeper into event-based performance.

Track performance changes daily to see whether your newly implemented app/game changes are improving the tracked metrics.
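
As a small illustration of step 6, the sketch below recomputes an adoption-style metric per First Open date, which is essentially what expanding on that dimension gives you. The rows and dates are made up, and in the report this breakdown is a click rather than code.

```python
from collections import defaultdict

# Hypothetical rows: (first_open_date, user_id, clicks_on_the_tracked_feature)
rows = [
    ("2024-05-01", "u1", 3), ("2024-05-01", "u2", 0),
    ("2024-05-02", "u3", 1), ("2024-05-02", "u4", 2), ("2024-05-02", "u5", 0),
]

users_by_day = defaultdict(int)     # cohort size per First Open date
adopters_by_day = defaultdict(int)  # users with at least 1 click, per First Open date
for day, _user, clicks in rows:
    users_by_day[day] += 1
    adopters_by_day[day] += 1 if clicks >= 1 else 0

for day in sorted(users_by_day):
    rate = 100 * adopters_by_day[day] / users_by_day[day]
    print(f"{day}: adoption {rate:.0f}%")
```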


Use Case 1

Scenario

Your team just launched a new Utility app that tracks sunbathing time to boost vitamin D and improve health.

You wonder:

  • How are users using your app? Do they really use the time tracker?

  • Do they explore the app by visiting different pages like the Stats page?

With Feature Adoption, your team can easily answer these questions.

Apply the TARS framework with Feature Adoption (see the TARS Framework section below for more)

Define T, A, R (and S if possible)

T = All new users who install the app from Day X to Day Y

A = Users who use the Start Tracking function at least once (goal: >80%)

R = Users who come back to the app and use the Tracking function again (goal: >30%)

S = Users who give a 5-star rating on the Store

Convert TARS to measurable metrics on Cost Center

T = All new users of selected cohorts

A = Users who have at least 1 click on Start_Tracking

R = Users who have >=2 clicks on Start_Tracking

S = Users who give a 5-star rating on the Store (this cannot be tracked through events, so leave it out)
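
A minimal sketch of how T, A, and R resolve to numbers, assuming per-user counts of Start_Tracking clicks are available; the counts below are invented, and the scenario's goals are read here as shares of T.

```python
# Hypothetical per-user click counts on Start_Tracking for one cohort.
start_tracking_clicks = {"u1": 4, "u2": 1, "u3": 0, "u4": 2}

T = len(start_tracking_clicks)                                # all new users in the cohort
A = sum(1 for c in start_tracking_clicks.values() if c >= 1)  # at least 1 click
R = sum(1 for c in start_tracking_clicks.values() if c >= 2)  # at least 2 clicks

print(f"Adoption:  {100 * A / T:.0f}% (goal >80%)")
print(f"Retention: {100 * R / T:.0f}% (goal >30%)")
```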

Gain insights from data

From this, you can see that >98% of users use the Tracking function at least once (Adoption), and that they keep coming back to it (Retention). It's safe for the team to conclude that users have engaged well with this core feature of the app.


Use Case 2

Scenario

Your core feature is now well adopted by users, and your team wants to start monetizing the app by adding a Premium Subscription. You've added the Premium button in the right corner of the 4 main pages: Today, Stats, Calendar, and Setting

You wonder:

  • Do users click on this button much? If yes, where do they click it from?

  • Which placement generates the most subscription revenue?

Convert to measurable metrics on Cost Center

  • Do users click on this button much? → Users who have at least 1 click on the Premium button

  • If yes, where do they click it from? → Users who have at least 1 click on the Premium button split by location

  • Which placement generates the most subscription revenue? → Subscription split by location
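
A hedged sketch of these splits, assuming a hypothetical premium_click event with a location parameter and a subscribe event carrying revenue; your actual event and parameter names will differ.

```python
from collections import defaultdict

# Hypothetical events; real names depend on your own tracking setup.
events = [
    {"user_id": "u1", "event_name": "premium_click", "location": "Today"},
    {"user_id": "u1", "event_name": "subscribe", "location": "Today", "revenue": 4.99},
    {"user_id": "u2", "event_name": "premium_click", "location": "Stats"},
    {"user_id": "u3", "event_name": "premium_click", "location": "Stats"},
    {"user_id": "u3", "event_name": "subscribe", "location": "Stats", "revenue": 4.99},
]

clicks_by_location = defaultdict(set)      # unique users who clicked Premium, per placement
revenue_by_location = defaultdict(float)   # subscription revenue, per placement

for e in events:
    if e["event_name"] == "premium_click":
        clicks_by_location[e["location"]].add(e["user_id"])
    elif e["event_name"] == "subscribe":
        revenue_by_location[e["location"]] += e["revenue"]

for loc in ("Today", "Stats", "Calendar", "Setting"):
    users = len(clicks_by_location[loc])
    rev = revenue_by_location[loc]
    print(f"{loc:8}  users clicking Premium: {users}  revenue: ${rev:.2f}")
```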

Gain insights from data

You can gauge whether users are interested in the Premium package through the % of users who click the button.

You can then view the revenue generated by each placement to see which one is performing well.


TARS Framework

The TARS Framework is a method used to measure and evaluate the performance of product features based on four key metrics: Target, Adoption, Retention, and Satisfaction.

See also: Feature Launch & Iteration - Feature Retention and Satisfaction (Reforge)

Target: Identifies the users the feature is intended to reach and the objectives it was designed to achieve (in the example above, all new users of the selected cohorts).

Adoption: Tracks how many users are engaging with the feature meaningfully.

Retention: Assesses whether users continue to engage with the feature over time, indicating its long-term value.

Satisfaction: Evaluates user feedback and satisfaction, providing insights into the overall user experience with the feature.
