A/B Experiments on App Pages in RuStore Console
The "Page Experiments" feature lets you run A/B tests on app pages: create two versions of a page (screenshots, videos, descriptions), split traffic between them, gather statistics, and determine the more effective variant.
To access this feature, log in to the RuStore Console and:
- Open the Applications menu and navigate to the App Information section.
- Go to the Page Experiments tab.
- On this tab, you will see a Create button prompting you to start your first experiment.

If experiments already exist, you will see information about ongoing and previous experiments, along with a Create button to start a new one.
With moderation and flexible variant settings, publishers can effectively test and increase their app's conversion rate on RuStore.
Creating an Experiment
- Click the Create button and enter the experiment name.
- Set experiment parameters.

- In variant B (experimental version), you can replace the short and full descriptions, screenshots, and video link.
- After completing the settings, click Submit for moderation. The experiment status will change to Under moderation.
Moderation and Experiment Statuses
- Under moderation: After submission, data is reviewed by RuStore moderators.
- Published (Active): The experiment is running, and traffic is distributed according to the set percentages.
- Calculating Results: After the experiment ends, the system calculates results, which become available for analysis.
- Moderation failed: The experiment didn't pass moderation; corrections are needed before resubmission.
- Rejected: The experiment was rejected; you need to make adjustments and resubmit it.
- Completed: The experiment has ended, either automatically (on reaching the time limit or the specified number of events) or manually. Statistics remain available for analysis.
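The status flow above can be sketched as a small transition map. This is an assumption based only on the statuses listed here, not a documented RuStore state machine; the "Draft" starting state is hypothetical.

```python
# Illustrative sketch of the experiment status flow (transition rules are
# an assumption, not a RuStore contract; "Draft" is a hypothetical start state).
ALLOWED_TRANSITIONS = {
    "Draft": {"Under moderation"},
    "Under moderation": {"Published", "Moderation failed", "Rejected"},
    "Moderation failed": {"Under moderation"},    # correct and resubmit
    "Rejected": {"Under moderation"},             # adjust and resubmit
    "Published": {"Calculating results"},         # time/event limit reached or stopped manually
    "Calculating results": {"Completed"},
    "Completed": set(),                           # terminal: statistics stay available
}

def can_transition(current: str, target: str) -> bool:
    """Return True if moving from `current` to `target` fits the sketched flow."""
    return target in ALLOWED_TRANSITIONS.get(current, set())
```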
Viewing Results
After the experiment launches, the following information appears for each variant:
- Number of views, app installations, and events;
- Current conversion rate (if corresponding metrics are available).

On the detailed experiment page, you can see the settings for each version, along with performance metrics over time and moderator comments.
Early Completion
You can end the experiment early by clicking End early.
The system will then ask which variant to set as the primary one.
After confirmation, the experiment status changes to Completed.

When the experiment ends, including by auto-completion, the system keeps recalculating results for one additional day to account for delayed event tracking. On automatic completion, you can roll out variant B as the primary version if you find it more effective; by default, variant A is applied. Note: this rule applies only to the last active experiment.
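The one-day recalculation window can be sketched as a simple attribution check: an event still counts toward the results if it arrives no later than a day after the experiment's end. This is an illustrative model of the rule described above, not RuStore's actual implementation.

```python
from datetime import datetime, timedelta, timezone

# Delayed events are still attributed for one extra day after the end (sketch).
RECALC_WINDOW = timedelta(days=1)

def counts_toward_results(event_time: datetime, experiment_end: datetime) -> bool:
    """True if the event falls within the experiment plus the recalculation window."""
    return event_time <= experiment_end + RECALC_WINDOW

end = datetime(2024, 1, 1, tzinfo=timezone.utc)  # hypothetical end time
late_event = end + timedelta(hours=20)
counts_toward_results(late_event, end)  # True: still inside the extra day
```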
Restrictions
- Experiments do not block creating new app versions: you can continue releasing updates or editing fields not involved in the experiment.
- If changes are needed to fields participating in the experiment (short and full descriptions, screenshots, and video links), stop the experiment to avoid skewing results.
- During an active experiment, you cannot publish a "partial new version" (i.e., a release targeted at a subset of users), as this breaks the correct audience distribution logic. To do a partial release, you must first end the experiment.
- For app versions created before the experiment's launch and marked as Ready for publication, fields will be overwritten according to the experiment's parameters.
- Experiment management is available to users with the roles: owner, administrator, release manager, and developer. See details in User Roles.
- Currently, experiments are unavailable for TV versions.
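The audience-distribution logic that a partial release would break relies on each user being assigned one variant consistently. A common way to achieve that is deterministic hash bucketing, sketched below; this is purely illustrative, since RuStore's actual assignment mechanism is not described here.

```python
import hashlib

def assign_variant(user_id: str, percent_b: int) -> str:
    """Deterministically assign a user to variant A or B (illustrative sketch).

    The same user always lands in the same bucket, so the traffic split
    stays stable for as long as `percent_b` is unchanged.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % 100  # stable bucket in [0, 100)
    return "B" if bucket < percent_b else "A"

assign_variant("user-42", 50)  # same answer every call for the same user id
```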