The Synerise AI Developer Hub

Welcome to the Synerise AI developer hub. You'll find comprehensive guides and documentation to help you start working with the Synerise AI API as quickly as possible, as well as support if you get stuck.
Let's jump right in!

A/B/X tests


A/B/X testing is a method of comparing two or more versions of a webpage or app against each other. You show one version to a percentage of your visitors or users, and the other version or versions (depending on how many you are testing) to the remaining percentages. Analytics then reveal which version delivers the best conversion or generates the highest revenue, or that there is no difference between the versions.
The advantages of this test are that:

  • The versions are tested at the same time, so external influences do not distort the results.
  • The results are based only on the preferences of your website visitors or app users; there is no guesswork in deciding which version to choose.
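The comparison step described above can be sketched in a few lines. The visitor and conversion counts below are made-up example data, not Synerise output:

```python
# Illustrative sketch: comparing conversion rates across A/B/X test groups.
# All numbers are invented example data.

def conversion_rate(conversions, visitors):
    """Fraction of visitors who converted."""
    return conversions / visitors if visitors else 0.0

results = {
    "A": {"visitors": 5000, "conversions": 250},
    "B": {"visitors": 3000, "conversions": 180},
    "C": {"visitors": 2000, "conversions": 100},
}

for group, stats in results.items():
    rate = conversion_rate(stats["conversions"], stats["visitors"])
    print(f"Group {group}: {rate:.1%} conversion")

# Pick the group with the highest conversion rate.
best = max(
    results,
    key=lambda g: conversion_rate(results[g]["conversions"], results[g]["visitors"]),
)
print(f"Best-performing variant: {best}")  # -> Best-performing variant: B
```

In practice the Synerise analytics dashboard performs this comparison for you; the sketch only shows the underlying arithmetic.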

Creating A/B/X tests

To conduct an A/B/X test, you must have previously created campaigns (you can also duplicate an existing one).
Then choose the Campaigns option from the Synerise platform dashboard. Next, from the four options, choose A/B/X tests.

The page with existing tests appears. Click the "Create a new A/B/X test" button.

To create the test, provide the test Name, a campaign for group A, and a campaign for group B, then choose what percentage of your audience should see campaign A and what percentage should see campaign B.

To add another group and assign it another campaign, click "Add a new campaign" and repeat the steps as for groups A and B. You can then adjust the percentages for all three groups. If the sum of the percentages exceeds 100%, you will be notified.
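The percentage check mentioned above can be pictured as follows. This is a minimal sketch of the validation logic, not Synerise's actual code:

```python
# Sketch: validating that group percentages form a valid audience split.

def validate_split(percentages):
    """Return True if the group percentages sum to exactly 100.

    Raises ValueError if the sum exceeds 100, mirroring the
    warning the test-creation form shows in that case.
    """
    total = sum(percentages.values())
    if total > 100:
        raise ValueError(f"Percentages sum to {total}%, which exceeds 100%")
    return total == 100

print(validate_split({"A": 50, "B": 30, "C": 20}))  # -> True
```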

The randomization option is Sequential, which means each new visitor is randomly assigned to a group; from time to time, the distribution across groups is checked, and the random assignment is adjusted to keep the exact percentages specified in the settings.
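One way to picture this behavior is an assigner that picks a group at random but weights its choices toward whichever groups currently lag their configured share. This is a minimal sketch of the idea, not the platform's actual implementation:

```python
import random

class SequentialAssigner:
    """Sketch: randomly assigns visitors to groups while nudging the
    realized distribution toward the configured target shares."""

    def __init__(self, targets):
        # targets: mapping of group name -> target share (summing to 1.0)
        self.targets = targets
        self.counts = {g: 0 for g in targets}

    def assign(self):
        total = sum(self.counts.values())
        if total == 0:
            weights = dict(self.targets)
        else:
            # Weight each group by how far it currently lags its target;
            # the small floor keeps every group selectable.
            weights = {
                g: max(self.targets[g] - self.counts[g] / total, 0.001)
                for g in self.targets
            }
        group = random.choices(list(weights), weights=list(weights.values()))[0]
        self.counts[group] += 1
        return group

assigner = SequentialAssigner({"A": 0.5, "B": 0.3, "C": 0.2})
for _ in range(10000):
    assigner.assign()
print(assigner.counts)  # counts stay roughly proportional to the targets
```

Individual visitors are still assigned randomly, but over time the realized split tracks the 50/30/20 configuration closely.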
In the last step, you must choose the start date; specifying the end date is optional.
When you are done with the settings, click Save.
