Optimizely integration


Optimizely is an A/B testing tool that lets you display different content to your website visitors and analyze which of your experiments works best for you.

These one-off experiments are called A/B tests, as opposed to feature tests that run on features you've already flagged. With A/B tests, you define two or more variation keys and then implement a different code path for each variation.

From the Optimizely interface, you can determine which users are eligible for the experiment and how to split traffic between the variations, as well as the metrics you'll use to measure each variation's performance.

 

Easy to activate

1. Log in to Freespee, open the Workflow tab, and search for the Optimizely integration in the Integrations tab.

 

Screenshot_2021-12-15_at_13.48.29.png

 

 

2. Find the Optimizely integration in the overview and click it to begin setup.

 

Screenshot_2021-12-15_at_13.45.43.png

 

 

3. Enter your Event ID, then enable the Optimizely integration by clicking Save changes:

 

Screenshot_2021-12-15_at_13.44.36.png

 

4. Go to your experiment in Optimizely to retrieve your Metric ID:

 

To continue, you will need your Metric ID, which you can find in your experiment in Optimizely.

 

The next few steps explain how to create a new experiment and where to retrieve this Metric ID.

 

5. Select A/B Test in your project:

In the Experiments tab, click Create New Experiment and select A/B Test.

 

experiments.png

 

 

6. Set an experiment key:

  • Specify an experiment key.
  • Your experiment key must contain only alphanumeric characters, hyphens, and underscores. The key must also be unique for your Optimizely project so you can correctly disambiguate experiments in your application.
  • Don’t change the experiment key without making the corresponding change in your code.
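The key rule above can be checked with a simple pattern. This is only an illustrative sketch; `isValidExperimentKey` is a hypothetical helper, not part of the Optimizely SDK or UI:

```javascript
// Validates the experiment-key rule stated above: alphanumeric
// characters, hyphens, and underscores only.
// isValidExperimentKey is a hypothetical helper for illustration.
function isValidExperimentKey(key) {
  return /^[A-Za-z0-9_-]+$/.test(key);
}

isValidExperimentKey('checkout_cta-v2'); // true
isValidExperimentKey('checkout cta!');   // false – space and "!" are not allowed
```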

 

new_test.png

 

7. Set experiment traffic allocation:

 

The traffic allocation is the fraction of your total traffic to include in the experiment, specified as a percentage.

 

You can stick with the default 50%/50% split that Optimizely sets you up with, or you can increase the traffic allocation to reach statistical significance faster.

 

traffic.png

 

 

 

8. Set variation keys and traffic distribution: 

 

Variations are the different code paths you want to experiment on. Enter a unique variation key to identify the variation in the experiment and optionally a short, human-readable description for reporting purposes.

 

You must specify at least one variation. There’s no limit to how many variations you can create.

 

Use the Distribution Mode dropdown to select how you distribute traffic between your variations:

  • Manual - By default, variations are given equal traffic distribution. Customise this value for your experiment's requirements.
  • Stats Accelerator - To get to statistical significance faster or to maximise the return of the experiment, use Optimizely’s machine learning engine, the Stats Accelerator.

 

Distribute.png

 

9. Add a Metric: 

 

Add events that you’re tracking with the Optimizely SDKs as metrics to measure impact. Whether you use existing events or create new events to use as metrics, you must add at least one metric to an experiment. To re-order the metrics, click and drag them into place.

 

The top metric in an experiment is the primary metric. Stats Engine uses the primary metric to determine whether an A/B test wins or loses overall.

 

metric.png

 

10. Complete your experiment setup:

 

Click Create Experiment to complete your experiment setup.

 

11. Implement the code sample into your application:

 

Make sure the Optimizely and Freespee scripts are both installed on all your web pages.

Once you've defined an A/B test, you'll see a code sample for implementing it in your application.

 
experiment_code.png
For each A/B test, you use the Activate method to decide which variation a user falls into, then use an if statement to apply the code for that variation. See the example below.

 

Screenshot_2021-12-15_at_12.43.57.png

The Activate method:

  • Evaluates whether the user is eligible for the experiment and returns a variation key.
  • Sends an event to Optimizely to record that the current user has been exposed to the A/B test.
  • Returns null if any of the conditions for the experiment aren't met. Make sure that your code adequately handles this default case; in general, you'll want to run the baseline experience.
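Putting the points above together, here is a minimal sketch of the activate-then-branch pattern. Note that `activate()` below is a local stand-in for the Optimizely SDK call, and the experiment key, variation keys, and user ID are placeholders:

```javascript
// Stand-in for the Optimizely SDK's Activate method, which buckets the
// user, sends an impression event, and returns a variation key or null.
// The real bucketing logic (hashing, audience conditions, traffic
// allocation) lives inside the SDK.
function activate(experimentKey, userId) {
  const variations = ['variation_a', 'variation_b'];
  return variations[userId.length % variations.length];
}

// One code path per variation, plus a baseline fallback for the
// null case where the user is not in the experiment.
function applyVariation(variationKey) {
  if (variationKey === 'variation_a') {
    return 'variant A experience';
  } else if (variationKey === 'variation_b') {
    return 'variant B experience';
  }
  // activate() returns null when the experiment conditions aren't met;
  // run the baseline experience by default.
  return 'baseline experience';
}

applyVariation(activate('my_experiment', 'user_123'));
```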

 

 

All done!

 

How it works

  • Freespee captures Optimizely identifiers for each web visitor and serves the visitor a call extension.
  • When a call is made, Freespee sends conversion data back to Optimizely.
  • A goal is registered in the Optimizely platform.

 

What we send:

  • Optimizely Project ID
  • Optimizely Experiments Data
  • Optimizely Segments Data
  • Optimizely End User ID
  • Optimizely Goal ID (if set)
  • Call value (if exists)

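To make the list above concrete, the object below sketches one possible shape for such a conversion payload. Every field name here is a hypothetical stand-in; the actual format Freespee sends is internal and not documented in this article:

```javascript
// Hypothetical conversion payload – field names and structure are
// illustrative stand-ins only, mirroring the "What we send" list above.
const conversionPayload = {
  projectId: '12345',                              // Optimizely Project ID
  experiments: { my_experiment: 'variation_a' },   // Optimizely Experiments Data
  segments: { browser: 'chrome' },                 // Optimizely Segments Data
  endUserId: 'oeu-abc123',                         // Optimizely End User ID
  goalId: '67890',                                 // Optimizely Goal ID (if set)
  callValue: 25.0                                  // Call value (if exists)
};
```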
 

For more information, please refer to the Optimizely A/B test article:

https://docs.developers.optimizely.com/full-stack/docs/run-a-b-tests 

 

 
