Integrate Contentsquare with Optimizely Web Experimentation


Contentsquare provides heatmaps, session replays, and journey analysis that reveal how visitors interact with your pages. Integrating Contentsquare with Optimizely Web Experimentation lets you segment all of that behavioral data by experiment variation, so you can see exactly how each variation changes user behavior beyond conversion metrics.

This guide covers the full integration using Optimizely's Custom Analytics Integration (JSON plugin) and Contentsquare's Dynamic Variables API. You will configure Dynamic Variables, build the integration JSON (which includes the track_layer_decision callback), and validate that data flows correctly.

How the Integration Works

When Optimizely buckets a visitor into an experiment or personalization campaign, it fires a track_layer_decision callback. The integration catches this callback, builds a key-value pair following the AB_OP_ naming convention, and sends it to Contentsquare using the _uxa.push(["trackDynamicVariable", ...]) API. Contentsquare then associates that Dynamic Variable with the visitor's session, making it available for segmentation across heatmaps, session replays, and journey analysis.

flowchart LR
    A[Visitor lands on page] --> B[Optimizely makes bucketing decision]
    B --> C["track_layer_decision callback fires"]
    C --> D["Build AB_OP_ key-value pair"]
    D --> E["_uxa.push trackDynamicVariable"]
    E --> F[Contentsquare receives DVAR]
    F --> G[Segment heatmaps, replays, journeys]

The track_layer_decision callback provides these variables:

| Variable | Type | Description |
| --- | --- | --- |
| campaignId | string | The campaign (layer) ID in Optimizely |
| experimentId | string | The experiment ID within the campaign |
| variationId | string | The assigned variation ID |
| isHoldback | boolean | Whether the visitor is in the holdback group |
| campaign | object | Full campaign object with name and policy |
| extension | object | Custom integration settings from the form schema |

The naming convention for the Dynamic Variable key is:

AB_OP_{campaignId}_{experimentId}

The value is the variationId (or "holdback" for holdback visitors). This convention lets Contentsquare organize Optimizely data consistently and enables cross-experiment analysis.
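As a sketch, the convention reduces to a small pure function. The IDs below are hypothetical examples, not real Optimizely IDs:

```javascript
// Build the AB_OP_ Dynamic Variable key-value pair for a bucketing
// decision. IDs are hypothetical examples.
function buildDvar(campaignId, experimentId, variationId, isHoldback) {
  return {
    key: "AB_OP_" + campaignId + "_" + experimentId,
    value: isHoldback ? "holdback" : String(variationId)
  };
}

console.log(buildDvar("12345", "67890", "99001", false));
// → { key: 'AB_OP_12345_67890', value: '99001' }
console.log(buildDvar("12345", "67890", "99001", true).value);
// → 'holdback'
```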

Prerequisites

Before starting the integration, confirm the following:

  • Contentsquare tag is installed and active on the pages where you run experiments. The tag must load before or alongside the Optimizely snippet.

  • Optimizely Web Experimentation snippet is deployed on the same pages.

  • Admin access to both your Optimizely project (Settings > Integrations) and your Contentsquare workspace (Dynamic Variables configuration).

  • Dynamic Variables module is enabled in your Contentsquare account. Contact your Contentsquare CSM if you do not see Dynamic Variables in your workspace settings.

Step 1: Configure Contentsquare Dynamic Variables

Dynamic Variables (DVARs) in Contentsquare are custom key-value pairs attached to a visitor's session. You need to reserve a variable range for Optimizely data.

  1. Log into your Contentsquare workspace.

  2. Navigate to Console > Dynamic Variables (or ask your CSM to enable the module if it is not visible).

  3. Reserve the AB_OP_* key prefix for Optimizely experiments in your naming-convention documentation so other teams do not reuse it. For example, if you plan to run up to 20 concurrent experiments, document that range of keys up front.

  4. Verify that your Contentsquare tag version supports the trackDynamicVariable command. All Contentsquare tags from 2022 onward support this API.

No specific configuration is needed inside Contentsquare beyond ensuring the module is enabled. The trackDynamicVariable API accepts any key-value pair at runtime, and Contentsquare indexes it automatically.
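At runtime, sending a Dynamic Variable is a single queued command. A minimal sketch follows; the key and value are hypothetical, and `window` is stubbed so the snippet also runs outside a browser:

```javascript
// Queue a Dynamic Variable for Contentsquare. Before the tag loads,
// _uxa is a plain array that buffers commands; the tag drains the
// buffer when it initializes. "w" stubs window outside the browser.
var w = typeof window !== "undefined" ? window : {};

w._uxa = w._uxa || [];
w._uxa.push(["trackDynamicVariable", {
  key: "AB_OP_example",   // hypothetical key
  value: "variation_1"    // hypothetical value
}]);

console.log(w._uxa.length >= 1); // true — the command is queued
```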

Step 2: Create the JSON Integration

Optimizely Web Experimentation uses a JSON plugin system for custom analytics integrations. The JSON includes the integration metadata, optional form fields, and the track_layer_decision callback code in the options field. Pasting a single JSON block creates a complete, reusable integration that can be toggled on or off per experiment.

  1. In your Optimizely project, go to Settings > Integrations.

  2. Click Create Analytics Integration > Using JSON.

  3. Paste the following configuration:

{
  "plugin_type": "analytics_integration",
  "name": "Contentsquare (Dynamic Variables)",
  "form_schema": [
    {
      "default_value": "1",
      "field_type": "dropdown",
      "name": "Custom Variable Slot",
      "api_name": "customVar",
      "options": {
        "1": "Slot 1",
        "2": "Slot 2",
        "3": "Slot 3",
        "4": "Slot 4",
        "5": "Slot 5"
      },
      "description": "Reserved Contentsquare Dynamic Variable slot for this experiment"
    }
  ],
  "description": "Sends Optimizely experiment data to Contentsquare as Dynamic Variables",
  "options": {
    "track_layer_decision": "// campaignId, experimentId, variationId, and isHoldback\n// are provided by Optimizely in the callback's scope.\nvar csKey = \"AB_OP_\" + campaignId + \"_\" + experimentId;\nvar csValue = isHoldback ? \"holdback\" : String(variationId);\n\nsendToCS(csKey, csValue);\n\nfunction sendToCS(key, value) {\n  window._uxa = window._uxa || [];\n  window._uxa.push([\"trackDynamicVariable\", {\n    key: key,\n    value: value\n  }]);\n}\n"
  }
}
  4. Click Save Integration.

The form_schema creates a dropdown in the Optimizely UI that lets you select a DVAR slot per experiment. This is optional but useful if you want to separate different experiments into different Contentsquare variable slots. Many teams skip the slot system and use the AB_OP_ key directly.

The options.track_layer_decision field contains the JavaScript callback that fires each time Optimizely makes a bucketing decision for the visitor. Here is what the callback does:

  • csKey: Follows the AB_OP_{campaignId}_{experimentId} convention, creating a unique identifier per experiment.

  • csValue: Sends the variation ID so Contentsquare can distinguish between control and treatment groups. Holdback visitors get the value "holdback".

  • sendToCS(): Wraps the _uxa.push call. The window._uxa = window._uxa || [] pattern ensures the code does not error if the Contentsquare tag has not yet loaded — commands are queued and processed once the tag initializes.
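The queue-then-drain behavior behind that pattern can be simulated directly. This is an illustrative sketch, not Contentsquare's actual tag code; `fakeTagInit` is a hypothetical stand-in for what the real tag does on load:

```javascript
// Why "window._uxa = window._uxa || []" is safe: before the tag
// loads, push() just buffers commands in a plain array; once a
// tag-like consumer initializes, it drains the buffer and replaces
// push() so later commands dispatch immediately.
var q = []; // stands in for window._uxa
q.push(["trackDynamicVariable", { key: "AB_OP_1_2", value: "3" }]); // buffered

var dispatched = [];
function fakeTagInit(queue) {
  // Drain anything queued before the tag loaded...
  queue.forEach(function (cmd) { dispatched.push(cmd); });
  // ...then intercept future pushes for immediate dispatch.
  queue.push = function (cmd) { dispatched.push(cmd); return 0; };
}
fakeTagInit(q);

q.push(["trackDynamicVariable", { key: "AB_OP_1_2", value: "4" }]); // immediate
console.log(dispatched.length); // 2
```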

Step 3: Enable and Assign the Integration

After saving the integration:

  1. Toggle the integration to Enabled in Settings > Integrations.

  2. To apply it to all new experiments by default, check Enable for all new experiments.

  3. For existing experiments, go to each experiment's Manage Campaign > Integrations tab and enable the Contentsquare integration.

  4. Optionally select the DVAR slot from the dropdown if you configured the customVar field in the form schema.

Once enabled, every new bucketing decision triggers the track_layer_decision code and sends data to Contentsquare.

Advanced: Human-Readable Names

By default, the integration sends numeric IDs (campaign ID, experiment ID, variation ID). These are reliable but hard to read in Contentsquare reports. You can enhance the integration to send human-readable names instead.

Optimizely's state.getDecisionObject() API provides experiment and variation names. However, be aware of the Mask descriptive names privacy setting in Optimizely. When this setting is enabled, getDecisionObject() returns hashed values instead of readable names.

To add human-readable names, modify the track_layer_decision code:

// campaignId, experimentId, variationId, and isHoldback are
// provided by Optimizely in the callback's scope.

// Get human-readable names from Optimizely state
var state = window.optimizely && window.optimizely.get("state");
var experimentName = "";
var variationName = "";

if (state) {
  try {
    var decisionObj = state.getDecisionObject({ campaignId: campaignId });
    if (decisionObj) {
      experimentName = decisionObj.experiment || "";
      variationName = decisionObj.variation || "";
    }
  } catch (e) {
    // Fallback to IDs if state API fails
  }
}

// Build key with readable name or fallback to IDs
var csKey = experimentName
  ? "AB_OP_" + experimentName.replace(/\s+/g, "_")
  : "AB_OP_" + campaignId + "_" + experimentId;

var csValue = isHoldback
  ? "holdback"
  : (variationName || String(variationId));

window._uxa = window._uxa || [];
window._uxa.push(["trackDynamicVariable", { key: csKey, value: csValue }]);

The following table compares naming strategies:

| Strategy | Key Example | Value Example | Pros | Cons |
| --- | --- | --- | --- | --- |
| ID-based | AB_OP_12345_67890 | 99001 | Always works, no privacy issues | Hard to read in reports |
| Name-based | AB_OP_Homepage_Hero_Test | Blue_CTA_Button | Easy to read | Breaks if "Mask descriptive names" is on |
| Hybrid | AB_OP_12345_Homepage_Hero | 99001_Blue_CTA | Readable with ID fallback | Slightly longer keys |

For most teams, the hybrid approach provides the best balance. IDs ensure uniqueness while names add readability.
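A sketch of the hybrid key builder, with whitespace sanitized so names stay valid in Contentsquare reports (the campaign ID and name are hypothetical):

```javascript
// Hybrid naming: stable numeric ID plus a readable name when one is
// available, falling back to the bare ID otherwise.
function hybridKey(campaignId, experimentName) {
  var safeName = (experimentName || "").replace(/\s+/g, "_");
  return safeName
    ? "AB_OP_" + campaignId + "_" + safeName
    : "AB_OP_" + campaignId;
}

console.log(hybridKey("12345", "Homepage Hero")); // AB_OP_12345_Homepage_Hero
console.log(hybridKey("12345", ""));              // AB_OP_12345 (ID fallback)
```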

Advanced: Custom Segmentation

Beyond basic variation tracking, you can build richer segments in Contentsquare using additional Optimizely data.

Segment by Campaign Type

Optimizely distinguishes between A/B tests and personalization campaigns via the campaign.policy field:

var campaignType = campaign.policy; // "single_experiment" or "multivariate"

window._uxa = window._uxa || [];
window._uxa.push(["trackDynamicVariable", {
  key: "OP_campaign_type",
  value: campaignType
}]);

This lets you filter Contentsquare data by experiment type, useful when you want to analyze personalization performance separately from A/B tests.

Track Holdback Groups

Holdback visitors are excluded from personalization campaigns to measure the campaign's incremental lift. Track them as a separate segment:

if (isHoldback) {
  window._uxa = window._uxa || [];
  window._uxa.push(["trackDynamicVariable", {
    key: "OP_holdback",
    value: String(campaignId)
  }]);
}

Multi-Experiment Visitors

When visitors participate in multiple concurrent experiments, create a combined DVAR to track the overlap:

// After collecting all active decisions
var state = window.optimizely && window.optimizely.get("state");
if (state) {
  var campaigns = state.getCampaignStates({ isActive: true });
  var experimentIds = Object.keys(campaigns).sort().join("_");

  window._uxa = window._uxa || [];
  window._uxa.push(["trackDynamicVariable", {
    key: "OP_multi_exp",
    value: experimentIds
  }]);
}

This lets you segment session replays by visitors in specific experiment combinations, revealing potential interaction effects.

A/A Testing Validation

Before relying on the integration for production experiments, run an A/A test to validate that data flows correctly and no systematic bias exists between control and variation in Contentsquare.

A/A Test Methodology

  1. Create a new A/B test in Optimizely with two identical variations (no code changes).

  2. Set traffic allocation to 50/50.

  3. Enable the Contentsquare integration for this experiment.

  4. Run the test for at least 7 days or until you reach 1,000 visitors per variation.

  5. In Contentsquare, compare the following metrics between variations:

    • Session duration

    • Pages per session

    • Click rate on key elements

    • Scroll depth

  6. Both variations should show statistically similar behavior. Differences greater than 5% on any metric suggest a timing or bucketing issue.
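The 5% comparison in step 6 can be automated with a relative-difference check. The metric values below are hypothetical sample numbers, not thresholds from Contentsquare:

```javascript
// Flag A/A metrics whose relative difference between the two
// (identical) variations exceeds 5%. Sample values are hypothetical.
function percentDiff(a, b) {
  return Math.abs(a - b) / ((a + b) / 2) * 100;
}

var metrics = {
  sessionDuration: [184, 181], // seconds
  pagesPerSession: [3.2, 3.1],
  scrollDepth: [62, 68]        // percent of page height
};

for (var name in metrics) {
  var d = percentDiff(metrics[name][0], metrics[name][1]);
  console.log(name, d.toFixed(1) + "%", d > 5 ? "INVESTIGATE" : "ok");
}
```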

flowchart TD
    A[Create A/A test in Optimizely] --> B[Enable Contentsquare integration]
    B --> C[Run for 7+ days / 1000+ visitors per variation]
    C --> D{Metrics within 5% between variations?}
    D -->|Yes| E[Integration validated - proceed with real experiments]
    D -->|No| F[Check tag load order and timing]
    F --> G[Verify _uxa commands arrive before CS processes]
    G --> H[Re-run A/A test]

Validating the Integration

After setting up the integration, verify that data reaches Contentsquare correctly.

Console Verification

Open your browser's developer console on a page with an active experiment and run:

// Check if Contentsquare tag is loaded
console.log("CS tag loaded:", typeof window._uxa !== "undefined");

// Inspect queued commands
if (window._uxa) {
  window._uxa.forEach(function(cmd, i) {
    if (cmd[0] === "trackDynamicVariable") {
      console.log("DVAR #" + i + ":", JSON.stringify(cmd[1]));
    }
  });
}

You should see output like:

CS tag loaded: true
DVAR #3: {"key":"AB_OP_12345_67890","value":"99001"}

Contentsquare Tag Assistant

Install the Contentsquare Tag Assistant browser extension. It displays:

  • Whether the CS tag is active on the current page

  • All Dynamic Variables sent during the session

  • Any errors in variable formatting

Look for your AB_OP_ variables in the Tag Assistant's Dynamic Variables panel.

Optimizely State Diagnostic

To verify that Optimizely is providing the correct decision data, run this diagnostic in the console:

var state = window.optimizely && window.optimizely.get("state");
if (state) {
  var campaigns = state.getCampaignStates({ isActive: true });
  for (var id in campaigns) {
    var c = campaigns[id];
    console.log("Campaign " + id + ":", {
      experimentId: c.experiment && c.experiment.id,
      variationId: c.variation && c.variation.id,
      isHoldback: c.isInCampaignHoldback
    });
  }
} else {
  console.log("Optimizely state not available");
}

This confirms that the visitor has been bucketed and that the experiment/variation IDs match what should be sent to Contentsquare.

Analyzing Experiments in Contentsquare

Once data flows into Contentsquare, you can analyze experiment impact through several tools.

Creating Segments by Variation

  1. In Contentsquare, go to Segments.

  2. Create a new segment using Dynamic Variables as the condition.

  3. Set the key to your experiment's DVAR key (e.g., AB_OP_12345_67890).

  4. Set the value to a specific variation ID (e.g., 99001 for control).

  5. Repeat for each variation. Name the segments descriptively (e.g., "Homepage Hero - Control" and "Homepage Hero - Variation A").

Session Replay Analysis

Apply your variation segments to session replays to watch how users interact with each variation:

  1. Go to Session Replay and apply the control segment.

  2. Watch 10-20 sessions to identify common behavior patterns.

  3. Switch to the variation segment and repeat.

  4. Look for differences in scroll behavior, click patterns, rage clicks, and form interactions.

Session replays reveal qualitative insights that aggregate metrics miss. A variation might have a higher conversion rate but show increased confusion through rage clicks or hesitation patterns.

Journey Analysis per Variation

Use journey analysis to compare how users navigate through your site differently based on their variation:

  1. Go to Journey Analysis.

  2. Apply the control segment and map the visitor flow through key pages.

  3. Switch to the variation segment and compare the flow.

  4. Look for changes in drop-off points, navigation patterns, and page visit sequences.

Zone-Based Heatmaps

Heatmaps segmented by variation reveal exactly where interaction patterns differ:

  1. Go to Zoning Analysis for the page where the experiment runs.

  2. Apply the control segment and note click rates and engagement zones.

  3. Switch to the variation segment and compare.

  4. Pay attention to click distribution changes, exposure rate differences, and attractiveness metrics per zone.

Troubleshooting

Dynamic Variables Not Appearing in Contentsquare

If DVARs are not visible in Contentsquare:

  • Tag load order: The Contentsquare tag must be loaded before _uxa.push is called, or the window._uxa = window._uxa || [] queue pattern must be in place (the integration code above includes this pattern).

  • Processing delay: Contentsquare processes Dynamic Variables with a delay. New DVARs may take up to 24 hours to appear in the UI for segmentation.

  • Module not enabled: The Dynamic Variables module must be activated for your workspace. Contact your CSM if you do not see it.

Integration Not Firing

If the track_layer_decision callback does not execute:

  • Integration not enabled: Verify the integration is toggled on in Settings > Integrations and enabled for the specific experiment.

  • Visitor not bucketed: The callback only fires when Optimizely makes a bucketing decision. If the visitor does not meet audience conditions, the callback does not fire.

  • Snippet not loaded: The Optimizely snippet must be present and active on the page. Check for JavaScript errors that might prevent it from loading.

Data Discrepancies Between Platforms

Differences between Optimizely visitor counts and Contentsquare session counts are expected:

  • Counting unit: Optimizely counts unique visitors, while Contentsquare counts sessions. One visitor can have multiple sessions.

  • Tag blocking: Ad blockers or privacy tools may block the Contentsquare tag but not the Optimizely snippet (or vice versa), causing different visitor populations.

  • SPA navigation: In single-page applications, ensure the Contentsquare tag handles virtual pageviews. Missing virtual pageview tracking can cause underreporting.

  • Sampling: Contentsquare may apply session sampling on high-traffic sites.

Expect discrepancies of 5-15% between platforms. Investigate further if differences exceed 20%.
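For the SPA case above, a sketch of firing a virtual pageview on a route change follows. Confirm the `trackPageview` command against your Contentsquare tag version's documentation; `window` is stubbed with a hypothetical route so the snippet also runs outside a browser:

```javascript
// Notify Contentsquare of an SPA route change as a virtual pageview.
// In production, call this from your router's navigation hook.
var w = typeof window !== "undefined"
  ? window
  : { location: { pathname: "/checkout", hash: "" } }; // hypothetical route

function trackVirtualPageview() {
  w._uxa = w._uxa || [];
  w._uxa.push(["trackPageview", w.location.pathname + w.location.hash]);
}

trackVirtualPageview();
console.log(w._uxa[w._uxa.length - 1]); // [ 'trackPageview', '/checkout' ]
```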

Masked Descriptive Names

If you use the human-readable names approach and see hashed values instead of experiment/variation names:

  • Check the Mask descriptive names setting in your Optimizely project under Settings > Privacy.

  • When this setting is enabled, getDecisionObject() returns obfuscated names.

  • Switch to the ID-based or hybrid naming strategy if you cannot disable masking.