Sitecore’s Corporate Marketing team brought in SBOS to support updating Sitecore.com this year. This blog is part of a series outlining how SBOS supported the optimization of the site for business value.
After developing a strategic roadmap at an onsite workshop, we deployed multiple new personalization scenarios across the site to optimize for business value. We also refined existing taxonomies and configured foundation data points in the Marketing Control Panel, including Goals, Campaign taxonomy, and Profiles for pattern matching.
The alignment of goal conversions (and their relative value) across the site established baseline engagement metrics like Value per Visit. With the ability to measure the engagement of visitors and segments on the site, we were ready to explore A/B testing.
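For readers new to the metric, Value per Visit is simply the total engagement value accumulated by goal conversions divided by the number of visits in the period. A minimal sketch (the function name and sample figures are illustrative, not taken from Sitecore):

```python
def value_per_visit(total_engagement_value, total_visits):
    """Value per Visit: total engagement value earned by goal
    conversions, divided by the number of visits in the period.
    Returns 0.0 when there are no visits to avoid dividing by zero."""
    return total_engagement_value / total_visits if total_visits else 0.0

# Example: 300 visits that earned 4,500 engagement value points
print(value_per_visit(4500, 300))  # 15.0
```

Tracking this baseline per segment is what makes it possible to say whether a test variant actually moved engagement.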
We began with a call to align teams for launching A/B tests on several pages to improve conversion rates of specific goals. On that call, we reviewed a few concepts covered in this video, including launching a test, measuring the results, and organizing an ongoing testing program.
Content testing in Sitecore
In Sitecore, the Experience Optimization tool sits directly on the Launchpad, and within the Sitecore lexicon the term is synonymous with testing. Experience Optimization allows marketers to show different experiences — different versions of content — to learn what best drives engagement on the site. The testing capabilities in Sitecore are flexible. I like to look at the options for testing in two categories: single-variable tests and multiple-variable tests.
I recommend starting with a single-variable test.
The most common test is a simple component test, started by creating a few different variants of a single component on a page and testing them against each other. Marketers can also test entire page layouts with a version test or a page-substitution test. A version test compares different versions of a page, while a page-substitution test tracks the effect of completely different pages in the site structure.
Whether components, versions, or entire pages, all three of these tests rotate through variants of a single variable.
The last two types of tests are multiple-variable tests: personalization tests and multivariate tests.
Personalization tests can be launched on components where personalization rules are already configured, and they help you better understand the effect of personalized content. When you launch a personalization test, Sitecore automatically “holds back” half of the traffic that meets each condition as a control group; this control group is exposed to the default variant. I plan to describe the hold-back strategy and the measurement of personalization tests in an upcoming blog dedicated to this topic.
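To make the hold-back idea concrete, here is a minimal sketch of the splitting logic — my own illustration, not Sitecore's internal implementation. Visitors who match a personalization rule are randomly split 50/50 between the personalized variant and a control group that sees the default content:

```python
import random

def assign_experience(visitor, matches_rule, rng=random.random):
    """Illustrative hold-back assignment for a personalization test.

    Visitors who match the personalization rule are split 50/50:
    half see the personalized variant, half are 'held back' as a
    control group and see the default content. Visitors who don't
    match the rule always see the default."""
    if matches_rule(visitor) and rng() < 0.5:
        return "personalized"
    return "default"

# Example rule (hypothetical): match returning visitors
is_returning = lambda v: v.get("visits", 0) > 1

random.seed(42)
groups = [assign_experience({"visits": 3}, is_returning) for _ in range(10000)]
share = groups.count("personalized") / len(groups)
print(round(share, 2))  # close to 0.5
```

Comparing goal conversions between the two halves is what lets you attribute any lift to the personalization itself.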
Finally, you have the option to launch multivariate tests. Imagine you have a hero component with 3 different variants and a call to action with 3 different variants. In a multivariate test, the system cycles through all possible combinations of those two components: here, 3 × 3 = 9 possible experiences, and the count multiplies with every component you add. Multivariate tests can be powerful, but compared to a single-variable test they require a much larger volume of traffic to reach statistical significance.
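The arithmetic behind that traffic requirement can be sketched in a few lines. The per-combination visitor figure below is purely illustrative (it is not a Sitecore default; real sample sizes depend on baseline conversion rate and the effect you want to detect):

```python
from math import prod

def total_combinations(variant_counts):
    """Number of distinct experiences in a multivariate test:
    the product of the variant counts of each tested component."""
    return prod(variant_counts)

def visitors_needed(variant_counts, per_combination=1000):
    """Rough total traffic requirement, assuming each combination
    needs `per_combination` visitors to reach significance
    (illustrative figure only)."""
    return total_combinations(variant_counts) * per_combination

# A hero with 3 variants and a call to action with 3 variants:
print(total_combinations([3, 3]))    # 9 experiences
print(visitors_needed([3, 3]))       # 9000 visitors

# Adding a third component with 3 variants triples the requirement:
print(total_combinations([3, 3, 3])) # 27 experiences
```

This is why starting with single-variable tests makes sense on all but the highest-traffic pages.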
You can download a copy of the SBOS test hypothesis template here.
Above is an example of a hypothesis generated during the SBOS engagement for Sitecore.com. At a high level, Sitecore marketers are testing the effect of tweaking the subtitle in the Path to Personalization page hero. By default, the hero has a short single sentence. The test variant features a subtitle that lists reasons to download the whitepaper. This one-page hypothesis slide lists the test type — a simple A/B component test. Marketers included the page URL and a basic plain-English hypothesis. Note that the objective focuses on a specific goal on the engagement value scale: “Download General Asset.” The marketing team also listed a start and end date for the test.
Best practices for tests
Involve a broad team during personalization and optimization planning sessions. This ensures organization-wide buy-in from all the team members involved in planning. Start with a couple of small tests on high-impact pages, and don't start your first test on the homepage.
Consider launching initial tests on landing pages as you develop your process. Always begin with a hypothesis. It’s important to identify a single goal on the site and then report on the effect of the test on that specific goal. This keeps the marketing team honest while reviewing results. If you commit to reporting on the conversion rate of the goal you set out to optimize, then you aren’t at risk of “cherry picking” results.
Finally, schedule a recurring call dedicated to sharing test results. Get into a rhythm of broadcasting results. Whether it's monthly or quarterly, involve a wider team of key stakeholders across your organization and share only the results of recent tests. Share both positive and negative results. The only items discussed in this call should be test results; pull other items into more relevant calls. You'll likely be surprised at the ideas that come from unexpected areas of the organization when you begin sharing test results.
The internal Sitecore team started using Trello to manage planned optimization efforts. A new “Active Test” column was added to our Kanban board. This gives everyone involved a clear view of ongoing and upcoming tests. At the time of publication, Sitecore has several basic component tests planned and will share results later in this blog series.
Jonathan is a Senior Marketing Technologist on Sitecore’s SBOS team. He advises industry-leading brands on optimizing digital experiences with a sharp focus on strategy and configuration. Jonathan is an Atlanta native and passionate about music and emerging technologies.