
Creative Testing [Marin Search]

Introduction

The Creative A/B testing tool automatically analyzes performance to identify creatives that over- or under-perform relative to the ad group average, based on your settings. The success criteria and settings for creative A/B tests (which compare the performance of different ad copies) and landing page A/B tests are set at the level of the Marin account.

Note: Creative testing settings do not currently exist in MarinOne, so your tests should still be set up in Marin Search. However, you can still view the results of your creative tests in the grids in MarinOne by following the steps listed below.

What Is Creative Testing?

Creative testing is a built-in Platform feature that gives you the confidence and control you need to make smarter decisions about your creatives. It allows you to easily compare the performance of one creative to another, giving you valuable real-world data you can use to improve your campaign strategy.
You can use the A/B testing tool to choose from a number of criteria by which to test your creatives. For example, you can compare them by conversions & impressions, clicks & impressions, Return On Investment (ROI), and many more. Once you've chosen your settings, you'll be able to use the A/B Testing filter in the main Creatives grid to check the status of your creatives at any time. You can filter for Winner, Loser, or a Draw. You can also use the View Builder to bring the A/B Test column into the grid and see per-creative results at a glance.

Note that to get the best results from creative testing, we recommend that you have your ads set to Do Not Optimize: Rotate ads indefinitely in your group-level settings. To learn more about ad rotation settings, check out our article Ad Rotation Options with Google.

How The Platform Calculates Winners, Losers, Or A Draw

When determining Winners and Losers within an ad group, the Platform will check the selected criteria (e.g. Conversions / Impressions) for each creative, then compare the result against the average for the other creatives in the group. If there is a statistically significant difference between a creative and the group average, that creative will be marked as a Winner or Loser. If there's no statistically significant difference, it'll be marked as a Draw.

Note: A draw can also mean that there's not yet enough data for the Platform to declare a winner; for example, if the thresholds or confidence level haven't been met.
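A simplified sketch of this Winner/Loser/Draw decision might look like the following. This is our assumption of the logic described above, not Marin's actual code:

```python
# Assumed sketch of the Winner/Loser/Draw decision: a creative's metric is
# compared with the average of the *other* creatives in its group, and the
# result is a Draw whenever thresholds or significance haven't been met.

def classify(creative_rate, peer_rates, significant, thresholds_met):
    if not thresholds_met or not significant:
        return "Draw"
    peer_avg = sum(peer_rates) / len(peer_rates)
    return "Winner" if creative_rate > peer_avg else "Loser"

print(classify(0.05, [0.02, 0.03], significant=True, thresholds_met=True))   # Winner
print(classify(0.01, [0.02, 0.03], significant=True, thresholds_met=True))   # Loser
print(classify(0.05, [0.02, 0.03], significant=False, thresholds_met=True))  # Draw
```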

A Note About Confidence Level 

When setting up your creative testing, you'll notice the Confidence Level section of the settings page.

It works like this: when comparing a creative against the average of all other creatives in the group, the Platform uses your chosen Confidence Level to determine whether there is a significant difference. The Platform uses Student's t-test, a standard statistical significance test, to compare two values (for example, the conversion rates of two creatives). If you're not sure about this setting, we recommend leaving it at the default value of 90%.
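The article names Student's t-test, but the Platform's exact implementation isn't public. For comparing conversion rates, a closely related two-proportion z-test can be sketched with the standard library alone; at a 90% two-sided confidence level the critical value is about 1.645:

```python
import math

# Hedged sketch: a two-proportion z-test for conversion rates (a close cousin
# of the t-test the article mentions; Marin's actual algorithm may differ).
# |z| > 1.645 corresponds to roughly 90% two-sided confidence.

def significant_at_90(conv_a, impr_a, conv_b, impr_b):
    p_a, p_b = conv_a / impr_a, conv_b / impr_b
    pooled = (conv_a + conv_b) / (impr_a + impr_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / impr_a + 1 / impr_b))
    z = (p_a - p_b) / se
    return abs(z) > 1.645, z

# 6% vs 3% conversion rate on 1,000 impressions each: clearly significant.
sig, z = significant_at_90(60, 1000, 30, 1000)
print(sig)  # True
```

Raising the confidence level makes the test stricter: fewer creatives will be called Winners or Losers, but those calls are more reliable.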

Best Practices

Before we dive into the details of setting up your Creative Test, let's take a look at a few best practices.
  • If you're testing across multiple campaigns or groups, tag your creatives with dimensions in order to determine aggregate best performers. 
     
  • When you have a winner, pause your losing creatives and continue the testing process. 
     
  • Your publisher group-level settings should be set to Do Not Optimize: Rotate ads indefinitely.

 

Note: Any differences between ads will cause Marin to see these ads as distinct for testing purposes. This includes capitalization in headlines. 

How To

There are two parts to using creative testing with the Platform: A/B testing setup and analyzing the results. We'll go through each of these below.

Setting Up Your Creative Testing   

The first thing to do in order to use creative testing with your campaigns is to set up the testing criteria. To do that, just follow these simple steps:
 

  1. Click Admin in the upper-right.
     
  2. Head to the Optimization tab.
     
  3. Click on the A/B Testing link in the left-hand menu. 
     
  4. Once you're on the relevant settings page, you can use the A/B Test Metric drop-down menu to choose the criteria by which you'd like to judge your creatives. There are plenty of options available, and the one you choose will depend on your unique business goals. Feel free to discuss this choice with your Platform representative for the best results.
     
  5. Next, click into the Advanced Settings section and make any changes you'd like. You can select the number of days' worth of data to use for the test, as well as alter the thresholds for impressions, clicks, and conversions. Adjusting these settings can help you tailor the testing to your specific needs.
     
  6. Click Save and you're all set to start gathering data for your A/B creative testing!
     

Analyzing The Results Of Your Creative Testing 

Once your creative test has been running for your chosen period of time, it's time to check the results. You can analyze your results in a variety of different ways in Marin Search or MarinOne.

As a starting point, sort your active groups by highest spend for the month. Bring in a Bubble chart and identify the high-spend groups with the lowest conversion rate (or revenue return) and CTR. These are prime starting points for A/B testing.

Use the tool to identify the best-performing ad copy for these groups. Pause the Losers. Clone your Winners and create message variations of them.

Additionally, utilize dimensions by tagging new creatives with time stamps to easily track progress. If you’re testing the same copy across multiple groups or campaigns, give them one tag within a dimension to determine aggregated best performers.
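The dimension-tagging advice above can be sketched as a simple aggregation: if the same ad copy carries one dimension tag across several groups, summing conversions and impressions per tag reveals the overall best performer. The tag names and numbers below are made up for illustration:

```python
from collections import defaultdict

# Hypothetical sketch of aggregating creative results by dimension tag
# across groups (tags and figures are invented for illustration).

rows = [  # (dimension_tag, conversions, impressions) per group
    ("headline_v2", 5, 2000), ("headline_v2", 7, 3000),
    ("headline_v1", 3, 2500), ("headline_v1", 2, 2500),
]

totals = defaultdict(lambda: [0, 0])
for tag, conv, impr in rows:
    totals[tag][0] += conv
    totals[tag][1] += impr

# Highest aggregate conversion rate wins.
best = max(totals, key=lambda t: totals[t][0] / totals[t][1])
print(best)  # headline_v2
```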

Note: At this time, Marin only offers creative testing at the creative level. Testing cannot be performed at the group or campaign level. These settings may, however, be available on the channels themselves (on Google or Microsoft's ad site), and we do offer Dimensions to aggregate your creative-level data in any group you'd like.

Marin Search   

  1. From the main Creatives tab, look at the Filters section on the left of the screen.
     
  2. From the A/B Test section, you can choose to show All of your test results, or filter specifically for Winners or Losers. These values are decided based on the criteria you set in the steps above.

You can also use the View Builder to see the results of your creative testing live in the grid. Just follow these steps:
 

  1. From the main Creatives tab, click into the View Builder.
     
  2. From the Advanced section of the View Builder, click the relevant checkbox for the A/B Test column.
     
  3. Finally, Save your new view and the A/B Test column will appear in the grid with the relevant results shown.

MarinOne 

As noted above, creative testing settings do not currently exist in MarinOne, so your tests should still be set up in Marin Search. You can, however, view the results of your creative tests in the MarinOne grids by following the steps below.

  1. First, navigate to the Ads tab. 
     
  2. From there, click on the Column Selector in the top-right above the grid. 
     
  3. From the Column Selector, locate the A/B Test column and check the corresponding box to add it to the grid. 
     

     
  4. From the A/B Test column, use the filters to view Winner, Loser, or Draw.
     


Best Practices  

  • If you're testing across multiple campaigns or groups, tag your creatives with dimensions in order to determine aggregate best performers. 
     
  • When you have a winner, pause your losing creatives and continue the testing process, always pitting your winning creatives up against new creatives. 
     
  • Your publisher group-level ad serving settings should be set to Do Not Optimize: Rotate ads indefinitely. To access your creative rotation settings, click into the group of your choosing, then click on the Settings sub-tab and scroll down to the Ad Serving setting. From there you can update your Ad Serving settings and click Save.

     

  • In your A/B Testing settings, we recommend using Profit / Impressions for clients with revenue data and Conversions / Impressions for those without. Of course, you should choose the metric that best suits your account needs, but these serve as a good starting point. The recommended Confidence Level setting is between 85% and 95% to determine a clear winner. Advanced settings are available for click, impression, and time threshold minimums.

 
