
Google Campaign Experiments

Introduction

Google's Campaign Experiments are a form of A/B testing designed specifically for campaigns. They allow you to test changes to your account on a controlled portion of a campaign's traffic. Campaign Experiments are a great way to test the waters, so to speak, giving you insightful information about whether or not you really want to make a certain change. At present, Campaign Experiments allow you to test changes to keywords, bids, ad groups, and placements.

Campaign Experiments can help you make smarter decisions and give you the info you need to increase your return on investment.


 

  Important Note

Using Campaign Experiments creates duplicate tracking, which can result in revenue and conversion data attribution issues. Please follow the instructions in the 'Tracking Values And Third-Party Tracking' section of this article to ensure correct attribution. 

How Campaign Experiments work


When you create an experiment, you first decide which specific changes you'd like to test. For example, you might want to find out whether the word 'footwear' performs better than the word 'shoes' in your ads. There is a wide variety of changes you could test, including adding new keywords, raising a bid, trying new ads, or using different placements. You then decide what percentage of your auctions should include this experimental change.

The overall process for Campaign Experiments is as follows:
 

  1. You decide what you'd like to test and create a Campaign Experiment. You can also set experimental parameters such as goals, start and end times, and so on. Importantly, you also set the traffic split percentage for the experiment (more on this later). 
  2. Google creates a duplicate version of your campaign behind the scenes. This duplicate will be identical other than the changes you're testing. 
  3. Once your Campaign Experiment is running, when users search on Google they'll be served either version A or version B of your campaign variables. The traffic split percentage we set earlier will determine how frequently each variation of the campaign will appear to users. 
  4. After your experiment has been running for a while, you'll be able to view its performance in the same grids that you use to view performance for your campaigns and ads. 
  5. Once the experiment is over, you'll be able to decide whether to a) do nothing and discard the experiment, b) replace the old campaign with the new changed campaign, or c) keep the experimental campaign as a separate campaign from the original.
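To make the traffic-split mechanics in step 3 more concrete, here is a minimal, hypothetical Python sketch (not Marin or Google code; the function name and 40% split are assumptions for illustration) of how each auction could be randomly assigned to the experiment or the original campaign:

```python
import random

def assign_variant(traffic_split_pct, rng=random.random):
    """Return 'experiment' for roughly traffic_split_pct% of auctions,
    and 'original' for the rest. Purely illustrative -- the real split
    is handled by Google, not by your own code."""
    return "experiment" if rng() * 100 < traffic_split_pct else "original"

# Simulate 10,000 auctions with a 40% experimental traffic split.
random.seed(42)
counts = {"experiment": 0, "original": 0}
for _ in range(10_000):
    counts[assign_variant(40)] += 1

print(counts)  # roughly 40% experiment / 60% original
```

Over many auctions the observed share converges on the configured split, which is why an experiment needs to run for a while before its results are meaningful.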
     

Now that we know how Campaign Experiments work on Google's end, let's take a look at how the Marin Platform supports this feature. 

Platform support for Campaign Experiments


The Platform currently offers support for Campaign Experiments in the following ways:
 

  • You are able to sync your experimental campaigns, report on them, and even make changes to either the original or experimental campaigns. The Platform will not sync campaign drafts before they've been 'promoted' to experiments; only the actual experimental campaigns are synced.
     
  • The Platform cannot currently determine whether a campaign is part of an experiment, nor can it set up new Campaign Experiments. We plan to support this in a future release. This means you should take care, when running Campaign Experiments, not to make changes to the original campaign without making exactly the same changes to the experiment campaign. If you do accidentally change one campaign without changing the other, be aware that this may alter the outcome of your experiment.
     
  • The above point is especially relevant if you're currently using Platform Bidding. If a campaign is on Platform Bidding and you decide to run an experiment on it, the experiment campaign will not use Platform Bidding automatically. Additionally, even if you were to add the experimental campaign to Platform Bidding, the experiment would lack the prior data required for our algorithm to perform the same changes on the experimental campaign that it performs on the original campaign.

Tracking values and third-party tracking


It's important to be aware that starting a Campaign Experiment in AdWords will produce a temporary duplicate campaign. Consequently, the campaign's keywords, ads, product groups, custom parameters, and Marin tracking values will all be automatically duplicated. 

Therefore, in order to track Campaign Drafts and Experiments, you will need to make sure you are using the {adgroupid} ValueTrack parameter (the publisher Group ID). The publisher Group ID is required for the Marin revenue load process to determine which draft and experiment groups drove your website conversions and/or revenue; the publisher Creative ID and Marin Keyword ID values are no longer unique enough in this scenario.

Adding Group ValueTrack Parameter to your URLs

You will need to add the Group ID to your tracking URLs. The exact structure may vary depending on which type of revenue integration you use. For Marin Tracker, the value must be passed in a parameter called pgrid.

Example URL: http://www.test.com?mkwid={ifsearch:s}{ifcontent:c}{_mkwid}&pcrid={creative}&pgrid={adgroupid}&ptaid={targetid}

 

Example Populated URL: http://www.test.com?mkwid=3A4az4u8&pcrid=1231223&pgrid=644227&ptaid=kwd-123:aud-456

Once this parameter is added, Marin will attribute conversions and revenue to a group based on the publisher Group ID before checking the publisher Creative ID and Keyword ID values.
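To illustrate how these values travel on the URL, the following Python snippet parses the populated example URL above and reads pgrid before falling back to pcrid. The attribution-order logic here is a simplified sketch of the behavior described above, not Marin's actual revenue load code:

```python
from urllib.parse import urlparse, parse_qs

# Populated example URL from this article.
url = ("http://www.test.com?mkwid=3A4az4u8&pcrid=1231223"
       "&pgrid=644227&ptaid=kwd-123:aud-456")

# parse_qs maps each query parameter name to a list of its values.
params = parse_qs(urlparse(url).query)

# Simplified attribution order: prefer the publisher Group ID (pgrid),
# then fall back to the publisher Creative ID (pcrid).
group_id = params.get("pgrid", [None])[0]
creative_id = params.get("pcrid", [None])[0]

print("attributed group:", group_id)      # 644227
print("fallback creative:", creative_id)  # 1231223
```

If pgrid is missing from the URL, group_id comes back as None and attribution would have to fall back to the creative- and keyword-level values, which is exactly the ambiguity the pgrid parameter is meant to avoid.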

Important notes


  • At any point during a Campaign Experiment, you can choose to cancel the changes you're testing, or to keep them for good. 

 
