To start, you’ll need to make a copy of this Google Sheet template.
You will also need the script for the tool, which is hosted on GitHub here.
You should also have open the Google Ads account in which you are running your test.
In the GSheet, open the tab titled Test Configurator, then enter the name of your test and select the test type you would like to run. There are three options, each of which requires a different set-up within Google Ads:
Label-based testing – for when you’re using labels to compare different arms of the test over one time period (for example: ads, ad groups or campaigns)
Testing using Drafts & Experiments (D&E) – for when you’re using Google’s built-in Drafts & Experiments feature, over one time period
Pre/post testing – for when you want to compare activity across two different time periods
Once you have selected an option, further fields will populate accordingly. The setup for each is outlined below.
Label tests
After selecting Label as your Test Type, fill out the remaining fields. These are:
Label type, i.e. whether your test is run at ad, ad group or campaign level
The number of variants in your test (this can be any number from 2 to 6)
Once you’ve clicked Add Test, navigate to the Registry: Labels tab, where you will find that your test details have auto-populated. At this point you can also amend the end date of the test, which by default is set to the current date; leaving this default in place means test measurement will run indefinitely.
In Column E, Variant Name, you'll be able to edit the automatically generated names to something that is easy for you to identify (e.g. Price-based ad variant).
Column F, the Label column, contains auto-generated labels for you to add to your variants within Google Ads. Details on how to add labels within your account can be found in this support article.
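If you have a lot of ads, ad groups or campaigns to label, you can also apply the labels with a small one-off script rather than through the UI. The sketch below is illustrative only - the label value and ad group name are hypothetical, and you should use the exact label text generated in Column F, as the measurement later matches on label value.

```javascript
// Illustrative sketch: apply one auto-generated label (Column F) to every ad
// in a hypothetical ad group that makes up one arm of the test.
function main() {
  var labelName = 'ExpStudio_variant_1'; // replace with the exact value from Column F
  // Create the label in the account if it does not already exist.
  if (!AdsApp.labels().withCondition("label.name = '" + labelName + "'").get().hasNext()) {
    AdsApp.createLabel(labelName);
  }
  // Apply the label to every ad in the ad group belonging to this variant.
  var adGroups = AdsApp.adGroups()
      .withCondition("ad_group.name = 'Price messaging - variant 1'") // hypothetical ad group
      .get();
  while (adGroups.hasNext()) {
    var ads = adGroups.next().ads().get();
    while (ads.hasNext()) {
      ads.next().applyLabel(labelName);
    }
  }
}
```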
One of the most common use cases we have internally for this type of test is for ad copy.
We often set up tests where we have three types of ad copy running across multiple ad groups and campaigns, and use this tool to evaluate when one is performing significantly differently from the others.
We generally recommend looking to identify the worst-performing variant (not the best), and replacing this with a new variant. By doing this, you ensure that your campaigns are gaining continuous marginal improvements over time.
More information on setting up and running ad copy testing can be found in this article.
Experiment (D&E) tests
After selecting your Test Type, the only additional input to provide here is the start date of your D&E test.
Once this has been selected, click Add Test, then navigate to the Registry - Experiments tab.
Take the value automatically generated in column C, Experiment Suffix, and append it to the end of your Experiment campaign name in Google Ads.
It’s important to note some crucial points here:
The set-up will not work if the Experiment Suffix is added anywhere other than at the very end of the campaign name
Only edit the Experiment campaign name - the name of the original campaign should not be updated
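As a purely illustrative example (the campaign name and suffix below are made up - use the value generated in column C), the rename touches only the Experiment campaign, and only the very end of its name:

```javascript
// Hypothetical names, for illustration of where the Experiment Suffix goes.
var originalCampaign = 'Search - Brand - UK';    // leave this campaign's name untouched
var experimentSuffix = ' #exp_bid_test';         // value from column C, Experiment Suffix
var experimentCampaign = originalCampaign + experimentSuffix;
// => rename the Experiment campaign in Google Ads to 'Search - Brand - UK #exp_bid_test'
```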
Out of all the test types available to evaluate using the Experimentation Studio, D&Es are arguably the fairest in terms of set-up, as they allow randomisation at the cookie or impression level. We quite often use D&E tests when trialling new bidding strategies.
Pre/post tests
After selecting Pre Post as your Test Type in the Test Configurator tab, you may again select your testing entity in the Label Type row.
You can then select the date ranges for each of your testing periods.
You may choose different lengths of time for each period, but it is important to note that this will negatively impact the fairness of your test.
You should also take into account the different factors at play during the time ranges you have chosen, such as seasonality, bid or budget changes, etc.
Apply the values shown in column E, Labels, to each variant as outlined in the Label tests section above.
Because of the various factors at play here, pre/post testing is generally less favourable than label-based or D&E testing, and is traditionally used when it’s not possible to test your variants concurrently - for example, if you’re changing a feature of a product where it is not possible to create a traffic split.
Using the script
Once test setup is complete and you have added your labels (for Label and/or Pre/Post tests) or added the Experiment Suffix to your D&E campaign name, you’re ready to put the scripts into your Google Ads account.
The script can be found here, on GitHub.
Sign in to your Google Ads account and click on the Tools icon at the top of the screen. In the drop-down that opens, select Scripts under Bulk Actions.
Click on the blue “+” icon to create a new script.
Be sure to first delete anything that is already in the script box in Google Ads, then copy and paste the full script from GitHub into it.
You may choose to edit the script name at the top (highlighted in red below). This will allow you to easily differentiate between the different scripts you enter.
The script is broken down into sections, with the orange text providing an explanation of what each section does.
The section that begins with “Edit Me” (highlighted in yellow above) indicates that you will need to supply information there in order for the script to work.
You are required to supply your GSheet ID, to allow the script to collect information on the test configurations you set up earlier.
This ID can be found in the URL of your GSheet. It is the value between "/d/" and "/edit".
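For example, the relevant part of the “Edit Me” section might look something like this - the variable name and ID below are purely illustrative, so edit whichever field the script on GitHub actually exposes:

```javascript
// Hypothetical example. For a GSheet at:
// https://docs.google.com/spreadsheets/d/1AbCdEfGhIjKlMnOpQrStUvWxYz1234567890/edit#gid=0
// the GSheet ID is the value between "/d/" and "/edit":
var SPREADSHEET_ID = '1AbCdEfGhIjKlMnOpQrStUvWxYz1234567890';
```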
Set the script to run and authorise it when prompted, to allow it to access the account. Note that if you authorise before you run the script, you may have to repeat this when you press run. You can set the script to run at any desired frequency - our recommendation is once daily (usually first thing in the morning).
Following the above steps, a query will be created to pull raw data from Google Ads, summed by either label value or campaign name suffix.
This will then be formatted for export and visible in your GSheet, in a unique data import sheet created for each test run. The tab will be titled Data import: [test name].
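The exact query and formatting live in the script on GitHub, but conceptually the pull and export resemble the sketch below. The GAQL fields, date range, tab name and the exportTestData function here are illustrative assumptions, not the script's actual implementation:

```javascript
// Illustrative sketch of pulling raw campaign data and writing it to the GSheet.
var SPREADSHEET_ID = 'your-gsheet-id'; // as configured in the "Edit Me" section

function exportTestData() {
  var query =
      'SELECT campaign.name, segments.date, metrics.impressions, metrics.clicks, ' +
      'metrics.cost_micros, metrics.conversions ' +
      'FROM campaign ' +
      'WHERE segments.date DURING LAST_30_DAYS';
  var report = AdsApp.report(query);

  // Write the rows to a per-test tab in the configured GSheet (tab name is illustrative).
  var spreadsheet = SpreadsheetApp.openById(SPREADSHEET_ID);
  var tabName = 'Data import: My test';
  var sheet = spreadsheet.getSheetByName(tabName) || spreadsheet.insertSheet(tabName);
  report.exportToSheet(sheet);
}
```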