All posts by Zorin Radovančević

Senior Web Analytics implementation and planning specialist - all things related to Google Analytics, Google Tag Manager and Yandex Metrica.

Solving Attribution with GA Measurement Protocol Payload Widening


For starters, let’s have Google explain what the Measurement Protocol actually is and when it makes sense to use it – there is a good video explanation by Justin Cutroni.

As the documentation states, the Measurement Protocol is used for these purposes:

  • Measure user activity in new environments.
  • Tie online to offline behavior.
  • Send data from both the client and server.

You can even find some basic hit payload examples in the dev docs here. These hits will be valid, yet if sent as is, it is questionable how valuable the data will be once it lands in reports. My suggestion would be to widen. Always. Make the hit valid and valuable at the same time.

The issues with MP can best be explained by using a simple example such as:

Company A sells product B in multiple countries. The process contains multiple steps. In step 1 a user visits the site, fills in and successfully submits a lead form. After validation and an internal risk check we enter step 2, where the initial transaction is processed; the delay between steps 1 and 2 can be up to 14 days. In step 3 a user can have a recurring payment contract which automates the billing process every 30 days – there is no need for the end user to visit the site to confirm.

Scenario – sending a valid hit with just the bare essentials in steps 2 and 3 results in the following:

Device, geo and traffic attribution will be, or can easily be, skewed. Sending a minimum valid hit sends information which will be generically attributed: to the IP of the server sending the hit (messes up geo attribution), to the device sending the hit (messes up device attribution), and to direct traffic by default. The last one is in some cases fine, as GA uses last non-direct traffic source attribution – if any campaign exists for the specific Client Id prior to the MP hit, within the campaign timeout window (6 months by default), the direct traffic will simply be overridden.

One additional thing may happen if the scenario follows this path:

  1. Step 1: form submit via google / cpc
  2. Between steps 1 and 2 an additional session / visit happens via google / organic
  3. Step 2 sends the transaction to GA, yet the transaction will be attributed to google / organic – so the question is which of the sources you want the transaction tied to: cpc or organic?
  4. Subsequently, the recurring transaction attribution for step 3 will always subscribe to the latest known campaign and not the initial source of the lead or transaction

In order to prevent this from happening there are some additional parameters which you should:

  1. store (in your internal system for processes such as lead form submits) and
  2. send with the measurement protocol hit

Parameters for widening the MP hit (solving attribution)

dl – document location
Circumvent any hostname filtering (dp and dh parameters as options to this)

uip – user IP
Preserve location

ua – User agent
Preserve info on user agent (optional)

ds – Data source
Segment on data source. (cdN (custom dimension) as option)

cn/cm/cs/gclid/dclid/dr – traffic – campaigns from UTM / Adwords or DC / document referrer / (none if no info exists or create a virtual direct?)
Preserve traffic – at least partially. The goal is to preserve attribution for the traffic source which generated the sale – if no info is provided, some newsletter-driven session happening in between may misattribute the results.

Sending the Measurement protocol hit

Prior to pushing your work to production you should definitely test the setup and outcomes using a dummy Google Analytics property. In most cases you will send the hits as events, non-interaction ones, so make sure the ni parameter is set to 1. This may challenge the way you read the standard reports: if you send a non-interaction hit, something unexpected happens – no session is created, but a user is reported for the period in question. The attribution stays intact though, and MCF reports will attribute and report as expected.

I would really be interested to know which other parameters you find essential, or to hear about your MP cases where a merely valid hit is just not enough – happy tracking!

Logging Oversized Enhanced ecommerce Google Analytics hits in Google Sheets


If you ever worked on an Enhanced Ecommerce implementation project, there is a chance you encountered the ‘Payload size is too large (15226). Max allowed is 8192.’ message, which actually means nothing will be stored in reports. There are some workarounds for this particular issue.

In this post I’ll only focus on explaining how to determine the EE actions and hits for which the payload size may be considered an issue.

Step 1 – Google Sheets setup

The Sheets procedure and the entire script which reports on the questionable hits can be found here – (by Martin Hawksey). Follow the entire procedure and do not forget to publish and update the script once you have made the necessary changes!

A quick note on the Google Sheet itself. This one needs to be filled with header row so that we can map our payload to Google Sheets as in this example:


Just the header (1st) row. The column entries are case sensitive, so whatever you define in the payload sent to Google Sheets needs to match exactly! For my example I used these column labels (you may notice a ‘slight’ resemblance to Measurement Protocol parameters):

  • timestamp
  • payLoadLength
  • tid (tracking id)
  • cid (client id)
  • uid (user id)
  • t (type of hit)
  • pa (product action)
  • ni (non interaction)
  • dl (document location)
  • dp (document path)
  • dt (document title)
  • ec (event category)
  • ea (event action)
  • el (event label)
  • ti (transaction id)
  • tr (transaction revenue)

Make sure that the cid and ti columns are formatted as Plain Text otherwise some unwanted auto formatting may occur. Additional parameters may be added as needed.


Step 2 – Google Tag Manager Setup (Mimic the Google Analytics payload)

When proceeding with this it would be prudent to create an additional GA tag in GTM pointing to a dummy GA property, just to be sure nothing goes wrong with the data. Basically, add a new Fields to Set entry to each GA tag which passes on the EE payload, as in the following example:


So I will use sendHitTask as a means to read the entire payload and send it to Google Sheets once the payload exceeds a certain length. The sendHitTask field will use a Custom JavaScript variable (code below – based on this post) which returns a function designed (apologies for the rough code) to push the default payload to GA and then send the custom payload on to Google Sheets:

function sendHitTask() {
  return function(model) {
    var payLoad = model.get('hitPayload');
    var trackingBaseUrls = ['', '']; // fill in your GA endpoint and the Google Sheets web app URL

    for (var i = 0; i < trackingBaseUrls.length; i++) {
      var baseUrl = trackingBaseUrls[i];
      if (baseUrl.indexOf('collect') > -1) {
        // Forward the original payload to Google Analytics
        var req = new XMLHttpRequest();'POST', baseUrl, true);
        req.send(payLoad);
      } else if (payLoad.length > 7000) {
        var payLoadExtract = payLoad.split('&');
        var payLoadArray = {};
        // Push values to an object for later access
        for (var j = 0; j < payLoadExtract.length; j++) {
          var splitArray = payLoadExtract[j].split('=');
          payLoadArray[splitArray[0].trim()] = splitArray[1].trim();
        }
        // Specify values to be sent to Google Sheets from the object
        var tid = 'tid=' + payLoadArray.tid,
            cid = '&cid=' + payLoadArray.cid,
            uid = '&uid=' + payLoadArray.uid,
            t = '&t=' + payLoadArray.t,
            pa = '&pa=' +,
            ni = '&ni=' +,
            dl = '&dl=' + payLoadArray.dl,
            dp = '&dp=' + payLoadArray.dp,
            dt = '&dt=' + payLoadArray.dt,
            ec = '&ec=' +,
            ea = '&ea=' + payLoadArray.ea,
            el = '&el=' + payLoadArray.el,
            ti = '&ti=' + payLoadArray.ti,
            tr = '&tr=' +,
            timestamp = '&timestamp=' + {{UTIL - Date}}, // GTM variable returning the current date
            payLoadLength = '&payLoadLength=' + payLoad.length;
        var collectPayLoad = tid + cid + uid + t + pa + ni + dl + dp + dt + ec + ea + el + ti + tr + timestamp + payLoadLength;
        // Send values to Google Sheets via an image request
        var collectUrl = baseUrl + '?' + collectPayLoad;
        var myImage = new Image();
        myImage.src = collectUrl;
      }
    }
  };
}

And this is basically it. Whenever this tag is triggered it will push data to GA, and if the payload exceeds the defined length it will also log it in the Google Sheet, which will hopefully help you in debugging the implementation.

DataStudio: how to report Google Analytics MCF and Standard Ecommerce data from Google Sheets


How to use DataStudio in order to report on Average Days to Purchase or Average Touchpoints to purchase a specific product or category of products based on data collected in Google Analytics? Easy! For the entire setup you only need to have / do:

  1. Have access to a Google Analytics property with Ecommerce implemented
  2. Have access to DataStudio
  3. Read one of the previous articles on how to join MCF report data and Standard report data in Google Sheets

Google Sheets as Data Source

Google Sheets GA add-on installed and queries prepared as in the example:


Reporting Sheet where you do all the join operations:


Once this is set proceed to adding a Data Source inside Data Studio (Google Sheets as Data source):


Select the appropriate file and Sheet (the reporting one) and proceed in adapting the source by formatting the fields as in example:


Note that we only had to set Aggregation to Average and create an additional calculated field metric which averages the transaction value over the count of transaction Ids – formula: Total Conversion Value (MCF) / COUNT(Transaction Id (MCF)). Proceed to add any desired visualization to the report page – in this example I used a Scatter Chart and a simple Scorecard. The Scatter Chart uses a simple setup as follows:


The final result was a simple one page dashboard:


The point of the exercise should be to easily push data to DataStudio (from any available source – even ‘offline’) in order to quite simply explore the data and relations of 2 or more variables – who knows something interesting may pop up:)



DataStudio: how to create a Google Analytics Custom Channel Grouping

Google DataStudio is an excellent addition to the Google Suite tool arsenal, covering the reporting / exploration part. It is a much more complex system than what we are used to with Google Analytics dashboards, yet still really easy to use! The product is still in beta but looks very promising.

For more information about DataStudio here are some useful links to get you up and running:

  1. Product overview –
  2. Help center –
  3. Creating a report (case) –

Back to the issue at hand:

How to create a Google Analytics Custom channel grouping inside DataStudio

Most avid Google Analytics users are really keen on redefining the default channel grouping, or adding a custom channel grouping, to get a better understanding of how ‘channels’ perform at a high / low level of detail. Unfortunately the GA Reporting API / Core API still does not expose such a dimension, so in our reporting and analysis efforts we have to resort to recreating the conditions with filtering. The same process has to be applied when reporting in DataStudio. So, whenever you widen the Default channel grouping by creating a Custom Channel Grouping inside GA and want this to be accessible via DataStudio, you need to do this:

  1. Add a data source – Google Analytics – procedure described in links above
  2. Add a custom field to the data source – which I will describe next

To begin with here is the official support for calculated fields –

Step 1
Create the field
04 datastudio create a calculated field


Step 2
The formula

07 datastudio building blocks


This is the tricky part. You need to mimic the order and definitions as you would in the GA Admin interface, using prebuilt DataStudio functions. The basis will be the CASE function, which uses this simple syntax:


This will translate to:

WHEN our source or campaign matches ‘some string’ THEN the dimension output will be ‘desired channel grouping dimension value’
ELSE if we failed to group some traffic inside a group it will output ‘You missed this’

The best way to proceed is to open Notepad or a similar text editor and write the formula there: the DataStudio formula entry has an autofill feature and only allows writing in a single line, so it eventually becomes extremely hard to track what is defined or what needs correcting.

‘Referral’ channel group
WHEN Default Channel Grouping = "Referral" THEN "Referral"

‘Paid listings’ channel group
WHEN REGEXP_MATCH(Medium, ".*(listing|xyz).*") AND REGEXP_MATCH(Source, ".*(tripadvisor|abc).*") THEN "Paid Listings"
In this case we are using the REGEXP_MATCH function, where we evaluate Medium AND Source for the phrases we use when creating the Custom Channel Grouping in Google Analytics. We can also use other logical operators to create more complex conditions. Please note that the regex we are used to in Google Analytics does not behave the same in DataStudio (after some tinkering the guys from confirmed it – thanks!) – more on DS regex –

Your end result should have the following structure:

WHEN Default Channel Grouping = "Referral" THEN "Referral"
WHEN Default Channel Grouping = "Direct" THEN "Direct"
WHEN Medium = "cpc" AND REGEXP_MATCH(Campaign, ".*(brand|bcmp).*") THEN "Paid Brand Campaign"
WHEN REGEXP_MATCH(Medium, ".*(listing|xyz).*") AND REGEXP_MATCH(Source, ".*(tripadvisor|abc).*") THEN "Paid Listings"
… add more definitions as you see fit
ELSE "Other"
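For reference, Data Studio’s CASE syntax wraps those definitions in CASE … END, so the finished formula (reusing this example’s hypothetical patterns) has this overall shape before you collapse it into the single-line formula field:

```
CASE
  WHEN Default Channel Grouping = "Referral" THEN "Referral"
  WHEN Default Channel Grouping = "Direct" THEN "Direct"
  WHEN Medium = "cpc" AND REGEXP_MATCH(Campaign, ".*(brand|bcmp).*") THEN "Paid Brand Campaign"
  WHEN REGEXP_MATCH(Medium, ".*(listing|xyz).*") AND REGEXP_MATCH(Source, ".*(tripadvisor|abc).*") THEN "Paid Listings"
  ELSE "Other"
END
```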

Copy your entire code and paste it inside the formula field:

05 datastudio fill in the formula

Step 3
Update the data source

07 update data source



Step 4
Change the Dimension in your widget from Default Channel grouping to your Custom Channel grouping one.

Click on the current dimensions.
09 choose the calculated field

Choose the newly created one.

10 My custom channel grouping

A small hint – add a secondary dimension – Source / Medium – just to check if all the traffic is indeed in the predefined and expected channel group.

11 make sure it is applied

Step 5
Refresh the data

12 refresh the data


Final notes:

  • Any change you do inside GA should be redone inside the DataStudio custom field (at least until GA surfaces some kind of custom channel grouping dimension via the API)
  • The same approach can be used for any kind of filtering / segmentation and once you get used to the syntax it is pretty useful and offers nice customization options


Connecting products and categories with Google Analytics MCF API Intro


How do you track a specific product purchased – by name, id, category or similar – and tie the sales to the Multi Channel Funnels (MCF) API data? As an example:

  • Black T-shirt was bought with this purchase path Organic Search > Paid Search > Social Network inside 5 days with total conversion value of $5 (Like this will ever happen:))

Some additional insight may come up:

  • Track how much time, on average, is needed for a purchase of specific product or a group of products (currently limited to max default lookback window of 30 days – shame – feature request to Google – increase to 90 days or give the ability to set it manually for the MCF API – please).
  • Track if expensive products need a longer conversion path in terms of days and touch points.
  • Track channels which create awareness for a particular group of products and find ideas for scaling sales.

You will hopefully find some delicious insights while you explore this pretty useful connection between CORE and MCF API – just by browsing the dimensions and metrics available from the APIs you’ll pile up some exploration ideas:

  •  CORE API dimensions and metrics – – please note this page includes the Reporting API v4 dimensions and metrics as well
  • MCF API dimensions and metrics –

Before getting into details: we will be working with CORE API v3, as it is supported by the Google Analytics add-on for Google Sheets. This is an intro article, so the idea is to enable access to this data without unneeded tech hassle. To follow along you need:


  1. Working ecommerce implementation
  2. At least some historic data
  3. Access to Google Analytics account / property / view
  4. Some basic Excel / Google Sheets experience

Step 1 – Google Sheets setup

Using Google Analytics add on for Google Sheets create the following 2 queries using this procedure …

After you have opened a new Google Sheets document and installed the GA add-on, create a new report:

1 create new report GA sheets

Fill in the details for the GA Account / Property / View access (remember use only views with ecommerce enabled (goals may work as well)):

2 define account property view for pulling data

Add the required metrics and dimensions and build the queries for the MCF and CORE API:

3 mcf and core queries

1st query is an MCF API call which uses this metric:

  • mcf:totalConversionValue

and these dimensions:
  • mcf:transactionId
  • mcf:conversionDate
  • mcf:basicChannelGroupingPath
  • mcf:pathLengthInInteractionsHistogram
  • mcf:timeLagInDaysHistogram

2nd query is for the CORE API (standard reports) where we pull product info, using this metric:

  • ga:itemRevenue

and these dimensions:
  • ga:transactionId
  • ga:productName

Both queries can of course be customised extensively based on the questions you are trying to find answers for.

Note that the max count of results returned from the API is 10,000. If you have more than that you will have to create additional queries, with the Start Index field in the query set to 10001 and so on – there are existing plugins such as Analytics Edge for Excel which can help with that.

If a transaction can contain multiple products, the entire process is a bit different and your main metric becomes item revenue instead of total conversion value. In that case you can add an additional calculation – item revenue / transaction revenue – and use the attribution only if the result is > 50%.
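Sketched as code, that >50% attribution rule could read (an illustrative helper, not something the article’s sheets actually contain):

```javascript
// Hedged sketch of the >50% rule for multi-product transactions: attribute a
// transaction's path to a product only when that product's item revenue makes
// up more than half of the transaction revenue.
function attributeToProduct(itemRevenue, transactionRevenue) {
  return transactionRevenue > 0 && itemRevenue / transactionRevenue > 0.5;
}
```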

Once all is set you can proceed with running the report:

4 run report

The result of running the reports will be two additional Sheets which hold the data for further tasks.

5 reports in sheets

Step 2 – Google Sheets string manipulation (adding fields for more granular analysis and pivot options)

Add a new empty sheet (one of the ways):

6 add new sheet

Copy data from the MCF report (Dummy MCF sheet) to the empty sheet (the new sheet should be named Reporting). If transactions contain multiple products you will have to reverse the process and first copy data from Dummy Standard:

7 referencing fields in Reporting sheet

You basically reference the cell from a different sheet so it will always refresh once you pull a different data set.

Next sub step is adding some more columns for pivoting purposes (just as ideas – you can always come up with your own columns).

The idea is to come up with the First channel responsible for the sale and then the Penultimate channel. You can also do the last channel and so on, yet the penultimate one is something we do not often analyse but which can prove valuable (better said: the channel just before the decision itself).


  • Conversion path contains the following string …
  • Display > Direct > Paid Search > Organic Search
  • We want to extract Display (First channel) in one cell and Paid Search (Penultimate channel) in a different cell.

We need to create an extraction formula which will go through all the purchase paths (regardless of which channels these contain or the number of touch points). Note that the formula will be different based on which dimensions you use (channels vs AdWords campaigns vs source / medium etc.). In your newly created Reporting sheet, after you have referenced the Dummy MCF sheet cells successfully, click on the next empty cell and enter this formula:

8 extract string

The formula for First channel is:
=IFERROR(TRIM(REGEXEXTRACT(C2,"^[a-zA-Z() ]+")),"-")

The formula for Penultimate channel is:
=IFERROR(TRIM(REGEXEXTRACT(TRIM(REGEXEXTRACT(C2,"[a-zA-Z() ]+ > [a-zA-Z() ]+$")),"^[a-zA-Z() ]+")),"-")

Copy / paste the formulas throughout the rows (remember the default max is 10,000 rows). These formulas will not work in Excel though.
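The extraction logic is easier to see in JavaScript (an equivalent sketch for illustration, not something you run in Sheets):

```javascript
// Sketch of the sheet formulas' logic: extract the first and penultimate
// channel from an MCF path string such as
// "Display > Direct > Paid Search > Organic Search".
function firstChannel(path) {
  var parts = path.split(' > ');
  return parts.length ? parts[0].trim() : '-';
}

function penultimateChannel(path) {
  var parts = path.split(' > ');
  // '-' mirrors the IFERROR fallback for single-touch paths
  return parts.length >= 2 ? parts[parts.length - 2].trim() : '-';
}
```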

Step 3 – Tie the Dummy Standard data to the new Reporting tab based on transaction Id

In our Reporting tab, where we did all our string manipulation, we need to fill in one more column – product name in this case (it can be whatever you queried from the CORE API – just remember you always need the transaction Id, as it serves as the primary key).

Use the free column on the right, name it something like Product Name, and in the first data row enter the formula which will tie in the product name based on transaction Id:

9 vlookup

Formula used to tie product to channel grouping:
=IFERROR(VLOOKUP(A2,'Dummy Standard'!$A$16:C$10015,2,FALSE),"No Product Info")

The formula:

  • takes the value from the current sheet set in 1st column starting at A2 and
  • finds the same transaction Id value in Dummy Standard sheet
  • in the cell range from A16:C10015 and
  • returns value from the 2nd column from that sheet to our Reporting sheet.
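The steps above amount to a keyed lookup, which can be sketched in JavaScript with hypothetical sample rows:

```javascript
// Sketch of the VLOOKUP logic: join product names onto the reporting rows
// using transaction Id as the primary key. Rows are [transactionId, product].
function lookupProduct(transactionId, standardRows) {
  for (var i = 0; i < standardRows.length; i++) {
    if (standardRows[i][0] === transactionId) {
      return standardRows[i][1]; // 2nd column: product name
    }
  }
  return 'No Product Info'; // mirrors the IFERROR fallback
}
```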

And your data set is ready for pivoting (some reformatting may be required).

Step 4 – Pivoting

My advice, as Google Sheets pivoting is a bit limiting, is to export the Reporting sheet to CSV and import it all into Excel (much easier to group and so on).


Select the data range > Create the pivot table and explore:

10 Pivot start


Quick tips to evaluate AdWords based on Profit in Google Analytics

In this article I will try to explain 2 methods which allow a more ‘accurate’ base for evaluating AdWords efforts. A word of caution: we are still within the last non-direct attribution model, so once you apply either of the 2 methods, go back to the MCF reports in order to see the cross-campaign AdWords influence.

Profit using Calculated metrics

In this case we will use the dreaded ‘on average the client has a 10% margin and in addition pays a fee of 15% on top of AdWords spend’. The end result is something in line with Net Profit = (Revenue * 0.1) – (AdWords spend * 1.15) – note that this is by no means an accurate representation of your profits, yet it is definitely a step closer than, for instance, basing everything on ROAS (the reasons why not ROAS are nicely described here –). For this method you need:


  1. Ecommerce tracking
  2. AdWords and Analytics are connected
  3. Understand this is only applicable where your margin is evenly distributed through your entire assortment (no big outliers allowed:))

Step 1 

Go to View settings > Calculated metrics:

Create a new metric using this formula (name it however you want but use Currency for formatting):

{{Product Revenue}} * 0.1 – {{Cost}} * 1.15

Where * 0.1 converts revenue to gross profit based on the 10% margin, and * 1.15 adds the 15% fee on top of the Cost.

There is no step 2:) 

Well there actually is but it is just creating a Custom report where you use the newly created metric. If you like the idea of having these calculated metrics in your reports there are some others you can easily add:

  • Profit per transaction (Currency) – ( {{Product Revenue}} * 0.1 – {{Cost}} * 1.15 ) / {{Transactions}}
  • POAS (Percentage or Currency) – ( {{Product Revenue}} * 0.1 – {{Cost}} * 1.15 ) / ( {{Cost}} * 1.15 )
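The same arithmetic as a quick JavaScript sketch, with the example’s 10% margin and 15% fee and the POAS denominator being the full fee-adjusted cost (illustrative numbers only):

```javascript
// Sketch of the calculated-metric arithmetic, assuming a 10% margin and a
// 15% agency fee on AdWords spend, as in the example above.
function netProfit(productRevenue, cost) {
  return productRevenue * 0.1 - cost * 1.15;
}

function profitPerTransaction(productRevenue, cost, transactions) {
  return netProfit(productRevenue, cost) / transactions;
}

function poas(productRevenue, cost) {
  // Note the parentheses: the denominator is the whole fee-adjusted cost
  return netProfit(productRevenue, cost) / (cost * 1.15);
}
```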

Important! – you need to decide whether you are going to use Revenue or Product Revenue, as the ingredients differ across GA ecommerce implementations. Revenue may include shipping, taxes and similar, while Product Revenue may hold only a fraction of it.

The downside of using calculated metrics is the number of calculated metrics available (5 per view for standard properties), and that these can currently only be used inside non-standard reports.

The end result is usually a custom report.

You may notice a clear difference when comparing ROAS with a clear profit metric – it is much, much easier to scan the report, IMHO.

Profit using data import

MAJOR UPDATE – thanks to Yehoshua Coren!

When using Enhanced Ecommerce, if you use data import and widen on the product level, you need to take some additional implementation steps in order to stop inflating the custom metric value. By default this would happen:

Product Click event > custom metric set to the value from the data import table
Product Add To Cart event > custom metric set to the value from the data import table
And so on – each EE event passed to GA would increase the total product profit continuously (unfortunately this is how EE works).

So before you do any import, your dataLayer pushes need to contain the custom metric with a value of 0 – note the metric index number:

'metric3': 0

For all non purchase event hits such as product impressions:

dataLayer.push({
  'event': 'trackImpressions',
  'ecommerce': {
    'impressions': [
      {
        'name': 'Triblend Android T-Shirt',
        'id': '12345',
        'price': '15.25',
        'brand': 'Google',
        'category': 'Apparel',
        'variant': 'Gray',
        'list': 'Search Results',
        'position': 1,
        'metric3': 0
      },
      {
        'name': 'Donut Friday Scented T-Shirt',
        'id': '67890',
        'price': '33.75',
        'brand': 'Google',
        'category': 'Apparel',
        'variant': 'Black',
        'list': 'Search Results',
        'position': 2,
        'metric3': 0
      }
    ]
  }
});

For the purchase event ‘metric3’: 0 should be deleted!

As mentioned in passing in the first method – data import can and should be used if there are high margin deviations inside your product inventory. The basis for this method is the product id / SKU. Importing per-product profits makes this method much more reliable in terms of the end result – the calculated metric becomes Profit (custom metric) – AdWords spend (Cost). As an additional benefit, Cost is not only applied from AdWords but from all Cost Data import sources as well (Facebook, Yandex and similar).

In standard Analytics, profit import works as processing-time data widening – in short, when a transaction containing products gets recorded in GA, and the widening scheme (data import) exists, each hit gets widened with a custom metric / dimension of your choice. In this case we are using the Product Data import type, which uses product-scoped dimensions and metrics – and we need to use Enhanced Ecommerce.

Step 1 – Custom metric

One of the differences between Custom and Calculated metrics is that Calculated ones are defined on the view level and Custom ones on the property level.

Define the Gross profit Custom metric in the GA admin:

Custom metric product scope

Step 2 – Create the data import data set 

Just make sure you choose Product Data Import and use the following settings:

Product DataImport screen

We need to widen the product info with the Gross profit metric, which we will import using a .csv file, with the product SKU as the primary key – SELECT NO FOR OVERWRITE HIT DATA:

Once you have this solved, you need to get the data from your business system and export the data set as a .csv file containing, in this case, only two columns of info:

csv data import
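A minimal sketch of such a file, with hypothetical SKUs and profit values – the exact header names come from the data set schema you defined in step 2, and the metric column must use the index of your Gross profit custom metric (metric3 in this example):

```
ga:productSku,ga:metric3
SKU-0001,4.50
SKU-0002,12.20
```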

Upload the data either from the GA interface or using the API.


Step 3 – create the calculated metric(s)

This procedure is the same as in the initial calculated metrics method; the only difference is that instead of Revenue / Product Revenue we use the Gross profit custom metric.

  • Net Profit (Currency) – {{Gross profit}} – {{Cost}} * 1.15
  • Profit per transaction (Currency) – ( {{Gross profit}} – {{Cost}} * 1.15 ) / {{Transactions}}
  • POAS (Percentage or Currency) – ( {{Gross profit}} – {{Cost}} * 1.15 ) / ( {{Cost}} * 1.15 )

The application of these metrics is exactly the same as in the first method – create a custom report, or pull all the data out and use some 3rd party tool for additional analysis and visualizations.