The Only Two Metrics That Actually Matter In Advertising

There are only two metrics that actually matter in advertising: cost and revenue. Cost is readily available in all ad platforms, but revenue can be difficult and costly to obtain. Because of this, many agencies and marketing departments fail to track this crucial metric, leaving them to spend client dollars blindly.

Advertising’s Sole Purpose Is to Drive Sales

It’s easy to lose focus on the ultimate goal of sales with the staggering amount of data relating to bounces, traffic, impressions, view counts, likes, comments, shares, and heart reactions. But, bounce rates become irrelevant when your content doesn’t sell. Traffic doesn’t matter when your website doesn’t convert. Facebook page likes are completely meaningless when people don’t click your posts.

It’s not that marketing VPs, account executives, and creative directors don’t agree with the sentiment that advertising should drive sales, it’s that they fall into the trap of trying to derive value from sources where there is none. 

This phenomenon goes back a lot further than just the digital age, as made evident by this David Ogilvy clip from several decades ago:

Ogilvy praises direct advertisers for their ability to measure results “to a dollar.” While the entire 7-minute speech is worth a listen, the relevant sentiment is best summed up here:

You direct response people know what kind of advertising works and what doesn’t work, you know to a dollar…The general advertisers and their agencies know almost nothing for sure because they cannot measure the results of their advertising.

We recently worked with a client who had been sold (and burned) by another agency that Ogilvy might call general. The client had entrusted the agency with a 6-figure ad buy promoting a single event, and they were rightfully furious that the agency was unable to tell them how much revenue had been generated after the campaign was done (though, I’m sure the agency’s accountant had no problem telling the client how much money they’d spent). That client contracted Flint Analytics to fix and automate their revenue attribution, making tracking a non-issue for future campaigns.

Learn how Flint Analytics may be able to help you solve your company’s revenue attribution problems.

We believe there is a certain amount of negligence and irresponsibility in spending client dollars without having the slightest idea how much value that client received in return. For ecommerce, you can and should know, down to the dollar, your earnings from each individual advertising source. For clients with longer sales cycles involving multiple consultations and touchpoints, you can and should know, at a minimum, the source of every single lead generated. If you cannot report on the true business impact of your spend, you are doing your client a disservice.

If you’re the type who can sell a ketchup popsicle to a woman in white gloves, and you use that ability to convince clients to spend money on the idea that weak metrics like social follows or heart emojis on posts are absolutely good for business, you’re sitting somewhere between misguided and dishonest.

The Narrative Fallacy

To be fair, a lot of marketing professionals seem to honestly and wholeheartedly believe that increased pageviews, lower bounce rates, higher domain authority, and more social engagement definitively (key word) lead to more sales. This is innocent enough. When agencies and marketing departments are technically incapable of tracking actual sales results, they tend to justify their strategies, actions, and budgets by taking any stats they can — no matter their relevance — and creating causal links where they might not actually exist. In his book The Black Swan, Nassim Taleb writes extensively on this practice, referring to it as the narrative fallacy. He writes:

The narrative fallacy addresses our limited ability to look at sequences of facts without weaving an explanation into them, or, equivalently, forcing a logical link, an arrow of relationship, upon them. Explanations bind facts together. They make them all the more easily remembered; they help them make more sense. Where this propensity can go wrong is when it increases our impression of understanding.

Think about that in the context of digital marketing: sometimes all we have are sequences of facts in the form of sessions, bounce rates, engagement, view counts, impressions, and more. It’s information overload, and our job security is partially dependent on convincing clients and superiors that we can make sense of it all.

Any marketer with a basic Google Analytics setup — but no way to tie revenue back to its source — is probably guilty of committing a narrative fallacy. If quarterly sales increased, they might tell their CEO that increased web traffic from clicks on a Facebook campaign was the root cause of the increase in sales. If this marketer were an agency pitching a potential client whose sales had been dropping, they might point to a lack of advertising budget as the root cause. The same goes for anyone pitching or justifying SEO work, a creative strategy, or social media management: these are all things that can help business, but they do not definitively do so.

This is why revenue tracking is so important: it eliminates the human tendency to rely on biases and allows you to focus on identifying marketing practices that have real, valuable business impact. It is only through knowing how much money your efforts made that you can accurately analyze if the money you spent was worth it.

Need Help with Revenue Tracking?

The technical process of tracking revenue is extensive and worthy of volumes, which is why we’ve limited this article mostly to the importance of this data point. If your company or your client is struggling with revenue attribution, contact us to schedule a free consultation.

Integrating Google Enhanced Ecommerce and the Facebook Pixel

This article is for those who:

  • use Google Tag Manager and the data layer to implement enhanced ecommerce (see our handy guide),
  • use Google Tag Manager to manage their Facebook Pixel, and
  • are interested in running dynamic retargeting ads for a Facebook Catalog.

In order to run dynamic retargeting ads, you will have to send your existing enhanced ecommerce data into Facebook in the format those ads require. At a minimum, you need to send the associated parameters for the ViewContent, AddToCart, and Purchase Facebook events. Each parameter can be extracted from an existing enhanced ecommerce setup.

Let’s Begin With Some User Defined Variables

For starters, you’ll need to create a few user-defined variables in Google Tag Manager that will extract data from the enhanced ecommerce code and translate it into a format readable by Facebook.

Variable 1: FB – Product Detail ID

This first variable will pull the Product ID from a product detail impression. This will be used in both your ViewContent and AddToCart tags. The configuration for this is fairly simple: just create a data layer variable type named FB – Product Detail ID with the following settings:

Variable 2: FB – Revenue

Then, you will need to send a value along with the Purchase event so you can properly track and optimize for revenue within Facebook. This can be extracted directly from the enhanced ecommerce purchase array that you should get every time a transaction occurs.

To do this, create a data layer variable type with the name FB – Revenue using the following settings:

Variable 3: FB – ecommerce.purchase.products / FB – Purchase ID Array

Additionally, you will need to send an array of product IDs to track which products were purchased by Facebook users. But to get to the end variable, we will actually need to create another variable first.

Therefore, create a data layer variable called FB – ecommerce.purchase.products with the following configuration:

This will pull the individual product IDs from your purchase array.
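For reference, the sketch below shows where each of these three values typically lives, assuming the standard enhanced ecommerce data layer schema (your site's pushes may differ):

```javascript
// Sketch of the enhanced ecommerce objects the three GTM variables read from.
// Paths follow the standard enhanced ecommerce schema -- verify against your
// own dataLayer pushes before relying on them.
var detailPush = {
  ecommerce: {
    detail: {
      // FB – Product Detail ID reads ecommerce.detail.products.0.id
      products: [{ id: 'SKU-123' }]
    }
  }
};

var purchasePush = {
  ecommerce: {
    purchase: {
      // FB – Revenue reads ecommerce.purchase.actionField.revenue
      actionField: { revenue: '49.99' },
      // FB – ecommerce.purchase.products reads ecommerce.purchase.products
      products: [{ id: 'SKU-123' }, { id: 'SKU-456' }]
    }
  }
};
```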

Next, create a custom JavaScript variable in Google Tag Manager called FB – Purchase ID Array using the following script:

function() {
  var products = {{FB – ecommerce.purchase.products}};
  // Collect the id of every purchased product into a single array.
  return products.reduce(function(arr, prod) {
    return arr.concat(prod.id);
  }, []);
}
This will create an array of product IDs for all products that are purchased that can be passed into Facebook.

Now You Can Configure Some Tags

Using these variables, you can now configure tags for the Facebook events ViewContent, AddToCart, and Purchase.

REMEMBER: Use advanced settings to make sure that your base Facebook Pixel fires before any of these tags, otherwise you risk losing data.

Tag 1: ViewContent

You can now create a custom HTML tag for sending a ViewContent event into Facebook. Use the following code:

<script>
  fbq('track', 'ViewContent', {
    content_ids: ['{{FB – Product Detail ID}}'],
    content_type: 'product'
  });
</script>
<noscript>
  <img height="1" width="1" alt="" style="display:none"
  src="https://www.facebook.com/tr?id=<INSERT_FB_ID>&ev=ViewContent&cd[content_ids]={{FB – Product Detail ID}}&cd[content_type]=product"/>
</noscript>

As far as triggering goes, that is up to you. We typically set this as a pageview on product pages only, so that tracking is limited to meaningful content views.

Tag 2: AddToCart

Your AddToCart tag should be triggered by your existing enhanced ecommerce addToCart trigger. Use the following code for this tag:

<script>
  fbq('track', 'AddToCart', {
    content_ids: ['{{FB – Product Detail ID}}'],
    content_type: 'product'
  });
</script>
<noscript>
  <img height="1" width="1" alt="" style="display:none"
  src="https://www.facebook.com/tr?id=<INSERT_FB_ID>&ev=AddToCart&cd[content_ids]={{FB – Product Detail ID}}&cd[content_type]=product"/>
</noscript>

Tag 3: Purchase

Your Purchase tag will use the following code:

<script>
  fbq('track', 'Purchase', {
    content_ids: [{{FB – Purchase ID Array}}],
    content_type: 'product',
    value: {{FB – Revenue}},
    currency: 'USD'
  });
</script>
<noscript>
  <img height="1" width="1" alt="" style="display:none"
  src="https://www.facebook.com/tr?id=<INSERT_FB_ID>&ev=Purchase&cd[content_ids]=[{{FB – Purchase ID Array}}]&cd[content_type]=product&cd[value]={{FB – Revenue}}&cd[currency]=USD"/>
</noscript>

In enhanced ecommerce, transactions are not events. Instead, they are actions defined in an array (as explained here). For triggering, just use whatever identifier confirms a transaction has occurred (e.g., a confirmation page view or a button click event).


These methods will not only save you time in setting up your Facebook Pixel for dynamic retargeting, they will also allow you to more easily integrate your Google and Facebook advertising strategies (which is also the subject of our next analytics article!).

Need Help with Your Analytics Integration?

Flint Analytics can help your company execute tough analytics setups like this. If you want help, contact us for a free consultation today!

Math for Marketers: Quadratic Trendlines


Disclaimer: We’re going to be using some calculus and linear regression here. The math can be a bit boring, so bear with me. We’ll get to the fun applied part after getting through some of the need-to-knows. If you’d like to skip the theory and go straight to the application, click here.


At Flint Analytics, we specialize in hyper-local marketing strategies for multi-location businesses. What that means is that we utilize geographic siloing to a large degree: every city we’re marketing to kind of exists in its own little strategy bubble.

While this detailed level of attention yields fantastic results for our clients, it can make my job as an analyst very difficult. Things may look great for the client as a whole, but how do we sift through the data quickly and efficiently to know when things might be going wrong in, say, just the Oklahoma City market?

There are several different methods for accomplishing this goal. Today we’re going to discuss how to use quadratic trendlines and their mathematical properties to quickly and efficiently identify potential problems in your multi-location data.


What is a Quadratic Trendline?

Generally, a quadratic trendline is a second-order polynomial which attempts to best fit a set of data. The equation will look something like this:

y = ax² + bx + c

In our application, the x-value will be a measure of time like {1, 2, …, n}, and the y-value will be our KPI (sessions, leads, organic traffic, etc). The most important part of this equation is a, the coefficient on the quadratic term, as it allows us to do some interesting analysis with regards to current trends and changes over time.

Let’s say you’re analysing traffic trends over a period of 60 days and your best fit quadratic trendline is:

In your data, the x-values are days {1, 2, …, n}, and your y-values are the actual session counts for each day. Graphing this equation alongside the actual data looks like this:


The blue line is the actual day-to-day data, and the red line is the quadratic trendline. You can easily notice two things:

  1. The day-to-day data fluctuates periodically every 7 or so days, suggesting some weekly trends.
  2. The trend line hits a low point somewhere in the late 20s or early 30s.

With the periodicity of the day-to-day data, it’s very difficult to visualize whether things are trending in a good direction over time. One thing we could do is find the slope of the best fit line without a quadratic term. In this case, the fitted slope comes out to .00286.

The positive slope tells us that things are technically trending upward over 60 days, but .00286 is practically 0, which doesn’t really give us a lot to work with. This basically says “things are going up…barely.” The reason the quadratic equation gives us so much more to work with is that it has a critical point.

Warning: Calculus Ahead

Let’s bring back the general quadratic equation, y = ax² + bx + c, and take its first and second derivatives:

y’ = 2ax + b

y’’ = 2a

Setting the first derivative equal to 0 and solving for x gives a critical point at x = -b/(2a), which for our trendline falls somewhere around day 30.

From the second derivative, we know that the equation is convex (our fitted a is positive). Recall these basic rules:

If y’’ > 0, y is convex

If y’’ < 0, y is concave

A concave function looks like a sweet Peyton Manning touchdown pass (it goes up then comes down), while a convex function is like a skater dropping into a halfpipe (you get it). Or, more precisely:


So we know that our trendline is convex and that it has a critical point around day 30. What that means from an analytics standpoint is that during the 60 day period of analysis, things had started off trending in a negative direction before they began trending positively around day 30. Maybe we started a new ad campaign on day 30? Maybe our luck turned around? The answer is irrelevant to the issue at hand: what’s important is that we now have a mathematical way of breaking out trends into two date ranges separated by a critical point.


Interpreting Trends with Convex and Concave Curves

At any specific point in time, a quadratic trend can exist in 1 of 6 states, visualized below.



If a trend is in the Negative, Decreasing state, it is losing value over time but at a decreasing rate. This is important, because it tells us that each successive day is losing less value than it lost the day before and the trend is approaching a critical point. When the trend passes the critical point, it will generally move into the Positive, Increasing state, which is commonly referred to as hockey stick growth.

If a trend is in the Positive, Decreasing state, it is experiencing diminishing returns. A value in this state should be monitored for future changes, as passing the critical point will take it into the Negative, Increasing state. In the latter state, value is diminishing at an increasing rate over time, resembling a nosedive.

If your trend is in the Minima state of a convex curve, it is expected to begin hockey stick growth the next day. If your trend is in the Maxima state of a concave curve, it will begin its nosedive the next day.


Translating the Math to Google Sheets or Excel

Let’s take our data and create a report like this in Excel or Google Sheets:


This report allows us to quickly visualize the trends of our target cities. Each individual location will require the same analysis that I outline below, so as you’re reading, keep in mind that you will have to replicate this process multiple times.

Recall that we will need the actual day-to-day data, a set of days relative to 1, and the squared days. So, our data may look like this, where the grey-shaded cells correspond to the cell column and row references:

Data Table

In another table, we will estimate the quadratic trendline using the linest() function, as well as perform the calculations necessary to find the critical point and the trend. Let’s say that we’re placing our data manipulations on the same sheet as the data table, starting at row 5. Our output for, say, Oklahoma City, will look like this:

Data Manipulations

Before getting into the calculations, let me explain the data contained in each column.

  • x² – the coefficient on the quadratic term in the linear regression
  • x – the coefficient on the x term in the linear regression
  • CP – the critical point calculated from the linear regression
  • Shape – whether the curve created by the linear regression is convex or concave
  • Trend State – whether we are currently seeing the metric of interest increase or decrease and the acceleration of the change

Now, let’s look at how to generate these calculations.

The Linear Regression (Columns B and C)

Here we use the linest() function to generate a regression equation in which Total Traffic is the dependent variable and Day and Day2 are the independent variables. The linest() function spreads each coefficient across several cells, so in order to contain our output to the first two coefficients, we use the index() function as such:

Cell B6


Cell C6
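In general form, assuming (hypothetically) that Total Traffic sits in B2:B61 and the Day and Day² columns sit in C2:C61 and D2:D61, the two formulas would be as follows. Note that linest() returns coefficients in reverse column order, which is why index 1 picks out the quadratic coefficient:

```
Cell B6:  =index(linest(B2:B61, C2:D61), 1, 1)
Cell C6:  =index(linest(B2:B61, C2:D61), 1, 2)
```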


The Critical Point (Column D)

Return to the general form of our quadratic equation, y = ax² + bx + c.

In order to find the critical point for this equation, you need to take the derivative, set the derivative equal to 0, and solve for x. For any second-order polynomial, this will always work out to:

x = -b / (2a)

In our application, with the quadratic coefficient in column B and the x coefficient in column C, this general form translates to -C6 / (2*B6).

In order to simplify analysis down the line, I like to add the round() function to this equation:

Cell D6

=round(-C6/(2*B6), 0)

Curve Shape (Column E)

Curve shape is determined entirely by the sign of the quadratic term, and can be calculated very simply with a nested IF statement:

Cell E6

=if(B6>0, "Convex", if(B6<0, "Concave", "Linear"))

Note that we have added the condition Linear as well. It’s very rare that trend data will be absolutely flat, but this would occur if the coefficient on the quadratic term was 0. This is a catchall for that extremely rare case only, and in practice, you will most likely never see any truly linear trends with this method.

Trend State (Column F)

Finally, we get to the point of this entire debacle: estimating the trend state of our data. Recall the 6 states we can be in, depending on the shape of the curve:

  1. Negative, Decreasing
  2. Positive, Increasing
  3. Minima
  4. Positive, Decreasing
  5. Negative, Increasing
  6. Maxima

These states are based entirely on the current time relative to the critical point and the sign of the coefficient on the quadratic term. We have calculated all of this data, so now we can simply use a sequence of if statements to determine the state.

Cell F6

=if(and(E6="Convex", 60<D6), "Negative, Decreasing",
if(and(E6="Convex", 60>D6), "Positive, Increasing",
if(and(E6="Convex", 60=D6), "Minima",
if(and(E6="Concave", 60<D6), "Positive, Decreasing",
if(and(E6="Concave", 60>D6), "Negative, Increasing",
if(and(E6="Concave", 60=D6), "Maxima", "Linear"))))))

Note that I’m using the number 60 as a placeholder for the current date, since we are using rolling 60-day data for this example. This can be replaced with any variable that suits your analysis.
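If you’d like to prototype this logic outside of a spreadsheet, the whole pipeline (quadratic fit, critical point, shape, and trend state) can be sketched in JavaScript. This is an illustrative sketch, not the exact spreadsheet above: it fits y = ax² + bx + c by solving the normal equations directly, then classifies the trend the same way as columns B through F.

```javascript
// Least-squares fit of y = a*x^2 + b*x + c for y-values at x = 1..n.
function fitQuadratic(ys) {
  var n = ys.length;
  var S = [0, 0, 0, 0, 0]; // sums of x^0 .. x^4
  var T = [0, 0, 0];       // sums of y*x^0 .. y*x^2
  for (var i = 0; i < n; i++) {
    var x = i + 1, y = ys[i];
    for (var p = 0; p <= 4; p++) S[p] += Math.pow(x, p);
    for (p = 0; p <= 2; p++) T[p] += y * Math.pow(x, p);
  }
  // Normal equations, rows ordered so the solution comes out as [a, b, c].
  var M = [
    [S[4], S[3], S[2], T[2]],
    [S[3], S[2], S[1], T[1]],
    [S[2], S[1], S[0], T[0]]
  ];
  // Gauss-Jordan elimination with partial pivoting.
  for (var col = 0; col < 3; col++) {
    var piv = col;
    for (var r = col + 1; r < 3; r++)
      if (Math.abs(M[r][col]) > Math.abs(M[piv][col])) piv = r;
    var tmp = M[col]; M[col] = M[piv]; M[piv] = tmp;
    for (r = 0; r < 3; r++) {
      if (r === col) continue;
      var f = M[r][col] / M[col][col];
      for (var c2 = col; c2 < 4; c2++) M[r][c2] -= f * M[col][c2];
    }
  }
  return { a: M[0][3] / M[0][0], b: M[1][3] / M[1][1], c: M[2][3] / M[2][2] };
}

// Classify the trend as of the latest day, mirroring columns D, E, and F.
function trendState(ys) {
  var fit = fitQuadratic(ys);
  var cp = Math.round(-fit.b / (2 * fit.a)); // critical point, as in cell D6
  var shape = fit.a > 0 ? 'Convex' : (fit.a < 0 ? 'Concave' : 'Linear');
  var today = ys.length;                     // the "60" in the formulas above
  var state;
  if (shape === 'Convex')
    state = today < cp ? 'Negative, Decreasing'
          : (today > cp ? 'Positive, Increasing' : 'Minima');
  else if (shape === 'Concave')
    state = today < cp ? 'Positive, Decreasing'
          : (today > cp ? 'Negative, Increasing' : 'Maxima');
  else
    state = 'Linear';
  return { criticalPoint: cp, shape: shape, state: state };
}
```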

I spend the majority of my days staring at spreadsheets trying to figure out better ways of organizing and visualizing complex marketing data. If you have an analysis problem you’d like more help on, feel free to email me at or call (317) 993-3411.

Let us solve your analytics problem.

13 Fans Will Be Arrested If The Steelers Win This Weekend


We can’t tell you which teams will win this weekend’s AFC and NFC championship games, but we can tell you how many people we think will be arrested at each. In Atlanta, if the Falcons win, 1 fan will be arrested. If Green Bay wins, 3 fans will be arrested. In Foxborough, if the Patriots win, 11 fans will be arrested. And if the Steelers win, 13 fans will be arrested.

How We Made These Predictions

In a grad level econometrics class at Purdue, my classmates and I were tasked with presenting statistical findings from socioeconomic data. While most economists are busy figuring out how to predict the next financial crisis, solve poverty, or predict how many people will buy their company’s next product, I was more interested in what I might be able to learn about my favorite rowdy NFL fans from the Washington Post’s NFL arrests dataset.

Kent Babb and Steven Rich of the Washington Post organized public record requests from police departments that oversaw NFL stadium security between 2011 and 2015, and they provided the total number of arrests made at each game along with other game-specific stats like the time of day that the game was played, the final scores of the home and away team, and whether or not it was a division game or went into overtime. Of the 31 jurisdictions in which there is an NFL stadium (note that the Giants and the Jets reside in the same jurisdiction), Cleveland and New Orleans were the only precincts that did not submit any data at all. Buffalo, Miami, and Oakland provided only partial records and had to be omitted from the data set, which honestly kind of sucks, because we know Oakland would have the GOAT arrest totals. Precincts in Detroit, Minneapolis, and Atlanta also excluded parking lot arrests, which means we might be missing a few booze-fueled tailgating arrests from this study.

I wanted to create a model to predict the chances of getting arrested at an NFL game. The Washington Post data set gave me a few of the pieces of the puzzle in knowing whether or not the home team won and the time of the game, but I had a few other questions.

  1. Did higher attendance relative to stadium capacity lead to rowdier fans and, in turn, more arrests?
  2. Did people tend to drink more on a hot day and find themselves in the tank more often than cold days?
  3. Did controlling for local crime rates change the probability of being arrested?

In order to answer these questions, I merged data on game attendance from Pro-Football, NFL stadium seating capacity from Wikipedia, game time weather records, and the FBI’s Uniform Crime Report with the Washington Post dataset to come up with the best overall look at local conditions during an NFL game that I could.

The Model

The model I generated from this data tells us that the probability of being arrested at an NFL game is higher if:

  1. The home team loses
  2. The game starts later
  3. Attendance is low relative to stadium capacity
  4. The weather is hot
  5. The local violent crime rate is high
  6. The local property crime rate is low 

If the model relied on factors 1, 3, and 4 alone, Jacksonville would have this one in the bag!


I’ll explain the math used with broad strokes here, but if you really want the details, here’s the original term paper I submitted to Purdue. It should be noted that I received a 98% on the paper, which is enough of a confidence boost for me to publicly reveal this model. It should also be noted that my mom has yet to hang this term paper on the fridge.

The model was estimated using linear regression, and here is the primary equation:

arrest2attend = .0023167 – .000024·hometeamwin + .0001295·gametime – .000265·attend2capac + .0000514·ltemp + .0000508·lhomeviocrmrt – .0003217·lhomepropcrmrt

Let’s also take a second to explain each variable in the model:

  1. arrest2attend = total arrests at an NFL game divided by the total attendance for that game
  2. hometeamwin = 1 if the home team won, 0 if they lost
  3. gametime = the local time that the game was played expressed as a fraction of 1 (i.e. if the game had started at noon, it would be represented as .50 because noon is half way through the day)
  4. attend2capac = total game attendance divided by the stadium’s capacity
  5. ltemp = the natural log of the temperature (in Fahrenheit) during game play
  6. lhomeviocrmrt = the natural log of the local violent crime rate
  7. lhomepropcrmrt = the natural log of the local property crime rate
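As a rough sketch, the equation above can be wrapped in a small function for experimentation (the parameter object and its field names are my own shorthand for the variables defined above):

```javascript
// Predicted arrests-per-attendee (arrest2attend) from the regression above.
function arrest2attend(g) {
  return 0.0023167
    - 0.000024  * g.hometeamwin          // 1 if the home team won, 0 if not
    + 0.0001295 * g.gametime             // local start time as a fraction of the day
    - 0.000265  * g.attend2capac         // attendance / stadium capacity
    + 0.0000514 * Math.log(g.temp)       // ltemp: ln of game-time temperature (F)
    + 0.0000508 * Math.log(g.viocrmrt)   // lhomeviocrmrt: ln of violent crime rate
    - 0.0003217 * Math.log(g.propcrmrt); // lhomepropcrmrt: ln of property crime rate
}
```

Multiplying the result by attendance gives an expected arrest count; note, for instance, that the negative coefficient on hometeamwin means a home win lowers the predicted rate.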

For the stats nerds out there, this model had an R-Squared of .2885, and the heteroskedastic robust standard errors for each independent variable implied statistical significance to at least the 10% level. For those who hated stats, this model was, scientifically speaking, not bad.


The implications of the model actually make sense. If the home team loses, and we assume the majority of the crowd at any game is there to support the home team, we can expect a few upset fans to act irrationally, or, criminally. If the game starts later, there’s more time for tailgating and all the fun that comes with that.

The fact that lower relative attendance implied a higher probability of arrests may not reveal anything about social conditions at a game, but more about the nature of statistics. Let’s say 5 people each are arrested at 2 different games attended by 40,000 and 50,000 people, respectively. The chance of being arrested (arrests divided by attendance) at the lower-attended game is 5/40,000 = .0125%, which is 25% higher than the 5/50,000 = .0100% chance at the higher-attended game, even though the total number of arrests was the same.

The weather is another interesting factor. Anecdotally, sun and heat lead to day drinking (naturally, Green Bay residents may be the exception). Heat can lead to higher instances of dehydration, which may lead to higher cases of public intoxication. But, this could also be a testament to regional cultures. Perhaps police forces in the south, where it’s warmer, are more likely to make arrests than in the north. Or maybe fans in warmer regions really are just rowdier than the north (though, back to our data omissions, we’d really like to officially verify this against Oakland and Buffalo).

With crime rates, one might attribute the difference in violent crime rates and property crime rates to police force allocation. In a region with high relative violent crime rates, precincts may be more likely to send officers into stadiums in an attempt to curb violence (our model thanks Philadelphia for not making the playoffs this year). In an area with higher relative property crime rates, precincts may be more likely to keep officers on the streets to deter theft and vandalism while the city is otherwise preoccupied with the game.

It should be noted that while the Washington Post’s data set included data from 2011-2015, I was only able to retrieve weather data from 2011-2013. So, this model is based on game data that is several years old, and should be taken with a grain of salt. That said, we used this model to predict the number of arrests that might occur this weekend at each conference championship game.

Predicting the Number of Arrests at This Weekend’s AFC and NFC Championship Games

We had to look at two outcomes for each game: one where the home team won, and one where they lost. From there, we just filled in the blanks on the model provided above. Green Bay at Atlanta is set to start at 3:05pm local (.628 in our time units), and Pittsburgh at New England is set to start at 6:40pm local (.778). Being conference championship games, we expect sellout crowds, so we set the attendance-to-capacity parameter to 1. The current game time forecast (as of the morning of January 19th, 2017) for Atlanta is 68 degrees Fahrenheit, and the same for Foxborough is 47. In Atlanta, the average violent crime rate for the last 4 years has been 399.3 for every 100,000 residents, and the property crime rate has been 3,374.4 out of 100,000. In Boston, the violent crime rate has been 507.3 out of 100,000, and the property crime rate has been 2,208 out of 100,000.

Plugging all of these numbers into the model above, and multiplying the expected probability of being arrested at each game by the expected attendance, we arrived at the predictions stated in the intro.
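Plugging the article's inputs into the model can be sketched in a few lines. Note that the attendance figures below (71,000 for the Georgia Dome and 66,829 for Gillette Stadium) are my own sellout assumptions, not numbers stated in the article:

```javascript
// Predicted arrest counts from the regression equation, assuming a sellout
// (attend2capac = 1). Attendance figures are assumed stadium capacities.
function predictArrests(win, gametime, temp, vio, prop, attendance) {
  var rate = 0.0023167
    - 0.000024  * win
    + 0.0001295 * gametime
    - 0.000265  * 1                 // attend2capac = 1 for a sellout
    + 0.0000514 * Math.log(temp)
    + 0.0000508 * Math.log(vio)
    - 0.0003217 * Math.log(prop);
  return Math.round(rate * attendance);
}

var falconsWin  = predictArrests(1, 0.628, 68, 399.3, 3374.4, 71000); // 1
var packersWin  = predictArrests(0, 0.628, 68, 399.3, 3374.4, 71000); // 3
var patriotsWin = predictArrests(1, 0.778, 47, 507.3, 2208, 66829);   // 11
var steelersWin = predictArrests(0, 0.778, 47, 507.3, 2208, 66829);   // 13
```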

The results table looks something like this:

Game                         Home team wins    Home team loses
Green Bay at Atlanta               1                  3
Pittsburgh at New England         11                 13



More than likely, the arrest figures we predicted here will not hold since we’re using regular season data obtained from 2011-2013 to predict events at postseason games played in 2017. In actuality, arrests may be higher due to the playoff atmosphere and likely larger security and police presence. Further, linear regression is not a perfect science, and this model cannot account for unexpected events like fan rioting, or some goofball in a hat and a red shirt showing up. Regardless, since the variables in the model make sense from a socioeconomic standpoint, it will be interesting to see how our predictions for this weekend’s game time arrests hold up in practice!

Let us solve your analytics problem.

Creating a Custom Facebook Audience by Passing Variables from Your Website with Google Tag Manager


Let’s say your company owns 10 car dealerships in 3 different cities. Dealerships A, B, & C are located in City 1, Dealerships D, E, & F are located in City 2, and Dealerships G, H, I, & J are located in City 3. Each of these dealerships offers 3 types of car: Trucks, Sedans, and Minivans.

Let’s say a potential customer visits Dealership A’s website and looks at a Sedan, but does not convert. We might assume that the user is looking for a Sedan in City 1, but has not yet made a decision on which Sedan to buy. Using a custom Facebook audience, we could show this user an ad containing links to the different Sedan offerings at Dealerships A, B & C, since they are all located in City 1. Using the Data Layer and Facebook’s Conversion Pixel, this is actually a fairly simple setup.

Define Variables with Data Layer and Google Tag Manager

For our example, we really only need to know two things:

1. The city where the dealership is located
2. The car type that the user viewed

Let’s define these variables as dealerCity and carType. You will need to pass these variables to the data layer, which is a subject you can learn more about here.

Now you will need to create these variables within Google Tag Manager. Navigate to the User-Defined Variables box in the Variables tab, and click New. Assuming that the values of these variables will be available in your site’s data layer, you can choose the Data Layer Variable configuration type, and simply input the variable name in the field.
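As a sketch, a vehicle detail page might push these two values like so (the values shown are hypothetical):

```javascript
// Hypothetical push from a vehicle detail page. GTM's Data Layer Variables
// named "dealerCity" and "carType" read these keys directly.
var dataLayer = dataLayer || []; // window.dataLayer in the browser
dataLayer.push({
  dealerCity: 'City 1',
  carType: 'Sedan'
});
```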

Note: If you are having trouble getting your data layer to populate properly, but you can define either of these variables with URL information (such as a query string containing something like ?dealerCity=Indianapolis&carType=Sedan), then you might try creating a lookup table instead of a data layer variable. You can read more about the lookup table macro here and here.

Facebook Pixel Tag

If you have not yet created your Facebook pixel, read how to do so here.

Now you’ll need to add your Facebook Pixel as a tag in GTM. Create a new tag and name this tag Facebook Pixel All Pages, or, whatever naming convention you’d like to use to indicate that this is your base tag for Facebook. Paste the code generated by Facebook into a Custom HTML Tag Configuration, and set this tag to fire on all pages. This will ensure that Facebook’s pixel is always active and ready to pass data back to your Facebook Ad Account.

Tag for Custom Facebook Event

Now we want to create a custom event tag that passes your custom data into Facebook. Facebook’s pixel uses the fbq() call to send event data: standard events are sent with fbq('track', …), while non-standard events like ours use fbq('trackCustom', …). (Facebook supports 9 standard events, which you can read about here.)

The code below is intended to send a City_Cars event array to your Facebook conversion pixel, containing the variables dealerCity and carType. Within Facebook, the variables dealercity_id and cartype_id will carry the data layer values in {{dealerCity}} and {{carType}}.

Note that you must define these variables in both the fbq call and the query string on the URL within the <noscript> section. The number contained in id=1111111111111111 will be equal to the ID number in your standard Facebook conversion pixel. Set ev= to the name of the event (City_Cars in our case), and then follow the format &cd[variable_id]={{GTM variable name}} for all values contained within the fbq array.
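A sketch of what that Custom HTML tag might contain. The fbq stub below is ours so the snippet runs standalone (on a live page, fbq is defined by the base pixel), and in GTM the hard-coded 'City 1' and 'Sedan' values would instead be the {{dealerCity}} and {{carType}} placeholders:

```javascript
// On a live page, fbq is defined by the Facebook base pixel code;
// stubbed here (hypothetical) so the sketch runs standalone.
var fbqCalls = [];
function fbq() { fbqCalls.push(Array.prototype.slice.call(arguments)); }

// Fire the custom event carrying the GTM data layer variables.
fbq('trackCustom', 'City_Cars', {
  dealercity_id: 'City 1',  // {{dealerCity}} in the real tag
  cartype_id: 'Sedan'       // {{carType}} in the real tag
});

// The <noscript> fallback follows the same pattern as a query string:
// <img src="https://www.facebook.com/tr?id=1111111111111111&ev=City_Cars
//   &cd[dealercity_id]={{dealerCity}}&cd[cartype_id]={{carType}}&noscript=1"/>
```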

Under Tag Sequencing, make sure that your Facebook Pixel All Pages tag is firing first. This is important, as your Facebook Pixel must be active before data can be passed through it.

Set Up Custom Audience in Facebook

Let’s now create the remarketing list for potential customers looking for Sedans in City 1.

Go to the Audiences section of your Ads Manager and click Create Audience. Under the Website Traffic dropdown, choose Custom Combination. Then, under the Include drop-down, choose Event, and enter City_Cars in the Choose an event field. Now you will be able to add the individual parameters dealercity_id and cartype_id and choose the values City 1 and Sedan, respectively.

Now you can use your new audience to target potential buyers who are interested in finding a Sedan in City 1. You can also set up an audience for each city and car type combination, and set up ads that target each unique pairing.



Let us solve your Facebook advertising problem.

Setting Up Enhanced Ecommerce Using Google Tag Manager and the Data Layer

Enhanced Ecommerce is hard. Like, frustratingly hard. The setup is complex, time-consuming, and downright confusing, and it doesn’t help that Google’s support documents for the setup process are scattered and incomplete. But here at Flint Analytics, we decided to take all this disjointed documentation and bundle it all together in one mega-guide to understanding the Enhanced Ecommerce setup from start to finish. This post is intended to help you understand both the big picture reasons and nitty-gritty, step-by-step details in implementing Enhanced Ecommerce using Google Tag Manager and the Data Layer. The entire setup requires a fundamental understanding of some key factors, thus the sections are broken down as follows:

  1. What is Enhanced Ecommerce?

Understanding the reporting capabilities of Enhanced Ecommerce and the business motivation behind it.

  2. Understanding Ecommerce Data Types

Enhanced Ecommerce recognizes four data types (impression data, product data, promotion data, and action data), and this section describes the application and capabilities of each data type.

  3. Defining Tracking Requirements

Using the motivations and understanding from sections 1 and 2, you’ll need to figure out the specific tracking needs of the ecommerce site.

  4. Setting up the Data Layer

Converting variables from an ecommerce database into a format readable by Google Tag Manager.

  5. Configuring Google Tag Manager

Tag Manager will be used to facilitate the passing of data from the Data Layer to Google Analytics. This will require defining tags and variables.

  6. Setting up Google Analytics

How to enable Enhanced Ecommerce in a View and how to set up the Checkout Funnel.


Section 1. What is Enhanced Ecommerce?

Enhanced Ecommerce allows sites to segment their shopping behavior in a more granular way than the standard ecommerce and conversion reports in Google Analytics. Per this Google Support article, Enhanced Ecommerce provides the following metrics and reports that standard ecommerce reporting does not.

Shopping Behavior Analysis

This report allows you to analyze general site behavior from product impressions to transactions. So say, for instance, you wanted to know conversion rates between the following steps:

  1. Session Starts
  2. Product Impressions/Views
  3. Adds to Cart
  4. Checkouts
  5. Completed Transactions

This type of granularity allows you to find high-level inefficiencies in your sales funnel and gives a basis for making improvements. Here is an example of a Shopping Behavior Analysis report.

Enhanced Ecommerce Shopping Behavior Analysis Example

Checkout Behavior Analysis

This report is very similar to the Shopping Behavior Analysis report, but instead it is broken down into your checkout funnel steps. So, for example, let’s say that when a user on your site initiates the checkout process and carries through to a transaction, they follow these steps:

  1. Login
  2. Customer Information
  3. Billing Information
  4. Review Order
  5. Confirmation

This report would allow you to identify choke points in your checkout process and make improvements accordingly.

Product Performance

These reports allow you to view stats related to specific products. So say your store sold three types of shirts: green, yellow, and red. Where the Shopping Behavior Analysis and Checkout Behavior Analysis reports would show aggregate information for all shirts, a Product Performance report would show information about green shirts, yellow shirts, or red shirts individually. From this report we can see ratios like the number of products added to cart and purchased compared to their total views, as well as product revenue, number of purchases, quantity sold, average price/quantity, number of refunds, impressions and product views, cart additions and removals, and whether or not the product was part of a list (to be discussed in detail later).

Sales Performance

This report shows all details related to ecommerce revenue, including transaction details, taxes, shipping, refunds, and quantity sold.

Product List Performance

Going back to our shirt example, let’s say the green, yellow, and red shirts are part of your “Stoplight Collection” (branding genius, I know). From an Enhanced Ecommerce perspective, these shirts would belong to the Stoplight Collection List. From this report, you’ll be able to break down the views and clicks through to a specific product from a list.

Internal Promotion

If you have any marketing materials on your site, say for instance an advertisement for a sale, you can categorize user behavior by the number of impressions and clicks on promotions.

Order Coupon and Product Coupon

This allows you to segment data based on whether or not a coupon was used at either the product or order level.

Affiliate Code

If you have any affiliates, you can segment revenue based on their performance.


Section 2. Understanding Enhanced Ecommerce Data Types

Enhanced Ecommerce reports are dependent on a consistent and well-structured data stream. This is accomplished through implementing a richer Data Layer structure on the site that sends information in the form of impression data, product data, promotion data, and action data.

Impression Data

Say you want to know the conversion rate for a single product, from start to finish. For example, when somebody loads the homepage of your ecommerce site, they see a list of products: 3 shirts, 3 books, and 3 toys, summing to 9 total product impressions on that homepage. Each of those products may be shown on other pages within the site as well, as the customer continues browsing. Let’s say one of our shirts is shown on 3 different pages during each of 10 different sessions. That would be 30 impressions of the same shirt across all 10 sessions. Let’s also say that during those 10 sessions, the shirt was purchased 1 time. So, the conversion rate for that shirt with respect to number of impressions would be 1/30 = 3.33%.

Each shirt, book, and toy is different from the others, and carries with it some specific information on brand, category, price, etc. In the Data Layer, you can define an impressionFieldObject, which is an array containing all of the information about a specific product that is being viewed. So now let’s say you wanted to know the category conversion rate, where in this case, the categories are shirts, books, and toys. Let’s assume all 3 shirts are shown on all 3 pages during each of the 10 sessions, totaling 90 shirt impressions. Let’s also assume that 5 shirts were purchased in the 10 sessions. The shirts conversion rate with respect to impressions is now 5/90 ≈ 5.56%.

So, the general motivation for tracking product impressions is to allow ecommerce sites to segment their various products by different parameters for understanding conversion behavior. The following table lists and describes the variables that Google recognizes automatically in the impressionFieldObject array.

Enhanced Ecommerce impressionFieldObject Variables

Product Data

Let’s go back to our original shirt example, where the same shirt was shown on 3 different pages during 10 sessions. Let’s say each impression of this shirt contains a link that directs to another page containing more information, or details, about the shirt itself. Of those 30 impressions, 3 of the 10 users click on the shirt during their session. That would mean that 3/30 = 10% of the product impressions result in the user clicking through to view more details about the product. This information can be used to understand what images/copy users respond to on the site, which products garner more interest, etc.

The productFieldObject array contains information specific to the product being viewed, and the following table contains variables that are automatically recognized by Google in this array.

Enhanced Ecommerce productFieldObject Variables

Promotion Data

Say an ecommerce site is advertising a promotion on certain pages, such as an ad for a sale. You might want to know if the promotion has any effect on purchase volume and revenue, so you’ll want to know conversion rates relative to promotion impressions. The effect of specific promotions can be segmented using the variables in a promoFieldObject, which is described in the table below.

Enhanced Ecommerce promoFieldObject Variables

Action Data

Within an ecommerce site, several ecommerce-related actions can occur. Returning to our shirt example, let’s say you are currently on the product detail page for a specific shirt and you decide that you like the shirt enough to buy it. The first thing you’ll do is add that shirt to your cart. Then maybe that action will take you to the first step of the checkout process. Maybe you had previously added another shirt to the cart, but you’d rather have the new one, so you remove the original shirt from your cart. Now that you’ve got your cart in order, you finish the checkout process and make a purchase.

Each of these items is an action, and the following table lists and describes all of the ecommerce actions that Google recognizes automatically.

Enhanced Ecommerce Actions

Each of these actions can carry action-specific data as well, which is defined by the actionFieldObject. On a purchase, for example, you would want to know total revenue of the transaction as well as the tax and shipping on the items sold. For an item being added to a cart, you might want to know which list it originated from (example, tickets for a specific Cincinnati Reds game might have originated from a list of upcoming Cincinnati Reds games).

Here are the specific variables recognized automatically by Google in the actionFieldObject.

Enhanced Ecommerce actionFieldObject Variables


Section 3. Defining Tracking Requirements

This portion of the process requires a fundamental understanding of the ecommerce site’s business model, Google’s predefined data types, and the types of reporting questions you may have. Let’s use Google’s fictitious Demo Store, Ibby’s T-Shirt Shop as an example for determining tracking requirements.

Ibby’s T-Shirt Shop Business Model

Ibby sells t-shirts. Those t-shirts have different colors, sizes, brands, prices, etc. When somebody goes to the site’s homepage, they are greeted with a list of different t-shirts (Compton, Comverges, Flexigen, etc). The user can choose which shirts to view and, potentially, purchase. When somebody makes a purchase, Ibby’s revenue is the price of the t-shirts sold multiplied by the quantity, plus tax and shipping costs.

It’s a pretty simple financial model, but how does Ibby know how to optimize the flow of her site to result in more purchases or higher revenue? That’s where defining the tracking requirements comes in.

Ibby’s Tracking Requirements

Using the lingo from Section 2, as well as Ibby’s website, let’s translate the entire sales process into Enhanced Ecommerce-speak. In the next section we will discuss the technical process for passing this data, but for now we just want to provide a framework for understanding what’s important from a business standpoint.

Product Impressions

In order to properly track the sales cycle of a product from start to finish, we want to track details specific to each impression of a product. On the homepage of Ibby’s site, note that each shirt can be uniquely identified through text (name, price, and ID) or picture (blank yellow shirt vs pink shirt with brick wall). Presumably, your product database will contain all of the information unique to each product, and you will want to pass this to Google Analytics for easier analysis. Refer back to the impressionFieldObject array described earlier: while you are required to track at least the product id or name, you should also track any of the following that you have available: list (e.g. homepage product list, suggested items list), brand, category (e.g. t-shirts, books, toys), variant (e.g. yellow, red), position (relative to other items in the list), and price.

Product Clicks and Product Detail Impressions

While product impressions give users a preview of the t-shirts Ibby offers, product detail impressions give users more information and the option to add a specific shirt to their cart. Say you’re interested in the Flexigen T-Shirt, so you click (remember this is a defined action type) through to its product detail page. Most of the variables in the productFieldObject are the same as the impressionFieldObject, and with good reason: we want to have consistency when reporting the number of users who see a product on the site (product impression) and then choose to learn more about it (product detail impression).

Add Product to Cart

You’re on the product detail page and you’ve decided you’re going to buy the shirt. You choose the trim color and the size and then click the ‘Add To Cart’ button. Ibby may want to know the most popular color and size for the Flexigen T-Shirt, so she would define these in the productFieldObject for the add to cart action. As we saw before, variant is a predefined type, so we may assign trim color to that field. Since we can only use variant once, we’ll also need a way to track the size, which we can do with a custom dimension in the array (we will discuss the specifics of this implementation later, but it is important to note that almost anything can be tracked, even variables outside the scope of those that Google recognizes automatically).


Checkout Process

Ibby’s checkout process involves 5 steps:

  1. Start Checkout
  2. Customer Information
  3. Billing Information
  4. Review Cart
  5. Confirmation

In each of these steps, we will carry through the original product detail array that was populated during the add to cart action, and we will also track events that lead to cart abandonment.

Note that on step 3, we have the option as a customer to select Visa, Mastercard, or AmEx as our preferred method of payment. This will be defined as a checkout option, another custom item that can be tracked for analysis purposes.


Purchase

This is the obvious goal for an ecommerce business, and we want to make sure that we are actively tracking the entire product detail array through to the purchase so that we can make optimizations and accurate reports about the entire sales process.

Other Variables and Actions That May Occur

Remember that somebody can usually remove an item from their cart, view or click on a promotion, request a refund, leave the checkout to continue shopping and return, etc. Review the data and action types from section 2 to determine what other variables and actions you may want to track.


Section 4. Setting Up the Data Layer

In short, the Data Layer is a JavaScript array that contains data readable by Google Tag Manager. If you are not familiar with the Data Layer, read this Google article first before proceeding.

How It Works

Via Simo Ahava, Enhanced Ecommerce via the Data Layer works in 4 simple steps:

Simo Ahava Explains How Enhanced Ecommerce Works

So essentially, the Data Layer takes data from the site’s platform, CMS, or database and converts it into a format readable by Google Tag Manager and then Google Analytics. This is important to note, as this particular piece of the puzzle will require the most communication with the development team.

Data Layer Code for Each Type of Action and View

We will use Ibby’s T-Shirt Shop as the example for each type of implementation, providing page-specific links for each step, so you can follow along with a running example. You can click the info icon on each page to view the Data Layer code being passed for specific actions and views.

At the end of each section, I will also provide Google’s more generalized code example for each type of implementation, via this Google support article. By comparing the examples from a live site (Ibby’s T-Shirt Shop) and Google’s generalized examples, you should be able to piece together everything you’ll need to properly pass information to the Data Layer.

Measuring Product Impressions

On Ibby’s homepage, click the info icon in the gray header. You’ll notice a dataLayer.push() containing an array of impressions. Each of these individual impressions contains an impressionFieldObject array with id, name, price, brand, category, position, and list. Scroll further and you’ll see that the value for list changes from “homepage” to “shirts you may like”, indicating that the product impressions on this page belong to two different lists. This is a great example of how to pass impression data to the Data Layer.

Via Google, you can measure product impressions by pushing an impression action and the impressionFieldObject arrays to the Data Layer:
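Following Google’s documented format, that push might look like this (the SKU values and prices are made up for illustration):

```javascript
// dataLayer is normally created by the GTM container snippet;
// initialized here so this sketch runs standalone.
var dataLayer = [];

// One impressionFieldObject per product shown; id or name is required.
dataLayer.push({
  'ecommerce': {
    'currencyCode': 'USD',
    'impressions': [
      {
        'id': 'SKU-100',              // hypothetical product ID
        'name': 'Compton T-Shirt',
        'price': '14.99',
        'brand': 'Ibby',
        'category': 'T-Shirts',
        'variant': 'Yellow',
        'position': 1,
        'list': 'homepage'
      },
      {
        'id': 'SKU-101',
        'name': 'Flexigen T-Shirt',
        'price': '12.99',
        'brand': 'Ibby',
        'category': 'T-Shirts',
        'position': 1,
        'list': 'shirts you may like'  // same page, different list
      }
    ]
  }
});
```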

Measuring Promotion Impressions

Return to Ibby’s homepage and click the info icon in the header again. Scroll past the list of impressions to the end of the code and find the block starting with “promoView”. This registers an impression of the “Back To School” right-rail ad on Ibby’s homepage. Notice that it is included with the long list of impressions, so within this code segment we have:

  1. Impressions of shirts in the “homepage” list
  2. Impressions of shirts in the “shirts you may like” list
  3. Impressions of a promotion

Via Google, set the promoView key in the ecommerce data layer var to a promoFieldObject. Here is Google’s generalized example:

Measuring Promotion Clicks

Staying on Ibby’s homepage, click on the info icon within the “Back to School” promotion. This code creates an event to register a click on a promotion.

Via Google, push the promoClick action and the promoFieldObject array to the Data Layer. Here is Google’s more generalized example:
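A sketch of that push; the promotionClick event key is what a GTM trigger will later listen for (promotion values are hypothetical):

```javascript
// dataLayer is normally created by the GTM container snippet;
// initialized here so this sketch runs standalone.
var dataLayer = [];

dataLayer.push({
  'event': 'promotionClick',
  'ecommerce': {
    'promoClick': {
      'promotions': [{
        'id': 'BTS_2016',                // hypothetical promotion ID
        'name': 'Back To School',
        'creative': 'right_rail_banner',
        'position': 'right_rail'
      }]
    }
  }
});
```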

Measuring Product Clicks

Staying on Ibby’s homepage, click one of the info icons in the upper right-hand corner of a shirt. When somebody clicks through to a shirt from a list of shirts, we want to register a click event. Notice now that, instead of an impressionFieldObject array, we are now switching to the productFieldObject array. This array contains a lot of the same data, but differentiates itself from a general high-level impression by declaring “products”.

Via Google, push the click action to the Data Layer along with a productFieldObject array containing information about the product that was clicked. Here is a more generalized example:
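A sketch along those lines (SKU and product values are hypothetical):

```javascript
// dataLayer is normally created by the GTM container snippet;
// initialized here so this sketch runs standalone.
var dataLayer = [];

dataLayer.push({
  'event': 'productClick',
  'ecommerce': {
    'click': {
      'actionField': {'list': 'homepage'},  // list the product was clicked from
      'products': [{
        'id': 'SKU-100',                    // hypothetical product ID
        'name': 'Compton T-Shirt',
        'price': '14.99',
        'brand': 'Ibby',
        'category': 'T-Shirts',
        'position': 1
      }]
    }
  }
});
```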

Measuring View of Product Details

I decided to click through to the “Compton T-Shirt” detail page on Ibby’s site so that I could learn more about the shirt. Click on the info icon to the upper right of the picture of the yellow shirt to see everything Ibby is passing to the Data Layer on this page. There are two things here that we’ve already seen: Product Impressions and Promotion Impressions. But in between these two lists in the code is a block that starts with “detail”: this is to signify a view of a product detail page. Notice that, similarly to the Product Click, we are also filling the productFieldObject array with data specific to the shirt.

Generally, we will measure this product detail view by pushing a detail action and the productFieldObject to the data layer.
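In sketch form, assuming the same hypothetical product values as above (no event key is needed here, since the tag can fire on the pageview itself):

```javascript
// dataLayer is normally created by the GTM container snippet;
// initialized here so this sketch runs standalone.
var dataLayer = [];

dataLayer.push({
  'ecommerce': {
    'detail': {
      'actionField': {'list': 'homepage'},  // optional: list the user came from
      'products': [{
        'id': 'SKU-100',                    // hypothetical product ID
        'name': 'Compton T-Shirt',
        'price': '14.99',
        'brand': 'Ibby',
        'category': 'T-Shirts'
      }]
    }
  }
});
```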

Adding a Product to a Shopping Cart

Remaining on the “Compton T-Shirt” detail page, click the info icon next to the green “Add To Cart” button. This code registers a click event specific to adding an item to your cart.

Via Google, measure an addition by passing add to the actionFieldObject and then passing the productFieldObject array as well.
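A sketch of that push, with the addToCart event key that a GTM trigger will listen for (product values are hypothetical):

```javascript
// dataLayer is normally created by the GTM container snippet;
// initialized here so this sketch runs standalone.
var dataLayer = [];

dataLayer.push({
  'event': 'addToCart',
  'ecommerce': {
    'currencyCode': 'USD',
    'add': {
      'products': [{
        'id': 'SKU-100',           // hypothetical product ID
        'name': 'Compton T-Shirt',
        'price': '14.99',
        'variant': 'Yellow',
        'quantity': 1
      }]
    }
  }
});
```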

Removing a Product from a Shopping Cart

Similarly, click the info icon next to the red “Remove From Cart” button on the “Compton T-Shirt” detail page. The only difference between this and the “Add To Cart” code is that we’re using the removeFromCart event and passing remove to the actionFieldObject.
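The matching sketch for a removal (same hypothetical product values):

```javascript
// dataLayer is normally created by the GTM container snippet;
// initialized here so this sketch runs standalone.
var dataLayer = [];

dataLayer.push({
  'event': 'removeFromCart',
  'ecommerce': {
    'remove': {
      'products': [{
        'id': 'SKU-100',           // hypothetical product ID
        'name': 'Compton T-Shirt',
        'price': '14.99',
        'quantity': 1
      }]
    }
  }
});
```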

Measuring Checkout Steps

Click the green “Checkout” button in the right-rail on any page of Ibby’s site. This will take you to a checkout process defined by the following steps:

  1. Start Checkout
  2. Customer Information
  3. Billing Information
  4. Review Cart
  5. Confirmation

By clicking the info icon in the ribbon for each of these tabs, you’ll see that the main difference between each step is the “step” declaration in the actionField.

Pass the checkout action as well as the step and productFieldObject arrays to the Data Layer. Here is a generalized example of step 1. Note the “option: Visa” in the actionField; we will discuss checkout options in more detail in the next section.
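A sketch of a step-1 push (product values are hypothetical):

```javascript
// dataLayer is normally created by the GTM container snippet;
// initialized here so this sketch runs standalone.
var dataLayer = [];

dataLayer.push({
  'event': 'checkout',
  'ecommerce': {
    'checkout': {
      'actionField': {'step': 1, 'option': 'Visa'},
      'products': [{
        'id': 'SKU-100',           // hypothetical product ID
        'name': 'Compton T-Shirt',
        'price': '14.99',
        'quantity': 1
      }]
    }
  }
});
```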

Measuring Checkout Options

Navigate to the Billing Information tab in Ibby’s checkout process and scroll down to the bottom of the form where you can choose your credit card option. Choose one of these options then click the info icon in the blue “Next” button. Assuming that a user cannot move on in the checkout process without declaring their form of payment, we’ll register a checkoutOption event when the user clicks to move on to the next step. Notice that the actionField contains both the step and the option.

Here is a more generalized example from Google:
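As a sketch, fired when the user clicks Next after choosing a payment method (the step number and option value are examples):

```javascript
// dataLayer is normally created by the GTM container snippet;
// initialized here so this sketch runs standalone.
var dataLayer = [];

dataLayer.push({
  'event': 'checkoutOption',
  'ecommerce': {
    'checkout_option': {
      'actionField': {'step': 3, 'option': 'Visa'}
    }
  }
});
```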

Measuring Purchases

Navigate to Ibby’s Review Cart tab and click the info icon next to the green “Purchase” button. Here Ibby passes the purchase action as well as a transaction event, which will be set up later to fire an enhanced ecommerce-enabled tag. Note that a transaction occurs on Ibby’s site when somebody clicks the “Purchase” button, thus requiring the transaction event. But, if Ibby’s site was configured such that a purchase confirmation was dependent on a pageview (maybe a thank you page), Ibby could do away with the transaction event and instead configure her GTM tag to trigger on a page view.

Ibby also passes the transaction ID (or id) and the productFieldObject arrays for all products being purchased. Here is Google’s generalized example:
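A sketch of a purchase push, using the transaction event described above (the transaction ID, amounts, and product values are hypothetical):

```javascript
// dataLayer is normally created by the GTM container snippet;
// initialized here so this sketch runs standalone.
var dataLayer = [];

dataLayer.push({
  'event': 'transaction',
  'ecommerce': {
    'purchase': {
      'actionField': {
        'id': 'T12345',            // transaction ID (required)
        'affiliation': 'Online Store',
        'revenue': '20.48',        // total incl. tax and shipping
        'tax': '1.50',
        'shipping': '3.99'
      },
      'products': [{
        'id': 'SKU-100',           // hypothetical product ID
        'name': 'Compton T-Shirt',
        'price': '14.99',
        'quantity': 1
      }]
    }
  }
});
```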

Measuring Refunds

Add any item to the cart in Ibby’s store and complete a checkout. On the confirmation page, you will see a red “Get a Refund!” button, click it. The resulting popup will have two more buttons: a red “Refund Full Cart” option and a yellow “Refund Selected Item(s)” button. Both of these are very similar, differing only in the addition of a productFieldObject on partial refunds to denote the specific products from the transaction that are to be refunded (as opposed to refunding every item being purchased).

Here is Google’s generalized example of a full refund:
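In sketch form, a full refund only needs the transaction ID (hypothetical here):

```javascript
// dataLayer is normally created by the GTM container snippet;
// initialized here so this sketch runs standalone.
var dataLayer = [];

dataLayer.push({
  'ecommerce': {
    'refund': {
      'actionField': {'id': 'T12345'}  // refund the entire transaction
    }
  }
});
```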

And here is Google’s generalized example of a partial refund:
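And in sketch form, a partial refund adds the productFieldObject to limit the refund to specific items (IDs are hypothetical):

```javascript
// dataLayer is normally created by the GTM container snippet;
// initialized here so this sketch runs standalone.
var dataLayer = [];

dataLayer.push({
  'ecommerce': {
    'refund': {
      'actionField': {'id': 'T12345'},  // hypothetical transaction ID
      'products': [{
        'id': 'SKU-100',                // product and quantity to refund
        'quantity': 1
      }]
    }
  }
});
```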

Passing Product-Scoped Custom Dimensions

Return to the “Compton T-Shirt” detail page and click the info icon next to the green “Add To Cart” button. Notice within the productFieldObject array that there is a non-standard variable dimension1 whose value is M. As mentioned before, there are a limited number of variables that Google recognizes automatically for enhanced ecommerce reports, but you can create product-scoped Custom Dimensions and Custom Metrics that will carry through in the productFieldObject array. Since the standard variant variable is already taking the color value, Ibby has used dimension1 to pass the size of the shirt through with the order.

If you are unfamiliar with the general concept of custom dimensions & metrics, read this Google Support article.

Here is Google’s generalized example of a product-scoped custom dimension, this example within a purchase action. Note that you can use these dimensions anywhere within the productFieldObject array.
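A sketch of how Ibby’s dimension1 (size) rides along in the productFieldObject of a purchase (the IDs, price, and revenue are hypothetical):

```javascript
// dataLayer is normally created by the GTM container snippet;
// initialized here so this sketch runs standalone.
var dataLayer = [];

dataLayer.push({
  'event': 'transaction',
  'ecommerce': {
    'purchase': {
      'actionField': {'id': 'T12345', 'revenue': '14.99'},
      'products': [{
        'id': 'SKU-100',           // hypothetical product ID
        'name': 'Compton T-Shirt',
        'price': '14.99',
        'quantity': 1,
        'variant': 'Yellow',       // standard variant slot holds the color
        'dimension1': 'M'          // product-scoped custom dimension: size
      }]
    }
  }
});
```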


Section 5. Configuring Google Tag Manager

For each of the different Data Layer implementations above, you will need to create tags in Google Tag Manager that will send this information from the Data Layer to Google Analytics. I will outline the tag setup for each type here. You can name your tags after the title of each section.

Also note that these setups should be correct in most situations, but variations may arise depending on page structure, event setups in the Data Layer, and the way your CMS loads. If you need to make changes, start with conditions at the Track Type level and work from there.

Product Impression

Tag type : Universal Analytics
Track type : Pageview
Enable Enhanced Ecommerce Features: true
Use Data Layer: true
More settings > Fields to Set: select the field name page and set its value to {{page path}}
Trigger: event equals gtm.dom

Promotion Impression

Tag type : Universal Analytics
Track type : Pageview
Enable Enhanced Ecommerce Features: true
Use Data Layer: true
More settings > Fields to Set: select the field name page and set its value to {{page path}}
Trigger: event equals gtm.dom

Promotion Click

Tag type : Universal Analytics
Track type : Event
Event Category: Ecommerce
Event Action: Promotion Click
Enable Enhanced Ecommerce Features: true
Use Data Layer: true
More settings > Fields to Set: select the field name page and set its value to {{page path}}
Trigger: event equals promotionClick

Product Click

Tag type : Universal Analytics
Track type : Event
Event Category: Ecommerce
Event Action: Product Click
Enable Enhanced Ecommerce Features: true
Use Data Layer: true
More settings > Fields to Set: select the field name page and set its value to {{page path}}
Trigger: event equals productClick

Product Detail Impression

Tag type : Universal Analytics
Track type : Pageview
Enable Enhanced Ecommerce Features: true
Use Data Layer: true
More settings > Fields to Set: select the field name page and set its value to {{page path}}
Trigger: event equals gtm.dom

Add To Cart

Tag type : Universal Analytics
Track type : Event
Event Category: Ecommerce
Event Action: Add to Cart
Enable Enhanced Ecommerce Features: true
Use Data Layer: true
More settings > Fields to Set: select the field name page and set its value to {{page path}}
Trigger: event equals addToCart

Remove From Cart

Tag type : Universal Analytics
Track type : Event
Event Category: Ecommerce
Event Action: Remove from Cart
Enable Enhanced Ecommerce Features: true
Use Data Layer: true
More settings > Fields to Set: select the field name page and set its value to {{page path}}
Trigger: event equals removeFromCart


Checkout

Tag type : Universal Analytics
Track type : Event
Event Category: Ecommerce
Event Action: Checkout
Enable Enhanced Ecommerce Features: true
Use Data Layer: true
More settings > Fields to Set: select the field name page and set its value to {{page path}}
Trigger: event equals checkout

Checkout Option

Tag type : Universal Analytics
Track type : Event
Event Category: Ecommerce
Event Action: Checkout Option
Enable Enhanced Ecommerce Features: true
Use Data Layer: true
More settings > Fields to Set: select the field name page and set its value to {{page path}}
Trigger: event equals checkoutOption

Purchase (Pageview)

Tag type : Universal Analytics
Track type : Pageview
Enable Enhanced Ecommerce Features: true
Use Data Layer: true
More settings > Fields to Set: select the field name page and set its value to {{page path}}
Trigger: event equals gtm.dom

Purchase (Transaction Event)

Tag type : Universal Analytics
Track type : Event
Event Category: Ecommerce
Event Action: Transaction
Enable Enhanced Ecommerce Features: true
Use Data Layer: true
More settings > Fields to Set: select the field name page and set its value to {{page path}}
Trigger: event equals transaction


Refund

Tag type : Universal Analytics
Track type : Pageview
Enable Enhanced Ecommerce Features: true
Use Data Layer: true
More settings > Fields to Set: select the field name page and set its value to {{page path}}
Trigger: event equals gtm.dom

Passing Product-Scoped Custom Dimensions and Metrics

Rather than setting up a tag, you will need to define all custom dimensions as variables in GTM, and then declare those within your Universal Analytics tag (you will also need to set these up within Google Analytics, but we will discuss that in the next section).

Ibby placed custom dimension dimension1 in her productFieldObject array to capture the size options (S, M, L) that a user chose before adding to cart. This dimension was persistent in the array through the entirety of the checkout process so that the size of the shirt could be attributed to the purchase. In order to pass this to GA with GTM, you first need to declare a Data Layer Variable named Size with Data Layer Variable Name = dimension1. Save this and go to the More settings dropdown in your Universal Analytics tag. Click on the Custom Dimensions dropdown, click Add Custom Dimension, set the Index equal to the dimension number (in this case 1) and the Dimension Value equal to the Size variable you just created.


Section 6. Setting up Google Analytics for Enhanced Ecommerce

Assuming that your Data Layer and GTM tags are set up and working properly, we can now set up Enhanced Ecommerce in our Google Analytics view.

Turning on Enhanced Ecommerce

Go to the Admin tab in Google Analytics for your selected account. Under View, you should see the Ecommerce Settings option. Click on it, then flip the switch for Enable Ecommerce Settings. This will bring up an optional Checkout Labeling widget, which you should use for better shopping cart behavior reporting. All you need to do is add and name funnel steps according to your checkout process. When you are done, click Submit.

Custom Dimensions and Metrics

Go back to your Admin tab and find Custom Definitions. In our example for Ibby’s T-Shirt Shop, we set up dimension1 to pass the Size variable. Click on Custom Dimensions and the red New Custom Dimension button. Name the custom dimension Size and choose Product in the “scope” dropdown (the scope of a dimension is a whole ‘nother animal, but you should read about it here if you plan to enable any session, hit, user, or product-scoped dimensions in your Enhanced Ecommerce).



Let us solve your analytics problem.

3 Reasons Analytics Dashboards Are So Important for Multi-Location Businesses

(And any businesses for that matter.)

Unless you are in a self-driving Google car, driving down the road without a dashboard is going to cause a lot of problems. If something goes wrong you have no warning, and if you’re speeding you have no idea until the cop gives you a ticket. The same thing happens when you have no dashboards for your business. Just as you would never operate one car without a dashboard, you certainly wouldn’t want to operate 100 cars at once without one, and that is effectively the situation for many multi-location businesses with multiple websites.

So what are the most important reasons to use analytics dashboards to run your multi-location business marketing campaigns?

1. Help identify errors in your data

We have often seen customers monitoring several websites at once, whether on a purpose-built multi-location CMS or on another platform with several very different web properties for different parts of the business. It can be hard to keep up with the tracking differences between each site, or even each app. With so many moving pieces, such as events, confirmation-page changes, custom dimensions, and goals, it is easy for tracking to fall through the cracks. And when bonuses and marketing strategy are set based on these metrics, bad data can be painful, both personally and for the company.

Dashboards help solve this problem by putting every site’s key metrics in one view. If the development team makes a change without telling you (say, removing event tags on one site), or if the marketing team forgets to tell the analytics team about a new off-nav landing page built on a third-party tool that now receives all the traffic, you can figure out what is wrong before you lose too much data. The important part is catching these issues immediately. You do that by assigning alerts and tolerances on the dashboard, so you know when something looks too good to be true, or far worse than it normally would.
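As a sketch of the alerts-and-tolerances idea, here is what a simple tolerance check might look like once your dashboard tool can hand you a daily metric per site. The site names, metric values, baselines, and tolerance here are all hypothetical:

```javascript
// Flag values that deviate from the baseline by more than the tolerance,
// in either direction: "too good to be true" or unexpectedly bad.
function checkTolerance(site, metric, todayValue, baseline, tolerancePct) {
  var deviation = Math.abs(todayValue - baseline) / baseline;
  if (deviation > tolerancePct) {
    return site + ': ' + metric + ' is off baseline by ' +
      Math.round(deviation * 100) + '% - check tracking';
  }
  return null; // within tolerance, no alert
}

// Hypothetical scenario: event tags were removed on one site,
// so its daily event count collapses versus a ~480/day baseline.
var alert1 = checkTolerance('tampa-site', 'events', 12, 480, 0.30);
var alert2 = checkTolerance('sanjose-site', 'events', 450, 480, 0.30);
```

Here alert1 fires (events are down roughly 98% from baseline) while alert2 stays quiet, which is exactly the "catch it first thing" behavior you want from the dashboard.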

2. Quickly spot problem areas or locations



Of course, with franchises and other multi-location businesses you really need to check the performance of each individual location. Is a location performing unusually well, or unusually poorly, this month or quarter? Identify it early by setting up goal pacing for each site, so you can act before it gets too late in the period for any fix to matter. Using pacing charts to spot problem areas leads to the next important reason to use dashboards.
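Goal pacing itself is simple arithmetic. A sketch, assuming a linear pace toward a monthly goal (the goal and lead counts are hypothetical):

```javascript
// Compare actual progress to the straight-line pace needed to hit the goal.
function pacingStatus(goal, actualToDate, dayOfPeriod, daysInPeriod) {
  var expectedToDate = goal * (dayOfPeriod / daysInPeriod); // linear pace
  var paceIndex = actualToDate / expectedToDate;            // 1.0 = on pace
  if (paceIndex >= 1) return 'on pace';
  if (paceIndex >= 0.9) return 'slightly behind';
  return 'behind - act now';
}

// Day 10 of a 30-day month, goal of 300 leads (expected so far: 100):
pacingStatus(300, 110, 10, 30); // 'on pace'
pacingStatus(300, 60, 10, 30);  // 'behind - act now'
```

A pacing chart is just this calculation plotted per location per day, so a location that falls behind shows up while there is still time in the period to respond.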

3. Gives you action steps if set up properly

When set up properly, marketing analytics dashboards will not only give the C-suite pretty reports, but will also help everyday managers. They give you action steps for each month or quarter, because you can see which problem areas need the most work. And if you have chosen the right metrics, you will know exactly what to do for each location, whether that is adjusting ad spend, editing landing pages, or refocusing ad targeting. If you don’t build your dashboard with a purpose, it won’t surface the things you need to make decisions.

What is the right way to set up a dashboard?

One word: planning. Planning the dashboard is the most important step. We are going to write more about developing your dashboard strategy in the future, but for now remember a few simple rules:

  • Start with your business objectives
  • Identify your KPIs
  • Set goals
  • Set tolerances to ensure you are going to hit your KPI goals

If you don’t have a dashboard: Get started!

If you don’t have dashboards keeping constant track of all your locations, it is time to get started. There are plenty of great tools out there right now, so research them: Tableau, Supermetrics, Looker, and others can all make it work. Just get started, and talk to a professional if you really want to make your dashboards useful.



How to Track Landing Page Templates Across Multiple Websites Using Crazy Egg and Wildcards


Let’s say you run a multi-location business with several locally-optimized websites, one per city, all with similar layouts and URL structures. Say you also have a set of landing pages that are nearly identical across all of those sites. If the same landing page template is replicated across several different websites, how do you gather data that lets you optimize the user experience for all of those sites and landing pages at once?

We at Flint Analytics faced this exact issue with one of our clients. Similar to the example above, the client had localized websites for each of their major markets and a slew of landing page variations for each product. The objective was to understand how users interacted with each landing page, and we used Crazy Egg to do this. The challenge was getting Crazy Egg to aggregate this information across multiple domains (Tampa, San Jose, etc) so that we could track each landing page type as a template and make UX changes across the entire program. Luckily, Crazy Egg has a wildcard option that allowed us to do this.

Crazy Egg Wildcards and Hostnames

Returning to our example company, let’s say we wanted to create a wildcard pattern that captured the homepage of every domain we own. Replacing each city with an asterisk (*) gives the pattern example*.com, which captures all of our location-based sites.

Let’s say we also wanted to track the page path product-experience-1 across all of our different domains, i.e. that same path on every location’s site.

Here are the steps for setting up a Product Experience 1 Snapshot across multiple domains in Crazy Egg:

1. New Snapshot – On your Crazy Egg homepage, click +Add new and then Snapshot.


2. Snapshot URL – Any of our URLs above will do. The only thing that matters is that all of the URLs you plan to track share a similar layout.


3. Snapshot Name and Device – It’s important to keep desktop, tablet, and mobile snapshots separated from each other. Here you can choose a naming convention that allows you to easily identify the Product Experience 1 snapshot for your chosen device.


4. Wildcard Option – Click the Advanced Options dropdown and then check Use a Wildcard.


5. Wildcard Pattern – Clicking Next takes you to a page that will allow you to enter your wildcard pattern. For our example, our variable is the city name and our constant is the page path, so our wildcard pattern would be example*.com/product-experience-1/.


Since we entered a single hostname during setup, the screenshot will show just that page. But because all of our different hostnames use the product-experience-1 landing page, and assuming the formatting is the same across all of them, clicks and scrolls will aggregate onto that one snapshot, letting us make whole-program optimizations to the user experience.

Note for AdWords users: If you are sending AdWords traffic to a landing page and you are using auto-tagging, make sure to add “?*” to the end of your wildcard pattern to account for any tags Google adds (i.e. example*.com/product-experience-1?*).
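To see how these wildcard patterns behave, here is a rough sketch in JavaScript of the matching logic, assuming each asterisk matches any run of characters (an approximation of Crazy Egg’s behavior; the location domain names are hypothetical):

```javascript
// Convert a wildcard pattern into a regular expression:
// escape regex metacharacters, then turn each '*' into '.*'.
function wildcardToRegExp(pattern) {
  var escaped = pattern.replace(/[.+?^${}()|[\]\\]/g, '\\$&');
  return new RegExp('^' + escaped.replace(/\*/g, '.*') + '$');
}

// The auto-tagging pattern from the AdWords note above:
var re = wildcardToRegExp('example*.com/product-experience-1?*');

re.test('exampletampa.com/product-experience-1?gclid=abc123');   // matches
re.test('examplesanjose.com/product-experience-1?gclid=xyz789'); // matches
re.test('exampletampa.com/some-other-page?gclid=abc123');        // does not match
```

The first asterisk absorbs the city name and the trailing `?*` absorbs whatever gclid parameter Google appends, which is why the same Snapshot can aggregate the template across every domain.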

