
TAMU Webmaster's Blog

Information and insight from the A&M Webmasters


Blocking Triggers Refined

In my last post about blocking triggers I showed how you can use exceptions to prevent Google Tag Manager from firing particular tags.  Because the HotJar tag contains the service ID value for each site, we had to create individual tags/triggers for every site.

But what about when you have a more generic tag that is not linked to a particular service ID?  For example, you want to include an HTML or JavaScript snippet, but you still want to control which sites it is or is not displayed on.  You could create individual tags/triggers for each site as we did with HotJar, but there is actually a better solution.

The process starts the same way, by adding a variable to your GTM container.  This time, though, we will use the “Custom JavaScript” variable type instead of the “Constant.”  The JavaScript places the URLs of all the sites we want to exclude into an array, then loops through the array and tests whether the page being viewed is in it.  If it is, the function returns “true.”
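As a sketch, such a Custom JavaScript variable might look like the following. GTM expects an anonymous function here; it is named below only so the logic reads clearly, and the hostnames in the array are hypothetical placeholders you would replace with your own sites.

```javascript
// Sketch of a "Block Sites" Custom JavaScript variable for Google Tag Manager.
// In GTM this would be an anonymous function; named here only for illustration.
function blockSites() {
  // Hypothetical hostnames to exclude -- substitute your own sites.
  var excluded = ['calendar.tamu.edu', 'maps.tamu.edu'];
  var host = document.location.hostname;
  for (var i = 0; i < excluded.length; i++) {
    if (host === excluded[i]) {
      return true; // the current page is on an excluded site
    }
  }
  return false;
}
```

A trigger whose condition is that this variable equals true, added to the tag as an exception, then keeps the tag from firing on any site listed in the array.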

At that point we can continue with the same process we used for the HotJar tags.  Add a trigger that fires when the variable returns “true” and then make that trigger an exception.  Now the tag is excluded from firing only on the sites you put in your JavaScript array.  To add or remove sites, just update the array in the variable and republish your tag workspace.

There are several other variable types. I have barely scratched the surface.  DOM Element looks particularly interesting.  This or another combination might actually solve the problem of needing duplicate tags for our HotJar implementation.  I will let you know if we ever get that far.


Wednesday, August 23rd, 2017 Analytics

Blocking Triggers in Google Tag Manager

In looking at how to optimize our sites’ download times, one common recommendation is to combine all JavaScript into a single file.  An even better piece of advice: if there is JavaScript that your site doesn’t need, don’t download it in the first place.

One of the areas we identified as bloated was how we ran our HotJar heat-map implementation through Tag Manager.  HotJar tags require an individual ID number corresponding to the site being monitored.  Since we have several dozen sites, that means several dozen tags.

The default for tags is to run on all pages.  This meant that all of our sites were firing off all of the HotJar tags, regardless of whether they were even for the proper site.  Realistically that might not amount to much, but I knew we could do better.

One solution might have been to create different triggers for each site, but that would have quickly become unwieldy.  After stumbling across an article about trigger exceptions, I decided to go down that route.  I first created a variable “Block Sites” with the value “block” (it could have been “1” or “true” or whatever).  From that I created the trigger “Block All Sites,” consisting of the simple comparison “When Block Sites = block” (in essence, “when 1 = 1”).  The key is to then add this as an exception to your existing tag-firing triggers.  The tag fires normally if the exception is not present, but does not fire at all if it is.

In theory, since the exception prevents the tag from firing, your page never downloads and runs the code associated with the tag.  We do not run HotJar continually, so we can keep the exception in place until we are ready to start collecting data.  Then all we have to do is edit the tag to remove the trigger exception and publish the new version.

I believe there is another approach that eliminates even this step through the use of an all-JavaScript trigger, but I’m not yet adept enough with JavaScript to get all the pieces put together.  In a later post I will share at least the first part of the process, the part that doesn’t require matching the HotJar ID value.

Thursday, August 17th, 2017 Analytics

Webmaster Tools and Their Limitations

What tools can webmasters use to understand, as Monty once asked, who our “web visitors are and what they want to see”?

The Google Keyword Planner (under Tools in your Google AdWords account) can tell you which related keywords are most popular worldwide. Google Trends looks promising – compare search popularity by time and region for colleges or English majors. Google Analytics can tell you where your visitors came from, the keywords they were searching for (if Google feels like telling you), and which pages on your site are most popular. For search, Google Webmaster Tools may tell you even more. Your server logs can tell you some of those same things too.

But none of these tools can tell you about the people who have never visited your website.

So, what’s a webmaster to do? Blindly rewrite all our pages to use all the most popular keywords? Google Trends has a list of those too. Blast an email to every website that’s vaguely related to yours, and try to convince the webmaster that your site is closely related to theirs and needs a prominent link from their site?

I’m a little sensitive about link-building requests, after recently receiving two emails from the same “link building marketer” on behalf of two normally reputable educational publishers. He was asking for his links to be added to some of our web pages: first to the Key Public Entry Points page, except that page contains only A&M home pages; then to one of our news releases, except that release had gone out to the news outlets three years earlier. Apparently he didn’t actually read the pages first.

No, don’t turn your site into something that it isn’t, or pretend it’s something that it isn’t. That doesn’t mean that you shouldn’t let these tools serve as a wake-up call. If nobody is linking to your site, you probably need to start answering the questions that people are asking. If nobody is interested in what you’re talking about, eloquent and informative as you may be, you probably need to start talking about something new.

When our customers change, our products need to change too. When the world changes, departments should change. But a departmental website won’t succeed unless it accurately reflects its department. The department has to change first, then the website.

You’ll always be most successful by being yourself. For one thing, there’s no competition. Once you’ve decided who you want to serve – and who you can serve best – then give those people what they want to see. How do you know what they want to see? You have to enter their world. Or you could ask them.

Monday, June 16th, 2014 Analytics

March Madness – Lessons Learned, Part 1

We’re only one day out from winning the NCAA Women’s Basketball National Championship (we are the Marketing Department, get used to seeing that plastered everywhere) but some of the lessons are already apparent. Some things we did well, some things we didn’t. Such is life.

The first few games of the tournament honestly didn’t affect our site traffic very much. Even on the day of the Sweet-16 game the traffic wasn’t outside the bounds of a typical day. We did first see some increased traffic with the Elite-8 game against Baylor, but it was again not significantly outside the normal range. Perhaps playing another Big-12 school rather than somebody unfamiliar with us cut down on the amount of traffic from people curious to see who we are. With this game being in Dallas we did put extra effort in promoting it via the web calendar, and we did see lots of extra traffic hitting that site.

Sunday’s game against Stanford didn’t give us a lot of useful information. Traffic was significantly higher than on a typical Sunday (our least busy day in terms of traffic) but did not rise to the number of visits we get on most weekdays. It therefore wasn’t a good stress test for what we should expect from the finals game.

Coming next time… preparations for the championship game, results that were measurable, and what we’d do differently next time.

Wednesday, April 6th, 2011

Cotton Bowl Daily Traffic

Anomalies always point out behavior that otherwise tends to be hidden in the general background noise. That’s why when we look at website analytics we look for events that stand out and then try to understand what happened.

The Texas A&M football team played in the Cotton Bowl on January 7, so it should come as no surprise that analytics reports showed a big traffic spike on that day. Examining that spike reaffirmed some of the lessons that we have learned in the past.

First, and unsurprisingly, national publicity leads to increased traffic volume. The football game was on national TV, in prime time, against a team with significant national name recognition. Traffic on the mobile site doubled that day, and the main university site increased by even more. With so many (mostly new) faces coming to the site, big events like this present a great opportunity to do something special rather than presenting your standard day-to-day content.

Once inside the site, traffic patterns were quite different from most days. Whereas academics and admissions are typically the most viewed sections of the site, gameday traffic was overwhelmingly focused on the “About A&M” section. After the index page, the next most viewed pages were About A&M, A&M Facts, and the university FAQ, then followed by Athletics. This trend is similar to what we noticed when the football game against Texas was on ESPN last Thanksgiving, and what Butler University noticed at the NCAA basketball tournament last year. In both cases the About page was the most visited on the site. For most of us this content tends to be low on our priority list, but as these instances show, this is precisely the area that can make a good first impression on people not already familiar with your school.

The moral of the story… big events provide big opportunities for marketing the university to an audience that isn’t already familiar with us. Most of the time events like this are known about well in advance, giving plenty of time to prepare. Do so… create something special for them… don’t waste the opportunity by presenting them the same stale content.

Wednesday, January 26th, 2011 Analytics, Miscellaneous

Web Traffic Patterns

In addition to our standard review of website analytics, this month I asked Michael to provide some numbers on traffic before graduation and after graduation.  I expected this to give us the decline in usage attributable to students leaving for the summer.  We did see this on some sites, but a collection of others actually surprised us.

Site          Usage change
Public-centric sites
  www         -23%
  search      -20%
  maps        -22%
  calendar    -36%
  president   -13%
  tamunews    -37%
  visit        -6%
Campus-centric sites
  brandguide  +24%
  marcomm     +15%
  webaccess   +48%
  webmaster   +10%

A handful of our sites meant for an internal audience showed an increase across the board. So what does this mean? One explanation might be that with the students gone, the various departments across campus are focusing on updating their web presence and are therefore looking at our information more often. If that’s the case, and you have any questions or need any help, let us know; we’re here to help.

Tuesday, June 22nd, 2010 Analytics

Google Analytics, Possible Campus Collaboration

Last week Michael wrote a nice piece about how we can use Google Analytics to track user activity across subdomains.  This is a great new way of using Analytics, but it is limited by the fact that it still requires unique account numbers to be added to each page, so it effectively only works across subdomains that we individually own.

I’d like to make a proposition to campus web owners.  We would be happy to collaborate with you on campus-wide analytics tracking.  We can provide the code needed to add to your pages’ Analytics call, and in return will give you access to the Analytics console so that you can see your traffic with respect to

For some of you that might not matter, since there might be little to no traffic being sent to your site from ours.  I can see a handful of sites, though, where we provide a large number of links and where this information might be useful.

Anyone who is interested, please send us a note at and we’ll start getting things set up for you.

Tuesday, June 15th, 2010 Analytics

Crossing domains with Google Analytics

Webmasters are resigned to knowing little about their visitors before they arrive and losing track of them as soon as they leave. But we’ve recently found a way to use the new asynchronous Google Analytics code to track visits between websites that you own.

To make the code work, you’ll need to create a custom “Display Subdomain” filter for your Google Analytics reports, one that adds the host name to the file name. Otherwise you couldn’t distinguish identical page paths on two different subdomains.
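In case the filter screenshot doesn’t come through: this is the classic GA advanced filter for subdomain tracking, which prepends the hostname to the request URI. The settings below follow the Google Analytics admin UI of that era; treat the exact labels as approximate.

```
Filter Type:              Custom filter > Advanced
Field A -> Extract A:     Hostname       (.*)
Field B -> Extract B:     Request URI    (.*)
Output To -> Constructor: Request URI    $A1$B1
Field A Required:         Yes
Field B Required:         Yes
Override Output Field:    Yes
```

With this in place a report line reads “webmaster.tamu.edu/index.html” instead of just “/index.html”, so pages from different subdomains no longer collapse together.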
Subdomain tracking filter for Google Analytics

Sample cross-domain tracking code

This script will work for any site on the domain. Just edit it to include your own tracking code. Because it’s asynchronous, you can place the script just before the closing </head> tag on each page of your site, and it can then track visitors who exit before the page finishes downloading. The new asynchronous code also works on secure sites without modification.

<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-your-tracking-code']);
  _gaq.push(['_setDomainName', 'your-domain-name']);  // the domain shared by your subdomains
  _gaq.push(['_trackPageview']);

  (function() {
    var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
  })();
</script>

If you want to also track two or more sites separately as well as together, you’ll need to create separate profiles for each one, but using the same tracking code. You can reuse the “Display Subdomain” filter (above) for each of them, but you must create a second custom subdomain filter specifically for that subdomain. Here’s the filter we created for the university’s Web Accessibility website.
Cross-domain filter for Google Analytics
Let us know how it works for you, or if you have any questions.


Wednesday, June 9th, 2010 Analytics

More on Facebook Analytics

For those of you who might have started updating your Facebook pages to include Google Analytics, as I mentioned a few posts ago, let me point out a follow-up article from the same folks at Webdigi.  This one gives a method that will let you track interactions and classify them according to whether they were made by fans or non-fans.

This is an important capability because it gives you a firm picture of who is actually interacting with your page.  I haven’t added this to our sites yet, but I would be interested to see the results.  It could settle once and for all the question of just how important a site’s number of fans really is.

Thursday, April 15th, 2010 Analytics, Social Media

Facebook Revisited

Last time I mentioned that I had given a presentation on Facebook to the Brand Council and that one of my points was to install Google Analytics and combine it with the information Facebook already provides to get a better idea of your site usage.  One of my other points was that you must actually look at and use this data rather than just passively collecting it and never looking to see what it shows. So here is a brief rundown of President Loftin’s Facebook site and the analytics behind it.

I didn’t get analytics installed in time for the launch of the site, which might actually be a good thing, since the initial traffic spike of the first few days won’t skew the numbers.  While our total number of visits is in the acceptable range, the graph of number of visits vs. time shows a clear pattern: most of our traffic is generated on Tuesdays, the day that Dr. Loftin’s weekly report goes out.  The fact that each successive Tuesday has so far seen more hits than the last is an encouraging sign.

Google Analytics chart of Dr. Loftin's Facebook page - number of visits vs. time in days

I’d like to go more in depth, but we’re still collecting the first round of data, which hinders a proper analysis.  As we collect more, I’ll post any trends that we find for your information.

To follow up on the gifts application that we added to Dr. Loftin’s Facebook page, I’m happy to say it has been moderately successful.  In the week it has been online, 830 different Aggie gifts have been sent.  At least a few people have been heavy users of the application, because 5 of the 11 hidden gifts have been unlocked.  By far the favorites have been the bow ties, so we will continue to add special bonuses to keep the interaction going.  This, I believe, serves as clear evidence that adding ways of interacting with our social media sites increases their value.

Wednesday, April 14th, 2010 Analytics, Social Media