
TAMU Webmaster's Blog


Information and insight from the A&M Webmasters

A New Take on an Old Site

August 29th, 2014 by Erick

Yesterday we launched a new version of the Texas A&M Impacts site. The site was originally a republication of news articles from around campus, collected and organized by theme. That content is being removed from its own site and will be re-absorbed into a redesigned TAMUtimes site (coming soon!).

The current site is meant to serve as a call-to-action anchor for the TV commercial that will air during Aggie football games. The site was designed as a companion piece by the same firm that created the commercial. It is similar to the old site in that it showcases selected Impacts taking place at A&M, but it does so in a more modern and media-rich format.

Friday, August 29th, 2014 Ongoing Projects No Comments

Designing in the Open

August 26th, 2014 by Michael

In recent years, web designers have been discussing a concept called “designing in the open.” That is, letting other people see the website in development while it’s still being developed. This can mean an “open source” attitude, where anybody can chime in electronically. Or it can simply mean giving your client the URL of your development site so they can keep up with what you’re doing.

According to Brad Frost, losing the “Big Reveal” is one of the benefits of designing in the open. You’re not staking a month of work on whether your client likes what you did all month, or wants you to start over.

Basecamp’s Ryan Singer explains why he likes designing in the open:

Instead of asking for 10 changes and waiting a week, you can ask for 1 change and wait 15 minutes. Evaluate the change, praise it or identify weaknesses, and suggest the next change. By asking for small changes, you take the pressure off the designer because you aren’t asking for miracles. You also take the pressure off the review process because the set of constraints and motivating concerns is smaller. The design is easier to talk about because there are fewer factors involved.

There are disadvantages to designing in the open, of course. When seeing a work-in-progress, clients may criticize the details instead of evaluating the big picture. That’s why many designers like to show clients black and white pencil sketches instead of the current State of the Website. When it’s obvious that they’re not looking at the final version, clients are less likely to ask, “Uh, you do intend to do this in color? Just checking.”

But if designing in the open became a habit, maybe your clients would get used to looking at the forest instead of the trees. Maybe they would learn to accept your ongoing project for what it is – ongoing. Maybe they would appreciate the chance to participate in the creation of their website while it’s being created, and not at carefully orchestrated intervals.

Tuesday, August 26th, 2014 Web Content No Comments

Google’s Widening Influence on the Web

August 18th, 2014 by Erick

We have spent a lot of time looking at how Google ranks and returns our pages.  The deference we pay to Google in order to increase our SEO gives the company much broader influence than is at first obvious.  Case in point: Google recently announced that HTTPS encryption will become a (weak) signal in its ranking algorithm.  That is to say, sites connecting entirely over HTTPS will be favored over those using HTTP. Google says this affects fewer than 1% of all queries, but it also reserves the right to increase the weight “because we’d like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web.” Whatever actually comes out of this move, it has already lit up the blogs, forums, and mailing lists.

Is this going to make us all start updating our servers to connect over HTTPS?  Will Google increase the weight to the point that we all see HTTPS as a “must have” in order to get our pages ranked?  Is it really worth the administrative and computing overhead (is it even a worthy goal?) if all we serve is static HTML pages?  However we answer these questions, Google’s influence shows in the very fact that we are asking them.
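
For anyone who does decide to switch, the server-side half of the change is small. Here is a minimal sketch for Apache, assuming a certificate is already installed and an HTTPS virtual host is configured; the hostname is hypothetical:

```apache
<VirtualHost *:80>
    ServerName www.example.edu
    # Send every plain-HTTP request to the HTTPS address.
    # "permanent" issues a 301, telling browsers and search
    # engines that the HTTPS version is the one to use.
    Redirect permanent / https://www.example.edu/
</VirtualHost>
```

The real cost is elsewhere: buying and renewing the certificate and absorbing the TLS overhead, which is exactly what the questions above are weighing.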

Monday, August 18th, 2014 Miscellaneous 1 Comment

Web Design: Turning Paper into Marble

August 15th, 2014 by Michael

Suppose the Vestry Board of the Cathedral of Florence had come to Michelangelo in 1501 saying, “We have decided the sculpture of David should look like this,” handed him some sketches, bid him ‘buon giorno’ and gone out to dinner together.
[Image: Michelangelo’s David]
The problem: paper isn’t marble. Paper is flat, marble isn’t. A sketch can suggest what a sculpture might look like, from one angle. But a sketch isn’t a sculpture. A sketch can’t even become a sculpture, unless you turn it into papier-mâché. No matter how well-thought-out the sketches may be, the artist has to create the sculpture from scratch every time.

Let’s apply this analogy to websites. Fortunately, a web developer usually has some pre-written code and doesn’t have to start from scratch, but he or she still can’t turn paper into pixels. A paper design has two dimensions, a sculpture has three, but a website has four. It’s hard enough to represent a 3-D sculpture project with 2-D drawings. But it’s even harder to represent a website that will be animated with JavaScript or CSS3 and viewed in any of 2,152 screen resolutions (I’m not making that up; that’s what Google Analytics says our visitors used last month). Time is the fourth dimension.

Clients don’t always understand this.

  • Some may schedule two months for management to discuss the website, and one week for designers and developers to create the website. As if building the website is an afterthought to the project of making a website. They talk as though the website is basically done once the developers receive the mockup or the copy. But at that point the website doesn’t yet exist.
  • Some may expect the developers to start work before the client decides what images, words or even what purpose the website will have. Sometimes developers receive the content so close to the deadline, they are forced to start making the website without it.
  • Some honestly don’t see web developers as part of the communications process. Web designers are mere decorators, web developers mere programmers. So they haven’t been included in the discussion about target audience and goals. The problem is that, at every moment, web developers must visualize the target audience and decide the best way to reach the site’s goals.

If it’s true that sculptors make sculptures, it’s even more true that web developers make websites. My point applies even more to web design than to print design. A print designer is not just a technician, but you can treat him or her like one: “Here are two images. Combine them in Photoshop. Buon giorno.” You wouldn’t do that to your designer, but you could. And it might work, especially if your designer is as brilliant as Michelangelo and feeling equanimous that day. But you can’t do that to a web designer, and I’m not being persnickety. Here are three reasons why it literally wouldn’t work.

  1. We’re not Michelangelo. We’re flattered by your confidence in us, but we don’t know how to do everything. Web design requires imagination and problem-solving skills, but its tools are still limited.
  2. Screen sizes are not set in stone. Next year’s phones will have more pixels or a different shape. Last year’s phones may have fewer pixels. With dozens of common screen sizes in use, it no longer means much to create a pixel-perfect imitation of a PSD on the Web. Which arrangement of pixels do you mean? (See the sketch after this list.)
  3. Craftsmen must work within limitations. Even conference speakers and authors of web design books may not know how to do what you have dreamed up; it may not yet be possible on current browsers. Michelangelo had limitations too – he had to work with a block of stone that two sculptors before him had already gouged and carved on. And it took him three years.
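
The sketch promised above: instead of chasing one arrangement of pixels, we write rules that adapt to whatever screen shows up. A minimal CSS example, with an arbitrarily chosen breakpoint and a hypothetical class name:

```css
/* One column on narrow screens... */
.content {
    width: 100%;
}

/* ...two columns once there is room */
@media (min-width: 48em) {
    .content {
        width: 50%;
        float: left;
    }
}
```
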
Friday, August 15th, 2014 Web Content 1 Comment

SEO Report, Phase 2 – Techniques to Avoid

August 14th, 2014 by Erick

As with everything, there is a dark side to SEO. So-called black hat organizations recognize the importance of search returns as well and will try to manipulate the system to get search returns pointed to their pages. Search engines are getting good at spotting efforts to trick them, though. As a legitimate organization, just don’t do it, either yourself or by hiring a firm that practices these methods. These techniques will often get your site penalized, and you will wind up worse off than if you had never implemented SEO.

Keyword cramming

In its infancy, a search engine was little more than a system for looking up keywords. The importance of a page to a topic was measured by the number of times a keyword appeared. If an article used the words “car” and “automobile” several times, the algorithm would give that page a high ranking for those terms. This led to the obvious abuse of keyword cramming: simply using the keywords over and over within the content of the page. Some early black hat techniques even added a whole section of nonsense text below the actual page content in order to fool the search engines.

After search engines defeated these crude attempts at manipulation, emphasis (even among white hat SEO firms) turned to crafting content so that your keywords fit into the flow of the page but were still repeated multiple times. There were even formulas for how many times you could repeat a term without triggering a penalty for stuffing.

Today these techniques do more harm than good. Yes, you must use keywords to get the content into the search index, but don’t overdo it. After the second or third use of a keyword on a single page, the search algorithm adds incrementally less importance to it. Instead, focus on writing in natural language for your audience. When we write for humans, we naturally use synonyms and varied phrasing and still get our point across.

Deceptive content

There is a range of techniques for using content to deceive the search engines. Again, search firms are well aware of these and actively penalize this type of behavior. The most common deceptive practice is to serve different content to search engines than is served to real readers. Everything from tricks as simple as hiding content with style sheets (using margins or positioning to push text off the page, for example) to more complex schemes like sniffing the user agent and serving different content to the search engines should be avoided.
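
For illustration only, this is the kind of off-screen styling we mean; the class name is hypothetical, and crawlers now detect and penalize exactly this pattern:

```css
/* Don't do this: the text stays in the markup for crawlers
   but is pushed out of view for human readers */
.keyword-bait {
    position: absolute;
    left: -9999px;
}
```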

Similarly, black hats attempt to manipulate Page Rank by creating link farms and other schemes that artificially increase the number of pages linking to their site (comment spamming on blogs, mentioned previously, is another such technique). It was common practice several years ago to create pages that were nothing more than a collection of links to the sites they wanted to promote. They would buy up dozens of domains and include this content on each of them. Many of these domains were typos of the original site’s domain name, in the hope that people would accidentally (or through search links) find their site and click through to the original. Search engines now recognize this type of site and penalize instead of reward it.

A similar practice is to host the same content on different sites, hoping to double the exposure. There were once legitimate reasons for doing this – a domain name changed but the old one was kept as an alias to keep links from breaking, guest bloggers contributed the same article to many different sites, and so on. Now it is seen as manipulation and incurs a penalty. (For situations such as a domain name change, you should set up 301 Redirects or at least use canonical meta tags to indicate the preferred address.)
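
Both fixes are small. A minimal sketch, assuming an Apache server and hypothetical old and new hostnames: the old domain’s .htaccess permanently redirects every path to the preferred domain.

```apache
# .htaccess on the old domain: 301 each path to the same
# path on the preferred domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^old\.example\.edu$ [NC]
RewriteRule ^(.*)$ http://new.example.edu/$1 [R=301,L]
```

Where a redirect isn’t possible, a canonical tag in the duplicate page’s head tells search engines which copy should get the credit:

```html
<link rel="canonical" href="http://new.example.edu/article/">
```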

While not inherently deceptive, the meta keyword tag has been so abused since the beginning of search that it is no longer even considered in creating page scores. Its primary use now is to include common misspellings and other terms that you want the search engine to see but which, for cosmetic reasons, you don’t want your users to see – the tag is still read, it simply contributes no ranking value.
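
For example, a page might quietly catch common misspellings of its own topic; the terms here are hypothetical:

```html
<!-- Read by search engines, invisible to readers, worth no ranking value -->
<meta name="keywords" content="admissions, admisions, addmissions">
```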

Conclusion

Google is quite clear and quite consistent in its advice on search optimization. The best way to get good placement on the returns page is to write good content. That means writing about a topic that people want to read, and writing for the reader rather than for the search engine. The term optimization itself implies enhancing content that already exists, not generating content for the sake of the search engine. Google holds strongly to this principle, believing that these are exactly the types of pages that benefit users. The more beneficial a page is, the more likely readers are to link to it or share it on social media. Those links then become the backbone of the site’s Page Rank.

This doesn’t mean ignoring SEO altogether. After all, optimization is an active process. Understand how search engines work and present your pages in the best possible light, but in the end understand that you are writing for your readers and not for a search robot.

Thursday, August 14th, 2014 Search Comments Off

Tips On How To Handle Clients That Keep Changing Their Minds

August 12th, 2014 by Rebecca

As developers and designers who work with clients, we have all experienced the client asking for endless revisions.  Creative Bloq shares tips on how to handle these types of requests.

11 ways to stop clients asking for endless revisions

  1. Start with the intention to develop a healthy relationship with your client.
  2. Educate your client about the real purpose of a revision.
  3. Clearly define and articulate what is a round of revision.
  4. Clearly define how many rounds of revisions are included in your fee.
  5. Clearly define when change requests will be considered extra work and how this will be billed.
  6. Keep the client informed about each phase of the design process.
  7. Don’t forget to show your goodwill and flexibility.
  8. Accept that design is subjective.
  9. Accept your mistakes.
  10. Put a stop when needed.
  11. Don’t waste your time with the wrong clients.

For me, building a good relationship with your client is the most important step. Establishing that relationship will help with issues that come up during the project.

Tuesday, August 12th, 2014 Miscellaneous Comments Off

Broken Links on Your Website Send the Wrong Message

August 12th, 2014 by Michael

Broken links are a nuisance for everybody involved. They make your website appear ill-kempt. Google notices that, and lowers your search engine rankings. Part of my job is fixing broken links on our websites. I always try to find replacements, but if I fail, what can I do but remove the link completely? If your visitors can’t find what they’re looking for, they may stop looking for you. Or they may email you or call you instead, defeating the purpose of having a website.

When someone visits a web page that’s no longer there, your server sends a 404 message (not found). But what message are you sending to your visitors?

“We decided to move all our pages, and we want you to figure out where they moved to.”

“This website wasn’t important enough for us to update, so why should it be important to you?”

“We don’t know what you’re asking about, and we don’t care.”

“We used to know a lot about this subject, but we’re clueless now.”

That’s not the message you mean to send, of course, since you’re not clueless. You’re still the authority in your field. (EDIT: Or else you could send a 410 message.) Maybe when you redesigned your website, you resigned yourself to some broken links. Maybe some of your pages are outdated, and you don’t want anyone to see them again. But you don’t mean to send the message that your website no longer has answers to their questions, let alone that nobody does. The problem is that other websites may have linked to your old pages, or your visitors may have bookmarked them. Don’t believe me? According to Google Webmaster Tools, people were still looking for news stories from a ten-year-old version of the main Texas A&M University website. As of last week.
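
On that 410 note: if a page is gone on purpose and never coming back, you can say so explicitly. A minimal sketch for Apache, with a hypothetical path:

```apache
# 410 Gone: tells visitors and search engines the page was
# removed deliberately, so they can stop looking for it
Redirect gone /projects/old-initiative.html
```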

By the way, that Google Webmaster Tools link (above) is where you should start attacking the problem. Log into your Google Webmaster Tools account (you do have an account, don’t you?) and click on the Not Found tab. Google has kindly listed all your broken links in order of priority – the ones that your poor visitors are still trying to find. Click on any URL, then on the Linked From tab in the popup box, and you will see the other web pages that link to your missing page. To automatically check for broken links each week, we use SiteImprove.

Some other solutions (your mileage may vary):

  • Preserve your pages – Sir Tim Berners-Lee said it and I still believe it: cool URLs don’t change. So when you set up your website, to avoid having to change news.html to news.cfm to news.php whenever you change your backend technology, make each page the index page in its own directory, such as /news/. If every page is an index page, you don’t need to specify the file name or extension. So inside http://www.tamu.edu/admissions/, it doesn’t matter if the page is named index.html, index.asp, index.php, or index.jsp. Your browser treats them all the same. By using this technique (and an .htaccess file), we were able to move the President’s website from WordPress to Cascade Server without breaking any links.
  • Alias your pages – But what if your backend technology has changed and you didn’t set up your directory structure this way? With Apache configuration and .htaccess files, you can keep serving your old .asp URLs from your new .php files. A useful tool against industrial espionage.
  • Stub your pages – Even if you changed your directory structure, maybe you can keep abbreviated versions of the old pages alive, at the old location, if they are still being visited often. Stub pages should quickly answer the most common reason for visiting the page, and conclude with, “For the latest details, visit our new page.”
  • Redirect your pages – You can do amazing things with server settings such as the Apache Redirect directive or the mod_rewrite module. That’s what the WordPress .htaccess file uses. Your server can send the new page when visitors ask for the old page. You can do that with entire subdirectories. (A combined sketch follows this list.)
  • Admit your lack – If you can’t fix the link, make sure your visitors see a helpful 404 page, not the default server error message. We created a customized 404 page that detects when visitors are looking for one of our most popular misplaced links, such as the Online Picasso Project, and directs them to the new location of that page.
  • Timestamp your pages – Especially on a blog or a news site, you can do this by literally adding the publication date near the top. You may not need to throw away a page, such as a bio, that is slightly outdated but has good information. When they see the date, visitors (and Google) will be able to judge how current the information is. If you’re not going to move the good information to a new page, don’t throw away the old one.
  • Update your pages – If your visitors have been going to the same page on your site for 15 years, do you really need to trash that URL and insult your visitors? Instead, keep the page alive and correct the misinformation. If you expect your visitors to get used to a new URL, it might take another 15 years.
  • Inform your referrers – Email the webmasters of the sites responsible for your highest priority crawl errors, giving the broken link, the page where you found it, and the updated link. I did this for a couple of dozen of our most common referrers. Google Translate helped, in some cases, to communicate with non-English-speaking webmasters.
  • Refer to your informers – Hospitality and customer service mean that you don’t stop with, “It’s not my department.” The more often you’re asked about something that isn’t your job, the more diligently you should shout from the housetops, “I would be delighted to tell you whose job it is!” (Saves wear and tear on you, too.) So if you no longer handle the topic that your outdated web page discusses, point your visitors to the best replacement page, where they can find the quickest solution. Yes, it may be important to you that the Associate Provost for Agency Accommodation and Achievement only deals with researchers who were funded before 2010 by private foundations located in what was the Laurasia supercontinent during the Mesozoic period. But your visitors don’t care. You can link them to the right department, even if it isn’t yours, faster than they can Google for it. After all, you’re the authority in your field.
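
To make the redirect, alias, and custom-404 ideas concrete, here is a minimal .htaccess sketch for Apache; every path and filename is hypothetical:

```apache
# Redirect your pages: 301 old addresses to their new homes,
# including an entire retired subdirectory
Redirect 301 /news.html /news/
RedirectMatch 301 ^/oldnews/(.*)$ /news/archive/$1

# Alias your pages: keep answering old .asp URLs from PHP files
RewriteEngine On
RewriteRule ^(.+)\.asp$ $1.php [L]

# Admit your lack: serve a helpful custom 404 page, not the default
ErrorDocument 404 /errors/404.php
```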

How do you deal with broken links on your website? Do you have any better suggestions?

Tuesday, August 12th, 2014 Web Content Comments Off

Visitor Center Registration Application

August 11th, 2014 by Erick

After a long, and often painful, process we are finally ready to launch Version 2 of the visitor center registration application. This was a project that was requested on my first day in Marcomm, and I am happy to have it now online.

Seven years ago the visitor center took registrations for campus tours over the phone and through email, and then manually recorded them in a FileMaker Pro database. This, of course, was an enormously cumbersome process, which we were asked to streamline. Over the course of the last seven years we did this through a series of incremental upgrades.

The first iteration gathered the information that was being fed into the FileMaker database and created an HTML form to allow people to register online. FileMaker was a desktop application, similar to Access, so we couldn’t write directly to the database. The form instead appended entries to a daily XML file that could be imported into FileMaker.

This system was a nice upgrade, but everyone recognized that it was not a complete solution. We therefore took on the task of creating an application that could feed a MySQL database in real time. At the same time, the scope of the project widened: we were asked to include registrations not only for campus tours but also for meetings with academic advisors, freshman orientation meetings, and tours of residence halls. This project led to our previous site. It was well-intentioned and did its job, but the development process was flawed. Changes were made throughout development, forcing us into a patchwork solution that left some gaps in usability.

The current site came about as a way of finally creating a unified system that was planned from the beginning to be modular and extensible enough to incorporate change requests during development. It also added options for both individual visits and group visits, which had been lacking in previous generations.

While this site currently exists as a stand-alone entity, we do have plans to make it more tightly bound, at least stylistically, to the larger visitors website and the common university branding.

Monday, August 11th, 2014 Ongoing Projects Comments Off

SEO Report, Phase 2 – Outside Influences

August 8th, 2014 by Erick

Social media, blogs, the Knowledge Graph

Google recognized the importance of social media several years ago and has purposefully made it a larger and larger factor in page rankings. Several things play into this emphasis. First, because of their nature, social media channels tend to be regularly updated with fresh content. We have already seen that fresh content, even on traditional web pages, is part of the ranking system. Also, social media is very personal to its users. People write about and link to things that are of interest and importance to them. When people click a “share this” icon on a web page, they do so because they saw something of value in that content and thought it important enough to share with their social connections. This aligns perfectly with the search engine’s goal of providing information that is relevant and useful to its users.

Social media channels have evolved quickly over the past several years. When search engines started emphasizing social content most people left their profile pages open to the public, making it easy for the search spiders to find and index the content. In today’s environment of increased emphasis on privacy, most people have their profiles set to allow only friends to view their content. This makes it increasingly difficult for search engines to get to the high-quality content they are looking for.

Of all the social channels, Google+ is quickly becoming the most important in terms of SEO. The fact that this service is owned by the dominant search engine company is probably not a coincidence. Almost everything Google does is related in one way or another to its core focus of search, even if that relationship is not easily visible to the casual observer. The way Google has structured the Google+ service, then, is unsurprisingly optimized for getting information into the search engine.

Most blogs and social sites understand how search engines use links on their pages to pass “juice” to the link’s target. These platforms have made a conscious effort, in part because of past abuses, to limit this by marking their outgoing links as “nofollow.” This tells search engines not to count the link in the Page Rank of the target page; in other words, the link adds no value to the target’s search ranking. Google+ is different. Links there are “dofollow,” which means that search engines will include them in calculating the target’s ranking. This immediately makes Google+ more important than other platforms in terms of your SEO strategy.
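
The markup difference is a single attribute; the URLs here are hypothetical:

```html
<!-- Typical social platform: the link passes no ranking credit -->
<a href="http://example.edu/research/" rel="nofollow">Our research</a>

<!-- A "dofollow" link is simply an ordinary link without
     rel="nofollow"; it passes Page Rank to its target -->
<a href="http://example.edu/research/">Our research</a>
```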

Beyond simple likes (plus-ones in Google+ terminology), the primary benefit of actively managing your Google+ account is localization. Localization is a huge emphasis for search engines right now. They understand that to give you the most relevant returns for your search query, the returns should be local to your area. A search for “restaurants,” for example, should show those in your area rather than a nationwide list. Google+ is directly tied to Google Maps, so it lets you manage your location information yourself rather than depending on some outside factor or agent.

There is also a pilot project to link Google+ accounts to authorship of articles on the web. Bylines on blog posts or news articles, for example, would carry verified, identifiable information about the author. Influential writers could then gain additional exposure. Google has been making a lot of changes to this program, such as no longer including the author’s photo on returns pages, so it is still too early to say how important this mechanism will be.
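
As the program worked at the time of writing, the page-side markup was a rel="author" link pointing at the writer’s Google+ profile; the profile ID and name here are hypothetical:

```html
<!-- Byline tying the article to the author's Google+ profile -->
<a rel="author" href="https://plus.google.com/112233445566778899000">Jane Doe</a>
```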

Google+ is also vitally important to your overall organizational recognition. Whenever someone searches for your organization, the returns page will often show a Knowledge Graph (the contextual information that appears in the right-hand column). For a well-used account, the Knowledge Graph can include links to your Google+ account or even the content of a recent post. See the figure below for a real-life example.

[Image: Screenshot of the Google Knowledge Graph for the search “Texas A&M University”]

Blogs can be important if they are configured to be search engine friendly. To be successful, a blog’s content must be kept fresh, which we know is important to search ranking. As a blog becomes more popular and reaches more people, its effect can grow. Configuration is the key, though. Blogs must be set to allow “juice” to flow to the links’ target pages. The bad guys know this, though, and so target blogs for comment spamming – posting links to their sites in comments, hoping to gain Page Rank from the blog site. The comments section, then, must either be moderated so these comments never appear or have the site code mark comment links as nofollow.

Given the large, and increasing, importance of video, YouTube is a vital SEO element. While we don’t think of it as such, YouTube is actually the number two search engine in the world. According to recent reports it gets more searches per month than Bing, Yahoo, Ask, and AOL combined. Given that Google owns YouTube, and that everything Google does ties back to search, this should be another primary focus of an SEO strategy. Optimizing your YouTube account will increase your standing in both the #1 search engine (YouTube videos are included in standard Google returns) and the #2 search engine, YouTube itself. This can be a powerful advantage over your competition. Most firms have Facebook and Twitter accounts and put their primary focus there. Fewer firms have YouTube accounts, and even fewer of those put real effort into making YouTube a central part of their SEO (or even social) strategy.

While not quite “social media,” Wikipedia and similar open databases are still part of the Web 2.0 framework and are also critical parts of SEO. Wikipedia is one of the most popular sites on the Internet and has become the first stop for online information about a topic. Making sure that your Wikipedia page is accurate and up to date, then, is important for protecting your brand reputation. Wikipedia does implement a nofollow policy on its articles, so links from that site do not add to your site’s Page Rank. Do not, then, try to edit your organization’s entry to add links in hopes of boosting your SEO (past abuse is probably why the nofollow policy had to be implemented). The content of the entry, though, is vitally important. See the figure above and notice that the text in the Knowledge Graph comes straight from Wikipedia. Since Wikipedia is also on the first page of returns for most searches, you can get double exposure for your message.

Related to Wikipedia, and also used by the Knowledge Graph, is Freebase. Freebase is an open directory (owned by Google, so keep in mind what that implies) that allows the public to curate information about a range of topics. Much of the information that gets posted in the Knowledge Graph comes from Freebase (in the screenshot above this would include the address, mascot, enrollment, acceptance rate, and colors, which are all prominent features of the Knowledge Graph). This gives us the ability to directly affect the Google returns page regardless of the pages included in the results.

Friday, August 8th, 2014 Search Comments Off

The Webmasters Twitter Account

August 7th, 2014 by Michael

In the past year or so, we have become much more active on Twitter, tweeting once or twice a day since April 2013. We’ve focused on sharing the most helpful links and trends, while sometimes describing “a day in the life of a university webmaster.”

If you follow us at @tamuwww, you’ll find brief announcements and tips that don’t rise to the level of a full blog post. The 140-character limit is working well for us – often it doesn’t take much to share the basics of a new idea. Personally, keeping up with our Twitter account has kept me in touch with some of the best thinking in the web development industry, providing new tools to solve problems for real people.

Thursday, August 7th, 2014 Social Media Comments Off
