We have been using Varnish to cache our WordPress sites (such as this one and TAMUtimes) for several years now. It has worked really well for our needs, adding a bit more speed to our sites to help with higher-traffic days.
Varnish is highly configurable and is transparent to its backend, so it doesn’t require any code modification on your web servers. We have configured our installation to use RAM as the cache for our static content, which can serve requests very quickly. It also reduces the number of requests that reach the backend, so the requests that do get through are served a bit quicker as well. All this makes for a snappier experience for users.
Another great thing about this setup is that we can cache multiple sites hosted on different machines. We simply put the necessary CNAMEs on the Varnish server and then point them to different backends as needed; the only thing required to make the change is a restart of the Varnish service. This helps a huge amount when we are migrating sites to new machines.
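As a rough sketch of how that routing works, the VCL might look like the following. The hostnames and addresses here are hypothetical, and the syntax is for Varnish 3 (Varnish 4+ uses `req.backend_hint` instead of `req.backend`):

```vcl
# One backend per machine hosting WordPress sites.
backend wp_old {
    .host = "192.0.2.10";
    .port = "80";
}

backend wp_new {
    .host = "192.0.2.20";
    .port = "80";
}

sub vcl_recv {
    # Route each cached site to its current backend by Host header.
    if (req.http.host == "site-one.example.edu") {
        set req.backend = wp_old;
    } else if (req.http.host == "site-two.example.edu") {
        set req.backend = wp_new;
    }
}
```

Migrating a site is then just a matter of editing the backend assignment and restarting Varnish.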
Overall we’ve been very happy with Varnish. For WordPress especially it has given us a little extra peace of mind that it can help out when there will be more traffic. I really encourage anyone to look into it if you run your own web servers and see if it could help you out.
The SEO research that we did this summer and which I posted to this site was never intended to be a “one and done” project. The intent was always to offer this information, and our team’s time, to other university and system offices in a series of presentations and individual consultations. I had been busy reworking the blog posts already online into a more focused presentation format, but a conversation yesterday showed me that I needed to announce this service sooner rather than later.
If any of you would like some help, whether it be a simple conversation or a full-blown review of your site, please get in touch with me. This certainly applies to any of the colleges, departments, and divisions on campus, but we also want to assist the system universities and agencies.
I attended the kickoff meeting for a new project last Friday — a digital center of excellence that is being organized by Mark Stone and Diane McDonald from the system offices. I am excited about this partnership because it solidifies a close working relationship between IT and marketing — and a mutual affirmation from both sides that the other is a critical part of our overall success.
The goal of this project will be to create an online resource, available system wide, that will allow us to share best practices, streamline the publishing process, and create cost efficiencies. The search engine optimization research that I did over the summer, for example, will likely be expanded and become a part of the content. The CoE will include not just web related information, but also social media, mobile apps, branding, contracting, and other areas relevant to online development.
We should be sending out a survey relatively soon that will offer an opportunity for feedback on which topics to focus on first.
This week marked the first day back for our students. For those who want to join in and further their own professional education, there are several opportunities being offered to university staff.
The Web Accessibility Essentials class offered through Employee & Organizational Development will be taught on September 8. This is a great opportunity for anyone who has not been exposed to accessibility but needs to know the basics. Designers, managers, and content writers will benefit as much as web developers. The corresponding advanced class will take place in November.
Another set of classes that could be useful is the project management training offered by the Project Management Office. Anyone leading significant projects should consider taking these courses. The PMO offers a series of classes throughout the year, which can be combined into a certificate program. There are two courses scheduled for September, one for October, and one for November.
The university also offers several programs for supervisors and managers, including three certificate programs.
My inbox is constantly bombarded by third-party organizations offering both free and paid webinars. If none of the university offerings meets your needs, there is always something you can find online. Whether you take one of these courses because your job requires continuing education credits, or just because you want to enhance your skill set, classes are now in session.
Yesterday we launched a new version of the Texas A&M Impacts site. The site was originally a republication of news articles from around campus that were collected and organized by theme. This content is being removed from its own site and will be re-absorbed into a redesigned TAMUtimes site (coming soon!).
The current site is meant to serve as a call-to-action anchor for the TV commercial that will appear during Aggie football games. The site was designed as a companion by the same firm that created the commercial. It is similar to the old site in that it serves to showcase selected Impacts taking place at A&M, but it does so in a more modern and media rich format.
In recent years, web designers have been discussing a concept called “designing in the open” – that is, letting other people see a website while it’s still being developed. This can mean an “open source” attitude, where anybody can chime in electronically. Or it can simply mean giving your client the URL of your development site so they can keep up with what you’re doing.
According to Brad Frost, losing the “Big Reveal” is one of the benefits of designing in the open. You’re not staking a month of work on whether your client likes what you did all month, or wants you to start over.
Basecamp’s Ryan Singer explains why he likes designing in the open:
Instead of asking for 10 changes and waiting a week, you can ask for 1 change and wait 15 minutes. Evaluate the change, praise it or identify weaknesses, and suggest the next change. By asking for small changes, you take the pressure off the designer because you aren’t asking for miracles. You also take the pressure off the review process because the set of constraints and motivating concerns is smaller. The design is easier to talk about because there are fewer factors involved.
There are disadvantages to designing in the open, of course. When seeing a work-in-progress, clients may criticize the details instead of evaluating the big picture. That’s why many designers like to show clients black and white pencil sketches instead of the current State of the Website. When it’s obvious that they’re not looking at the final version, clients are less likely to ask, “Uh, you do intend to do this in color? Just checking.”
But if designing in the open became a habit, maybe your clients would get used to looking at the forest instead of the trees. Maybe they would learn to accept your ongoing project for what it is – ongoing. Maybe they would appreciate the chance to participate in the creation of their website while it’s being created, and not at carefully orchestrated intervals.
We have spent a lot of time looking at how Google ranks and returns our pages. The deference we pay to Google in order to improve our SEO gives them much broader influence than is at first obvious. Case in point: they recently announced that they will be adding HTTPS encryption as a (weak) signal in their ranking algorithm. That is to say, they will favor sites served entirely over HTTPS to those using HTTP. They say this affects fewer than 1% of all queries, but they also reserve the right to increase its weight “because we’d like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web.” Whatever actually comes of this move, it has already lit up the blogs, forums, and mailing lists.
Is this going to make us all start updating our servers to serve over HTTPS? Will they increase the weight to the point that we all see it as a “must have” in order to get our pages ranked? Is it really worth the administrative and computing overhead (is it even a worthy goal?) if all we serve is static HTML pages? However we answer these questions, Google’s influence is evident in the very fact that we are asking them.
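For sites that do decide to make the switch, the server-side piece is straightforward; here is a minimal Apache sketch (the hostname is hypothetical) that sends all HTTP traffic to HTTPS with a permanent redirect:

```apacheconf
<VirtualHost *:80>
    ServerName www.example.edu
    # 301 so browsers and search engines treat HTTPS as the permanent home
    Redirect permanent / https://www.example.edu/
</VirtualHost>
```

The TLS certificate and the `*:443` virtual host still have to be set up separately, of course, which is part of the overhead the questions above allude to.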
Suppose the Vestry Board of the Cathedral of Florence had come to Michelangelo in 1501 saying, “We have decided the sculpture of David should look like this,” handed him some sketches, bid him ‘buon giorno’, and gone out to dinner together.
The problem: paper isn’t marble. Paper is flat, marble isn’t. A sketch can suggest what a sculpture might look like, from one angle. But a sketch isn’t a sculpture. A sketch can’t even become a sculpture, unless you turn it into paper-mache. No matter how well-thought-out the sketches may be, the artist has to create the sculpture from scratch every time.
Clients don’t always understand this.
- Some may schedule two months for management to discuss the website, and one week for designers and developers to create it. As if building the website is just an afterthought to planning the website. They talk as though the website is basically done once the developers receive the mockup or the copy. But at that point the website doesn’t yet exist.
- Some may expect the developers to start work before the client decides what images, words or even what purpose the website will have. Sometimes developers receive the content so close to the deadline, they are forced to start making the website without it.
- Some honestly don’t see web developers as part of the communications process. Web designers are mere decorators, web developers mere programmers. So they haven’t been included in the discussion about target audience and goals. The problem is that, every moment, web developers must visualize their target audience and make decisions on the best way to reach the site goals.
If sculptors make sculptures, it’s even more true that web developers make websites. My point is even more true of web design than it is of print design. A print designer is not just a technician, but you can treat him or her like one: “Here are two images. Combine them in Photoshop. Buon giorno.” You wouldn’t do that to your designer, but you could. And it might work, especially if your designer is as brilliant as Michelangelo and feeling equanimous that day. But you can’t do that to a web designer, and I’m not being persnickety. Here are three reasons why it literally wouldn’t work.
- We’re not Michelangelo. We’re flattered by your confidence in us, but we don’t know how to do everything. Web design requires imagination and problem-solving skills, but its tools are still limited.
- Screen sizes are not set in stone. Next year’s phones will have more pixels or a different shape. Last year’s phones may have fewer pixels. With dozens of common screen sizes in use, it no longer means much to create a pixel-perfect imitation of a PSD on the Web. Which arrangement of pixels do you mean?
- Craftsmen must work within limitations. Even conference speakers and authors of web design books may not know how to do what you have dreamed up - it may not yet be possible on current browsers. Michelangelo had limitations too – he had to work with a block of stone that two sculptors before him had already gouged and carved on. And it took him three years.
As with everything, there is a dark side to SEO. So-called black hat organizations recognize the importance of search returns as well and will try to manipulate the system to get search returns pointed to their pages. Search engines are getting good at spotting efforts to trick them, though. As a legitimate organization, just don’t do it – either on your own or by hiring a firm that practices these methods. These techniques will often get your site penalized, and you will wind up worse off than if you had not implemented SEO at all.
In its infancy, a search engine was little more than a system for looking up key words. The importance of a page to a topic was measured by the number of times a key word appeared. If an article used the words “car” and “automobile” several times, the algorithm would give that page a high ranking for those terms. This led to the obvious abuse of keyword cramming – simply repeating the key words over and over within the content of the page. Some early black hat techniques even added a whole section of nonsense text below the actual page content in order to fool the search engines.
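To make the weakness concrete, here is a toy sketch of that early term-frequency scoring – not any real engine’s algorithm, just an illustration of why raw counts were so easy to game:

```python
import re
from collections import Counter

def keyword_score(text, keywords):
    """Toy term-frequency score: total occurrences of each keyword.

    Early engines rewarded raw counts like this, which is exactly why
    keyword cramming worked -- and why modern engines ignore it.
    """
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    return sum(counts[k.lower()] for k in keywords)

page = "Our car lot has every car and automobile you need. Car deals daily!"
print(keyword_score(page, ["car", "automobile"]))  # -> 4
```

Under a scheme like this, stuffing “car” into the page another fifty times would simply raise the score – which is precisely the loophole the black hats exploited.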
After search engines defeated these crude attempts at manipulation, emphasis (even among white hat SEO firms) turned to crafting content so that your key word fit into the flow of the page but was still repeated multiple times. There were even formulas for how many times you could repeat a term without triggering a penalty for stuffing.
Today these techniques do more harm than good. Yes, you must use key words to get the content into the search index, but don’t overdo it. After the second or third use of a key word on a single page, the search algorithm adds incrementally less importance to it. Instead, focus on writing in natural language for your audience. When we write for humans we naturally use synonyms and varied phrasing, and we still get our point across.
There is a range of techniques for using content to deceive the search engines. Again, the search engines are well aware of these and actively penalize this type of behavior. The most common deceptive practice is serving different content to search engines than to real readers. This ranges from simply hiding content with style sheets (using margins to position text off the page, for example) to more complex tricks like sniffing the user agent and returning different content to the search crawlers – all of it should be avoided.
Similarly, black hats attempt to manipulate PageRank by creating link farms and other schemes that artificially increase the number of pages linking to their site (comment spamming on blogs, mentioned previously, is another). It was common practice several years ago to create pages that were nothing more than a collection of links to the sites they wanted to promote. They would buy up dozens of domains and put this content on each of them. Many of these domains were typos of the original site’s domain name, in the hope that people would accidentally (or through search links) find the site and click through to the original. Search engines recognize this type of site and now penalize instead of reward it.
A similar practice is to host the same content on different sites, hoping to double the exposure. There were even legitimate reasons for doing this at one point – a domain name changed but the old one was kept as an alias to keep links from breaking, guest bloggers contributed the same article to many different sites, etc. Now it is seen as manipulation and does incur a penalty. (For situations such as the first example of a domain name change, you should set up 301 redirects or at least use canonical meta tags to indicate the preferred address.)
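For the domain-change case, a permanent redirect on the old domain is the cleanest fix. A sketch for Apache with hypothetical hostnames:

```apacheconf
# On the old domain: send every path to the same path on the new domain
RedirectMatch 301 ^/(.*)$ https://www.new-domain.edu/$1
```

Where a server-side redirect isn’t possible, a canonical tag in the page head serves a similar purpose, telling the search engine which address is the preferred one:

```html
<link rel="canonical" href="https://www.new-domain.edu/article/">
```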
While not inherently deceptive, the meta keyword tag has been so abused since the beginning of search that it is no longer even considered in creating page scores. Its primary use now is to include common misspellings and other terms that you want the search engine to see – it is still read, it simply contributes no ranking value – but which for cosmetic reasons you don’t want your users to see.
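That remaining use looks like this (the keywords here are invented for illustration):

```html
<!-- Common misspellings crawlers can read but visitors never see -->
<meta name="keywords" content="accommodation, accomodation, acommodation">
```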
Google is quite clear and quite consistent when it comes to its advice on search optimization. The best way of getting good placement for your page on the returns page is to write good content. That means writing content on a topic that people want to read, and in such a way that it is written for the reader rather than for the search engine. The term optimization itself implies making enhancements to content that already exists, not creating content for the sake of the optimization. Google holds strongly to this principle, believing that these are exactly the types of pages that benefit users. The more beneficial a page is, the more likely people are to link to it or share it on social media. Those links then become the backbone of the site’s PageRank.
This doesn’t mean ignoring SEO altogether. After all, optimization is an active process. Understand how search engines work and present your pages in the best possible light, but in the end understand that you are writing for your readers and not for a search robot.
As developers and designers who work with clients, we have all experienced the client asking for endless revisions. Creative Bloq shares tips on how to handle these types of requests.
- Start with the intention to develop a healthy relationship with your client.
- Educate your client about the real purpose of a revision.
- Clearly define and articulate what is a round of revision.
- Clearly define how many rounds of revisions are included in your fee.
- Clearly define when change requests will be considered extra work and how this will be billed.
- Keep the client informed about each phase of the design process.
- Don’t forget to show your goodwill and flexibility.
- Accept that design is subjective.
- Accept your mistakes.
- Put a stop to it when needed.
- Don’t waste your time with the wrong clients.
For me, building a good relationship with your client is the most important step. Establishing that relationship will help with issues that come up during the project.