How to Conduct a Technical Site Audit and Boost Your SEO Efforts

There are three main aspects of a technical site review: architecture, technology and source code. When all three are optimized for search, the content on your site can stand out from your competition. This post discusses how to do a technical site review; we will look at how to do a content review in our next post.

With the burgeoning importance of Internet marketing, the Marketing Department and IT find themselves increasingly thrown together to achieve their goals. This is particularly so in search engine optimization (SEO), where Marketing needs to understand some of the boiler-room workings of websites to optimize them for search engine discovery. Merely understanding how to use keywords isn't enough anymore. Performing a technical site review requires the participation of both Marketing and IT in a true partnership. Marketers need to understand the elements and tools used to make websites sing, and what can make a site hard to crawl and cost it search engine traffic.

The review should examine the site as others see it, keeping two key audiences in mind: search engines and directories ("spiders") and people (prospects, customers, vendors, partners, shareholders, etc.). The purpose is to uncover anything that might block search engines, cause Google to lower its evaluation of the site's value, or prevent site users from doing what you want them to do on your site.

Step 1: Crawl the Site to Check for Architectural Issues

The site needs a consistent layout and structure that emphasizes the content where you want visitors to focus. Where are the logical “buckets” of content? Sometimes it appears logical to have content in more than one place on the site. For example, if you are selling motorcycles and motorcycle accessories, it may make sense to you to have the information about Harley-Davidson T-shirts in the Harley-Davidson section of your site, and to repeat this information in your apparel section. However, Google doesn’t want to see the same content repeated in different areas of your site and will ding you for it.
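
If you already know which URLs might overlap, a quick scripted check can flag pages whose visible text is identical. Here is a minimal sketch in Python (the URLs are hypothetical placeholders) that hashes each page's text and reports matches:

```python
# Minimal duplicate-content check: hash each page's visible text and
# report any pages that share a hash. URLs below are hypothetical.
import hashlib

import requests
from bs4 import BeautifulSoup

PAGES = [
    "https://www.example.com/harley-davidson/t-shirts/",
    "https://www.example.com/apparel/t-shirts/",
]

hashes = {}
for url in PAGES:
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(strip=True)
    digest = hashlib.sha256(text.encode()).hexdigest()
    if digest in hashes:
        print(f"Duplicate content: {url} matches {hashes[digest]}")
    hashes[digest] = url
```

An exact hash only catches word-for-word duplicates; near-duplicates still need a crawler or a manual read, but this is a cheap first pass.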

Once the content is organized, you need to look at directory names. Every directory name can carry keywords, such as Motorcycle Helmets > Full-Face Helmets > Bell Helmets, which maps naturally to a URL path like /motorcycle-helmets/full-face-helmets/bell-helmets/. Don't let these descriptors get too long; you need them to communicate crisply. Longer descriptors may confuse or mislead, and usability is the guiding principle. The longer people spend on your site, the greater the value assigned by Google.

There are tools you can use to crawl the site and look at it the way Google sees it. One excellent tool is Screaming Frog, which has free downloadable versions for Windows, Mac OS and Ubuntu. The Screaming Frog SEO Spider is a small desktop program you can install on your PC or Mac that crawls a website's links, images, CSS, scripts and apps from an SEO perspective. It will tell you where it has crawled and the pages it has found, and reports 404s (missing pages), redirects, broken links, duplicate pages, and more. Moz.com is another great tool. These programs will give you a good top-level understanding of architectural issues, including how pages relate to one another (or don't).
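
If you are curious what these crawlers are doing under the hood, here is a minimal Python sketch of the idea, assuming the hypothetical site root shown. It follows same-site links and reports 404s and redirects; it is no substitute for a full tool like Screaming Frog:

```python
# Minimal same-site crawler: follows internal links, reports 404s
# (missing pages) and redirects. START_URL is a hypothetical placeholder.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"
DOMAIN = urlparse(START_URL).netloc

seen, queue = set(), [START_URL]
while queue:
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    response = requests.get(url, allow_redirects=False, timeout=10)
    if response.status_code == 404:
        print(f"404 (missing page): {url}")
        continue
    if 300 <= response.status_code < 400:
        print(f"Redirect {response.status_code}: {url} -> "
              f"{response.headers.get('Location')}")
        continue
    # Only parse HTML pages for further links.
    if "text/html" not in response.headers.get("Content-Type", ""):
        continue
    soup = BeautifulSoup(response.text, "html.parser")
    for anchor in soup.find_all("a", href=True):
        link = urljoin(url, anchor["href"]).split("#")[0]
        if urlparse(link).netloc == DOMAIN:
            queue.append(link)

print(f"Checked {len(seen)} URLs")
```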

Step 2: Review Your Technology (CMS, Flash, etc.)

There are hundreds of content management systems available, from WordPress to Joomla!. Most of these systems incorporate SEO best practices, but each ships with a default set of built-in options that can help or hurt SEO, so these need to be carefully examined before implementing on your site. In particular, you want to be able to view and modify page titles and meta descriptions. For example, the generic installation of WordPress, one of the most popular content management and publishing systems, does not allow this, but a plug-in from Yoast lets you give WordPress pages unique titles and meta descriptions.

Title tags are good drivers of SEO rank if they are composed well. They get picked up immediately by crawlers and help the search engine rank your page in comparison to others. In addition, a well-written meta description that is concise and contains a benefits statement will help increase the click-through rate when your site comes up in the search results. A standard but not very useful meta description might say, "Full-face motorcycle helmet," while an effective one might say, "Full-face motorcycle helmet meets highest safety standard." Do research around your content and make sure the metadata relates to it closely.
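
You can spot-check titles and meta descriptions across a handful of pages with a short script. This Python sketch flags missing, over-long and duplicate tags; the URLs are hypothetical, and the 60- and 160-character thresholds are common guidelines rather than hard rules:

```python
# Audit <title> and meta description tags across a list of pages:
# flag missing tags, likely-truncated text, and duplicate titles.
import requests
from bs4 import BeautifulSoup

PAGES = [  # hypothetical URLs; substitute your own
    "https://www.example.com/",
    "https://www.example.com/helmets/",
]

seen_titles = {}
for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = (soup.title.string or "").strip() if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = (meta.get("content") or "").strip() if meta else ""

    if not title:
        print(f"{url}: missing <title>")
    elif len(title) > 60:  # rough guideline for what fits in results pages
        print(f"{url}: title may be truncated ({len(title)} chars)")
    if title and title in seen_titles:
        print(f"{url}: duplicate title, also used on {seen_titles[title]}")
    seen_titles[title] = url

    if not description:
        print(f"{url}: missing meta description")
    elif len(description) > 160:  # rough guideline for snippet length
        print(f"{url}: meta description may be truncated "
              f"({len(description)} chars)")
```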

Some of the technologies used in building websites can prevent the site from being effectively crawled. Flash technology for movies is fine, but sites built entirely on Flash cannot be crawled thoroughly. Spiders can read only the links, and search results are poor as a consequence. Also, a single Flash URL can have three or four content pages associated with it; be aware of this and associate one page of content with one URL. A best practice is not to rely on Flash to build your entire site but to use it on a page-by-page basis to enhance sections that will benefit from its interactivity. Additionally, many of Flash's features can be offered in a much friendlier HTML5 format that doesn't raise barriers to search engines the way Flash does.

AJAX (asynchronous JavaScript and XML) is a group of interrelated web development techniques used on the client side to create asynchronous web applications. These techniques allow great interactivity, such as mousing over a photo for more detail, and let information load in the background. This is great for the user experience, but unless you code the page correctly, it can confuse search engines and block them from fully indexing the content. Search-engine-friendly AJAX is well understood and can be implemented by most competent web developers, but it often must be listed as a specific requirement when building a site.
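
A simple smoke test for an AJAX-heavy page is to fetch the raw HTML without executing any JavaScript, the way a basic crawler would, and confirm your key content is actually there. A minimal sketch, with a hypothetical URL and phrases:

```python
# Fetch raw HTML (no JavaScript execution) and check whether key content
# phrases appear. If they are missing, a crawler likely can't see them.
import requests

URL = "https://www.example.com/helmets/bell-full-face/"  # hypothetical
KEY_PHRASES = ["full-face motorcycle helmet", "Bell"]    # hypothetical

html = requests.get(URL, timeout=10).text.lower()
for phrase in KEY_PHRASES:
    status = "found" if phrase.lower() in html else "MISSING from raw HTML"
    print(f"{phrase!r}: {status}")
```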

The lesson here is to use these technologies as an adjunct to your main content, and only where they serve a useful purpose, such as improving usability.

Step 3: Review Source Code

The average site has a code-to-content ratio of 80:20. This means that only one in five "words" on each page is user-friendly content. The rest is "spider junk food," and it doesn't help SEO. Offload code segments into external cascading style sheet (CSS) and JavaScript files. Aim for a code-to-content ratio of 55:45 or better.
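
You can approximate a page's code-to-content ratio by comparing the size of the visible text to the size of the full HTML source. A rough Python sketch, assuming a hypothetical URL:

```python
# Estimate a page's code-to-content ratio: visible text size vs. full
# HTML source size. URL is a hypothetical placeholder.
import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/"
html = requests.get(URL, timeout=10).text
text = BeautifulSoup(html, "html.parser").get_text(separator=" ", strip=True)

content_share = len(text) / len(html) * 100
print(f"HTML: {len(html):,} bytes, visible text: {len(text):,} bytes")
print(f"Content share: {content_share:.0f}% "
      f"(code: {100 - content_share:.0f}%)")
```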

Incorporate key-phrase-rich names for files and directories that mirror the content of the pages. This will improve your rankings by increasing relevance. (Remember, anything that makes Google's search results more accurate will help your ranking.) Keywords and key phrases should be used throughout the copy, including headlines and subheads, and repeated about five times per page (taking care not to sacrifice the quality of your content). Make sure they are used in page titles, file titles, link text, ALT tags and metadata within media files. (Content creation programs have a means of embedding keywords and phrases within the files they produce.) This will ensure that when users perform a Google search, the right words and phrases will be included in the brief description search engines provide for each site.
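
To verify a key phrase actually appears where it should, a short script can check the title, headings, body copy and ALT text of a page. A sketch with hypothetical values:

```python
# Check where a target key phrase appears on a page and how often.
# URL and PHRASE are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/helmets/"
PHRASE = "motorcycle helmet"

soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")
title = (soup.title.string or "") if soup.title else ""
headings = " ".join(h.get_text(" ", strip=True)
                    for h in soup.find_all(["h1", "h2", "h3"]))
alts = " ".join(img.get("alt") or "" for img in soup.find_all("img"))
body = soup.get_text(" ", strip=True)

print(f"Occurrences in body copy: {body.lower().count(PHRASE)}")
print(f"In <title>: {PHRASE in title.lower()}")
print(f"In headings: {PHRASE in headings.lower()}")
print(f"In ALT text: {PHRASE in alts.lower()}")
```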

Be aware that search engines—Google in particular—take into account where your site is hosted. If your site is hosted on a server where your virtual neighbors are undertaking dubious activities like phishing, living in a bad neighborhood will taint your site ranking. Your hosting server should be above reproach.

Search engines also take into account how fast your site loads and how quickly it can be browsed. Consider whether your hosting server is well located and configured for your audience. For example, if your server is in Connecticut and you are selling surfboards to Californians, images and media files may take longer to load for those users. One way to test this is to use Google's PageSpeed Service: enter any URL and Google will give you a ranking for it and offer suggestions to make it load faster.
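
For a very rough first check before running PageSpeed, you can time the server's response from your own machine. Note that this measures server response time only, not full page rendering, so treat it as a sanity check rather than a benchmark:

```python
# Rough server response-time check. URL is a hypothetical placeholder;
# this does not measure rendering, scripts or image loading.
import requests

URL = "https://www.example.com/"
response = requests.get(URL, timeout=30)
print(f"Server responded in {response.elapsed.total_seconds():.2f}s "
      f"({len(response.content):,} bytes)")
```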

Make sure all images are optimized and compressed. Have scripts load content first and interactivity second. If you have audio or video files, host them remotely so the site loads faster without them; as large files, they can seriously slow down a site. Use a content delivery network to serve these files quickly.
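
Batch image optimization can be scripted. Here is a minimal sketch using the Pillow library, assuming a hypothetical images/ directory; re-saving JPEGs with a quality setting and the optimize flag typically shrinks them noticeably:

```python
# Re-save JPEGs in a hypothetical images/ directory with compression.
# Requires Pillow (pip install Pillow).
from pathlib import Path

from PIL import Image

for path in Path("images").glob("*.jpg"):
    img = Image.open(path)
    out = path.with_name(path.stem + "-optimized.jpg")
    img.save(out, "JPEG", quality=80, optimize=True)
    print(f"{path.name}: {path.stat().st_size:,} -> "
          f"{out.stat().st_size:,} bytes")
```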

Most websites tend to grow organically. Things are added over time, website managers come and go, and sometimes the "rules" you started with get lost in the shuffle. I recommend doing a technical SEO site review at least once a year, or whenever you are undertaking major revisions to your site's structure or content. This will ensure that broken links, 404s, misdirected links, orphan pages, old content and other detritus get cleaned up regularly. It also ensures that, like your car, your site gets a regular tune-up to continue to perform at optimum speed.

Ensure your website gets found in the search engines.

Watch the tutorial—Incorporating Good SEO into Web Design—and learn the common design elements that can hinder SEO, and how to work around these challenges to balance design and SEO. Access it now FREE with a trial to the Online Marketing Institute. Activate trial now.