An SEO audit is a demanding task. It requires a thorough examination of a site's content, structure, and technical characteristics, as well as the external factors that affect it. You then need to identify weaknesses and plan and implement the necessary improvements, taking into account the principles of search engines. Analysis and follow-up optimization should be repeated regularly, for example every six months, because search algorithms are constantly being refined and many SEO tactics eventually stop working.
Preparing for SEO Audit
We recommend starting by crawling the site with any web crawler. This lets you analyze the code, content, internal and external links, images, and other elements of the site from an SEO perspective and gives a general picture of the situation. It also makes sense to use the search engines' own webmaster services, which provide a wealth of information. After this preliminary check, you can proceed to a deeper and more comprehensive SEO audit.
Internal SEO Audit
The robots.txt file is placed in the root directory of the site and gives search engine robots instructions for indexing it.
Using the various robots.txt directives, you can:
- prohibit or allow robots to index specific sections and pages of the site, or the entire site;
- specify the path to the sitemap.xml file, which promotes proper indexing;
- reduce the load that search engine spiders place on the site when you need to conserve crawling budget.
You can also create different rules for individual search engines and even for different robots of the same engine. Use every capability of robots.txt: make sure that indexing is forbidden for “secret” areas of the site and for low-quality duplicate pages, and verify that search engines can access all areas of the site that should be indexed.
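As an illustration, a minimal robots.txt might look like this (the paths and domain are hypothetical):

```
# Rules for all crawlers
User-agent: *
# Keep private and low-quality sections out of the index
Disallow: /admin/
Disallow: /search/
# Everything else may be crawled
Allow: /

# Point crawlers to the sitemap
Sitemap: https://example.com/sitemap.xml
```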
Instructions for Search Engines in Meta-Tags
The robots.txt file is very useful for setting rules of conduct for search engines on your website. For even more flexible control over the indexing of the site and its individual pages, however, you can place directives for search engines in meta-tags. This way you can allow or forbid robots to index specific pages and to follow the links placed on them.
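For example, a page can be excluded from the index with a robots meta-tag in its head section (a sketch; the right combination of values depends on what you want to allow):

```html
<!-- Do not index this page and do not follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- Or: index the page, but do not follow its links -->
<meta name="robots" content="index, nofollow">
```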
The Sitemap file is added to the root directory of the site and gives search engines the following information:
- what pages of your site should be indexed;
- which of them have the highest priority;
- how often they are updated.
Robots are not always able to find all pages on their own or to correctly rank the importance of each page relative to the others. You should give them clear guidance to ensure complete indexing, especially if you are optimizing an online store with a large number of products or another large site.
If your site already has a Sitemap file (preferably in XML format), check the correctness of its code with a validator. Make sure that the Sitemap file contains no more than 50,000 URLs and weighs no more than 10 MB. If these limits are exceeded, you will have to create multiple Sitemap files plus one Sitemap Index file that lists them all. If the site has no sitemap yet, create one manually or with one of the many available tools, for example XML Sitemap and its analogs, or plugins for WordPress and other popular engines.
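A minimal sitemap.xml with a single (hypothetical) URL looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/catalog/red-shoes/</loc>
    <lastmod>2024-05-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```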
After you create the sitemap, check it with a validator, tell the search engines about its existence through their webmaster services, and add the path to the Sitemap file to robots.txt.
Assessment of the Quality of Site Indexing
The ‘site:’ operator is usually typed into a search engine's query line to restrict a search to a specific domain. However, this operator also lets you find the approximate number of pages that have been indexed: just enter ‘site:’ followed by the domain name, without any search words.
Compare the number of indexed pages with the total number of pages that you learned while creating sitemap.xml and crawling the site. If the two numbers are almost the same, the site is indexed correctly. If not all pages are indexed, find out why: perhaps the site is not optimized for search engines, has many sections and pages closed to indexing, or has fallen under sanctions, in which case a guide to identifying and recovering from sanctions will help you. If the number of indexed pages greatly exceeds the actual number, your site probably has a lot of duplicate content, which is discussed later in this article.
HTTP Status Codes
The HTTP status code is contained in the first line of the server's response when a web page is requested and shows its current state. Your task is to find the URLs that return error codes, usually of type 4xx or 5xx. For example, the widely known 404 means the page was not found, 500 signals an internal server error, 503 means the service is temporarily unavailable, and 200 says that everything works fine.
To check the HTTP status codes, you can use different services.
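As a sketch, status codes can also be collected with a short Python script (the helper names are my own, and only standard-library calls are used):

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def classify_status(code: int) -> str:
    """Map an HTTP status code to a rough health category."""
    if 200 <= code < 300:
        return "ok"
    if 300 <= code < 400:
        return "redirect"
    if 400 <= code < 500:
        return "client error"   # e.g. 404 Not Found
    if code >= 500:
        return "server error"   # e.g. 500, 503
    return "no response"        # network failure, no HTTP status

def check_url(url: str) -> int:
    """Return the HTTP status code for a URL, or 0 on network failure."""
    try:
        with urlopen(url, timeout=10) as resp:
            return resp.status
    except HTTPError as err:    # 4xx/5xx responses still carry a code
        return err.code
    except URLError:
        return 0
```

Running `check_url` over every address from your sitemap and reporting each URL whose category is not "ok" gives a quick list of broken or misbehaving pages.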
URL of Pages
A good page URL is no more than 100-120 characters long, consists mainly of easily readable words (rather than a meaningless set of letters and numbers), and contains keywords that describe the page. All this contributes not only to better search indexing but also to greater convenience for visitors.
It is also important to follow other recommendations of SEO specialists:
- try to avoid complex parameters in addresses and prefer static links;
- use the site's directory structure for partitioning, not subdomains;
- separate individual words in the URL with hyphens, not underscores.
Let us dwell on the choice between links with parameters and human-readable static URLs. Search engines can index content with dynamic URLs, but with static URLs the indexing process is simpler and faster. In addition, you can embed keywords in a static URL and show visitors easy-to-read, easy-to-remember links that give an idea of the page's content. Such links usually have a higher CTR. Incidentally, the conversion of dynamic URLs to static ones can be configured on the server side.
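As a small illustration (the helper name and the exact rules are my own), a page title can be turned into a readable, hyphen-separated URL slug like this:

```python
import re
import unicodedata

def slugify(title: str, max_len: int = 80) -> str:
    """Turn a page title into a short, hyphen-separated URL slug."""
    # Normalize accented characters, then drop anything non-ASCII
    text = unicodedata.normalize("NFKD", title)
    text = text.encode("ascii", "ignore").decode("ascii")
    # Lowercase and replace every run of non-alphanumerics with a hyphen
    text = re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")
    # Keep URLs short, as recommended above
    return text[:max_len].rstrip("-")

print(slugify("10 Best Running Shoes for Winter!"))
# → 10-best-running-shoes-for-winter
```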
Internet users are impatient and leave slow sites. Search engines, similarly, have a time limit for processing each site, so fast sites are indexed more thoroughly and in a shorter period.
Analyze page load speed regularly; built-in web analytics tools are well suited to this.
Audit of the Site Structure
The site should have a clear and logical page structure, ordered by category and closely interconnected by internal links (internal linking is discussed separately below).
Avoid a large number of nesting levels: all important pages should be one click from the main page, and other pages no more than 3-4 clicks away. For a large site, a depth of no more than five clicks is acceptable.
Such a flat site architecture allows search engines to index all pages faster (especially the most important ones). It also helps visitors avoid getting lost and find information quickly, which ultimately has a positive effect on SEO as well.
On deep sites, where visitors frequently need to return from the current page to the general section it belongs to, we recommend adding navigation in the form of “breadcrumbs”. This is especially important for online retailers with a complex directory structure.
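A breadcrumb trail is just a chain of links back up the hierarchy; a minimal sketch for a hypothetical product page might be:

```html
<!-- Breadcrumb navigation for a product page (names and paths are made up) -->
<nav class="breadcrumbs">
  <a href="/">Home</a> &raquo;
  <a href="/catalog/">Catalog</a> &raquo;
  <a href="/catalog/shoes/">Shoes</a> &raquo;
  <span>Red Running Shoes</span>
</nav>
```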
Internal linking contributes to better indexing of the site and a sensible distribution of page weight.
Try to place many links between the pages of the site, while observing a few simple requirements:
- as anchors, use not only keywords but also neutral text, for example calls to action such as “read”, “download”, etc. This makes the link profile look more natural to search engines, while an abundance of keyword anchors looks suspicious;
- the pages and the keywords in the anchor text should be relevant to the content of the landing pages;
- send more links to the pages that should occupy more prominent positions;
- link to such pages from the main page;
- do not place too many internal links on a single page.
Page titles are the first thing people see in search results and on social networks, and the basis on which they decide whether to visit your site. It is therefore important to pay special attention to optimizing titles.
Keep headlines brief. Try not to exceed 70-90 characters, or the title may be cut off in search results and on social networks, and Twitter users will not be able to add a comment.
Headlines of service and informational pages (apart from articles and similar content products) must accurately describe their contents.
When writing articles, also give preference to headlines that accurately convey their subject. Sometimes, for creative purposes, you can ignore this advice. In most situations, however, it makes no sense to sacrifice, for the sake of extravagance, your search engine optimization, the ability of potential readers to quickly scan headlines on Twitter or in an RSS aggregator, and some traffic. Many people are too lazy to check what lies behind an uninformative title.
Do not forget to add keywords to page titles, preferably near the beginning. However, do not cross the bounds of reason: as with all page content, write titles for people, not robots. Also make sure that every page on the site has a unique title.
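As a sketch of such a check (the function and the 70-character limit are my own, based on the guideline above), titles collected by a crawler can be audited for length and uniqueness:

```python
def audit_titles(titles: dict[str, str], max_len: int = 70) -> list[str]:
    """Report title problems; `titles` maps page URL -> <title> text."""
    problems = []
    seen: dict[str, str] = {}  # normalized title -> first URL using it
    for url, title in titles.items():
        if len(title) > max_len:
            problems.append(f"{url}: title longer than {max_len} chars")
        key = title.strip().lower()
        if key in seen:
            problems.append(f"{url}: duplicate title (same as {seen[key]})")
        else:
            seen[key] = url
    return problems
```

Feeding it the URL-to-title mapping produced by your crawler yields a flat list of pages that need new titles.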
Page Description in Meta-Tags
The meta description of a page may be included in the snippet shown in search results, so you should take a responsible approach to managing the meta descriptions of important pages. This tag apparently does not affect page ranking, so adding keywords to it is not strictly necessary.
Create a description of a few sentences with a total length of 150-160 characters. It should be coherent text that describes the particular page, not the entire site, and is not overloaded with keywords. Keep the description up to date: if the information on the page is refreshed and the description becomes outdated, make the necessary changes.
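A description meeting these requirements might look like this (the content is invented for illustration):

```html
<!-- Concise, page-specific, within ~160 characters, not keyword-stuffed -->
<meta name="description"
      content="Compare 12 lightweight running shoes for winter training:
               grip, insulation, and prices from $60 to $180.">
```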
Forget About Keywords in Meta-Tags
Most search engines have ignored the keywords meta-tag for a long time, so it makes sense not to add this information to the site's code at all, to avoid giving competitors extra data about your SEO strategy.
Always remember the truth that has long since become commonplace: content is created for people, not for search engines. Do not get too carried away with search engine optimization, as it makes text almost unusable for comfortable reading and ultimately has a negative impact on SEO results.
Make sure your web pages contain unique content that is valuable for the audience, not over-optimized text or copies from other sites. The amount of content on each page should also exceed 300-400 words. There is evidence that, other things being equal, pages with 2,000 words or more usually rank higher in search results.
Check the keywords in page texts, especially in the first few paragraphs, and edit the text so that keyword use does not produce meaningless phrases written with the sole purpose of mentioning the keyword once again. Attractive, harmonious, useful text in which the keywords are invisible to the reader is what you should strive for. Stuffing content and meta-tags with keywords leads to sanctions from search engines.
The text on the site should not contain grammatical errors. Illiteracy usually indicates an unprofessional author, and such content may occupy lower positions in search results.
If your site contains content that duplicates other materials, search engines face a difficult task: determining which version to index and display in search results. The problem is exacerbated as other sites link to different copies of the same page, each opening at a different URL.
Here are some causes of duplicate content:
- a CMS can make the same page available at different links;
- the second part of the URL on many sites is generated dynamically and contains additional parameters that vary depending on various factors;
- site content is often stolen and placed on other sites without a backlink, so the search engine cannot associate it with the original;
- visiting a site can create a unique session identifier that is used in dynamic URLs (needed, for example, for temporary storage of goods added to the cart before an order is placed);
- page versions optimized for printing can be regarded as duplicates.
The problem of duplicate content can be solved by simply removing the duplicates, creating 301 redirects, forbidding the indexing of duplicates in robots.txt or in the meta-tags of individual pages, using the rel=“canonical” directive, and other methods.
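The canonical approach is a one-line tag placed in the head of every duplicate, pointing to the preferred URL (the address here is hypothetical):

```html
<!-- Tell search engines which URL is the original version of this content -->
<link rel="canonical" href="https://example.com/catalog/red-shoes/">
```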
Organize your text publications. Use headings (the h1 tag is the most important), several levels of subheadings, and highlight passages in bold or italics. Thoughtful text formatting greatly eases the reader's perception, and headings with significant words also have a positive impact on SEO.
Observe moderation here too: too many headings and bold keywords repel readers no less than monotonous, unstructured text.
The best way to structure content on the site is to think primarily about people and make only a light “bow” towards the search engines.
Analyze all important images on your site that are thematically related to the surrounding text. For full indexing, the names of image files, as well as the value of the ‘alt’ attribute of the img tag, should contain keywords describing the image and, desirably, keywords of the page. At the same time, make the file name and the ‘alt’ value different from each other.
Words in file names should be separated by hyphens, not underscores. For the description, use the ‘alt’ attribute rather than ‘title’, and keep it brief (no longer than 150 characters), with the most important words at the beginning.
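Putting these rules together, an image tag might look like this (the file name and text are invented):

```html
<!-- Hyphenated, descriptive file name; alt text uses different wording -->
<img src="/images/red-running-shoes.jpg"
     alt="Red lightweight running shoes for winter training">
```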
It makes no sense to add SEO-oriented descriptions and file names to images that are not associated with textual content, e.g. graphic design elements.
Do not forget to ensure fast loading of all images posted on the website:
- remove all unnecessary metadata;
- do not set image dimensions above the minimum required;
- optimize them in a graphics editor with built-in tools for preparing images for the web.
As mentioned above, slow page loading harms SEO.
External SEO Audit
Site owners often pay attention only to obtaining trusted backlinks. However, you also need to take care of the outgoing links from your site to other resources.
Avoid links to poor-quality sites and sites with a bad reputation, as they negatively affect your position in search results. If placing such a link is unavoidable, add the ‘rel’ attribute with the value ‘nofollow’ to the link tag to forbid search engines from following it.
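In markup this is a single attribute on the anchor (the target address is made up):

```html
<!-- Keep link weight from flowing to an untrusted external site -->
<a href="https://untrusted.example/" rel="nofollow">source</a>
```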
Link to external content thematically related to your page, and try to include in the link text words that describe the landing page.
Regularly check for broken links that lead to 4xx or 5xx errors.
Link-based ranking still works, and the number and quality of incoming links must be taken into account, even though current trends suggest their value for website promotion will decline in the future.
Find out how many sites link to yours, whether they are thematically related to your site, and whether they are trusted.
If you are referenced by many high-quality websites in the same industry, you gain high authority. If those sites have a bad reputation and contain low-quality content, search engines begin to regard your site the same way.
Try to acquire backlinks whose text contains your keywords. Here, too, moderation is necessary: a large number of incoming links from other sites with the same or similar anchor text can lower your site's position in searches for the keywords in that text. The total mass of backlinks to your site should look natural to search engines: let some links have keyword anchors, and others calls to action like “click here” or “read”, or other neutral text.
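One way to sanity-check this is to measure how much a single anchor text dominates your backlink profile; this sketch (the function names and the 50% threshold are my own guesses, not an industry standard) flags suspicious concentration:

```python
from collections import Counter

def anchor_profile(anchors: list[str]) -> dict[str, float]:
    """Return the share of each distinct anchor text in a backlink list."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {text: n / total for text, n in counts.items()}

def looks_unnatural(anchors: list[str], threshold: float = 0.5) -> bool:
    """Flag profiles where one anchor text exceeds the given share."""
    return max(anchor_profile(anchors).values()) > threshold
```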
Think over a strategy of cooperation with trusted, thematically similar sites, and organize the removal of backlinks from low-quality donor sites.
SEO Analysis of the Competitors
An important part of the audit is studying the strategies and mistakes of your competitors in the field of SEO. We encourage you to read a quick-start guide to SEO analysis of competitors.
Protection Against Negative SEO
Negative SEO is a set of black-hat, unethical methods and techniques used by competitors to lower your site's position in search results or even get it banned by search engines. Here are some of the ways these goals are pursued:
- hacking and infecting the site;
- spreading thousands of spam links to your site through comments;
- creating a clone of your site and copying its content without your permission;
- redirecting links to your site with keywords like “Viagra”, “online poker”, “porn”, etc.;
- creating fake profiles on social networks that spread negativity about your company and site;
- removing trusted links from other sites to your site.
The safety and security of your site are in your hands. To prevent negative SEO from bringing down the site's position in search results, use a series of simple but effective techniques:
- Set up e-mail alerts
- Watch over your backlink profile
- Protect your site from viruses and hacking
- Check your content for duplicates
- Watch for mentions of the company on social networks
- Regularly monitor the availability of the site
- Do not use negative methods of search engine promotion yourself
- Do not make inveterate enemies.
In conclusion, an SEO audit not only makes the site friendlier to search engines but also increases its convenience and value for visitors. Performed competently and regularly (at least twice a year), an audit increases both search and other traffic. After all, smart and honest content optimization improves usability, and people love good sites and come back to them again and again.
Here are the best SEO tools to help you with an SEO audit: