Search engines regularly update their algorithms and fight for the "purity" of their search results. So a situation where we see a drop in organic search traffic in our Google Analytics (hereinafter GA) account is not uncommon. How do you identify the possible reasons for a decline in traffic from Google? The first thing to do is make sure the number of visitors has decreased specifically because of Google organic search. If traffic dipped sharply from all search engines, the cause may be technical problems or a seasonal decline. To check this, select comparison periods in GA before and after the traffic drop:
and then go to 'Acquisition' => 'All Traffic' in GA:
Here the "google / organic" traffic dropped by 86%, which suggests that the problem is specific to Google.
Although technical errors usually cause a loss of traffic from all organic sources, there are situations where only Google reacts to some of them. Let's consider the basic technical errors that can lower a site's positions.
Site Blocked in robots.txt
Here is how it might look:

User-agent: Googlebot
Disallow: /

or

User-agent: *
Disallow: /category/

In the first case, the site is completely closed from indexing, but only for Google's robot; in the second, the /category/ section is closed from indexing for all search engines. Sometimes this happens by mistake or negligence. Always check your robots.txt.
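As a quick sanity check, the effect of such rules can be reproduced with Python's standard `urllib.robotparser` module. A minimal sketch, using the two problem rule sets above (the paths are placeholders):

```python
# Check whether a given robots.txt blocks a given crawler from a given path.
from urllib.robotparser import RobotFileParser

def is_allowed(robots_txt: str, user_agent: str, path: str) -> bool:
    """Parse robots.txt text and report whether user_agent may fetch path."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, path)

# The two problem cases from the article:
blocked_google = "User-agent: Googlebot\nDisallow: /"
blocked_section = "User-agent: *\nDisallow: /category/"

print(is_allowed(blocked_google, "Googlebot", "/any-page"))      # False
print(is_allowed(blocked_google, "Yandex", "/any-page"))         # True: only Googlebot is blocked
print(is_allowed(blocked_section, "Googlebot", "/category/tv"))  # False: section closed for everyone
```

Running this against your live /robots.txt (fetched with `urllib.request`) makes an accidental `Disallow: /` easy to catch.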
Duplicate Pages
Some CMSs generate duplicate pages, which happens especially often when filters or on-site search are used. Very often the main page is also available at several addresses, for example www.site.com and site.com, or www.site.com and www.site.com/index.php. In this case, you must configure a 301 redirect.
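One common way to enforce a single canonical address is a server-side 301 rule. A hypothetical Apache .htaccess sketch (www.site.com is a placeholder, and other web servers need their own syntax):

```apache
# Hypothetical .htaccess sketch: 301-redirect the non-www host and
# /index.php to one canonical URL. Replace site.com with your domain.
RewriteEngine On

# site.com/... -> www.site.com/...
RewriteCond %{HTTP_HOST} ^site\.com$ [NC]
RewriteRule ^(.*)$ https://www.site.com/$1 [R=301,L]

# /index.php -> /
RewriteRule ^index\.php$ / [R=301,L]
```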
Broken 301 Redirects
Problems with 301 redirects are rarer, but they do happen. For example, after a change of site structure, some URLs may be overlooked when configuring the redirects, which leads to lost positions and traffic. It can also happen that after a domain change, a properly configured 301 redirect eventually breaks. Take care to keep your 301 redirects correct (the server's response can be checked using a plugin for Firefox).
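The same check can be scripted. Below is a hedged sketch that walks a URL's redirect chain hop by hop and flags common problems; `redirect_chain` and `audit_chain` are illustrative names, not a standard tool, and the URL in the comment is a placeholder:

```python
# Walk a URL's redirect chain manually and flag suspicious hops
# (302 used where a permanent 301 belongs, chains ending in an error).
import http.client
from urllib.parse import urljoin, urlsplit

def redirect_chain(url, max_hops=10):
    """Return [(url, status), ...] for each hop, following Location headers."""
    chain = []
    for _ in range(max_hops):
        parts = urlsplit(url)
        conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                    else http.client.HTTPConnection)
        conn = conn_cls(parts.netloc, timeout=10)
        target = parts.path or "/"
        if parts.query:
            target += "?" + parts.query
        conn.request("HEAD", target)
        resp = conn.getresponse()
        chain.append((url, resp.status))
        location = resp.getheader("Location")
        conn.close()
        if resp.status in (301, 302, 307, 308) and location:
            url = urljoin(url, location)  # follow the redirect manually
        else:
            break
    return chain

def audit_chain(chain):
    """Flag problems in a [(url, status), ...] chain."""
    problems = []
    for url, status in chain[:-1]:
        if status == 302:  # temporary redirect where a permanent one belongs
            problems.append((url, "302 instead of 301"))
    final_url, final_status = chain[-1]
    if final_status >= 400:
        problems.append((final_url, f"ends in {final_status}"))
    return problems

# Example use (requires network):
#   print(audit_chain(redirect_chain("http://site.com/old-page")))
```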
Broken Links
A large number of links leading to pages with a 404 error can affect a site's ranking. You need to get rid of these links. In Google Webmaster Tools (hereinafter GWT) you can see the number of broken links and find out where they are.
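For a quick spot check outside GWT, a single page's outgoing links can be collected and tested with the standard library alone. A minimal sketch, assuming https://www.site.com as a placeholder site:

```python
# Collect <a href> targets from one page's HTML and check which return 404.
import urllib.error
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Gather absolute link targets from <a href="..."> tags."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def find_broken(links):
    """Return the subset of links that answer 404 (or are unreachable)."""
    broken = []
    for link in links:
        try:
            req = urllib.request.Request(link, method="HEAD")
            urllib.request.urlopen(req, timeout=10)
        except urllib.error.HTTPError as err:
            if err.code == 404:
                broken.append(link)
        except urllib.error.URLError:
            broken.append(link)
    return broken

# Example use (requires network):
#   html = urllib.request.urlopen("https://www.site.com").read().decode("utf-8", "replace")
#   collector = LinkCollector("https://www.site.com")
#   collector.feed(html)
#   print(find_broken(collector.links))
```

Looping this over a sitemap gives a rough site-wide picture, though GWT remains the authoritative source for what Google itself has seen.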
Slow Page Load
Google reacts very negatively to slow page load speed (more than 5 seconds). You can check load times in GA:
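A crude server-side timing can also be scripted. Note this sketch measures only how long the server takes to return the page, not the full in-browser load time GA reports; the 5-second threshold mirrors the article, and the URL in the comment is a placeholder:

```python
# Time how long the server takes to return a full page body.
import time
import urllib.request

def response_time(url: str) -> float:
    """Seconds to download the full response body for url."""
    start = time.perf_counter()
    urllib.request.urlopen(url, timeout=30).read()
    return time.perf_counter() - start

def verdict(seconds: float) -> str:
    """Crude classification against the article's 5-second threshold."""
    return "slow" if seconds > 5 else "ok"

# Example use (requires network):
#   s = response_time("https://www.site.com")
#   print(f"{s:.2f}s ({verdict(s)})")
```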
The problem can be on the hosting side or in the site itself; each case must be considered individually. In addition, the site could have been down at the very moment GoogleBot visited, in which case you may get messages like these in GWT:
If the problem is caused by the server, you should think about changing your hosting provider.
In most cases, if a site has internal optimization errors, the drop in traffic is associated with the Panda filter. It is a core Google filter aimed at combating low-quality internal optimization. Recently, sites have also begun to receive messages about manual sanctions for aggressive spam; here is an example for spammed pages: If you get this message, it is enough to clean up the spam on the affected pages.

Detecting the Panda filter today is difficult. Earlier, the filter ran on certain dates, and if a drop in traffic coincided with a rollout date, it meant Panda had hit your site. Today the algorithm is integrated into the main index and runs roughly every month over a 10-day period, but nobody knows exactly when it starts and when it ends. Therefore, we recommend analyzing the on-page optimization errors of the pages where traffic dropped and comparing them with pages where traffic did not change or even grew.

The Site's Positions Can Be Reduced for the Following Internal Optimization Errors:

Non-unique Content
Today, Google gives this factor huge importance. The text on the site must be unique, and if it is not unique, the site must be the original source. If the site is not the primary source, there are two ways out of the situation:
Close Variants
After the Panda 4 update, it was observed that Google lowers a site's positions not only for exact duplicates but also for near-duplicates in meaning. If your site has pages with very similar texts, we recommend using the rel="canonical" attribute pointing to the main page.

Pages in Supplemental Results
If the algorithm decides that pages add no value, it places them in the so-called supplemental results, which are not visible in the main index. Typically, pages with duplicate content end up there. To check how many pages are in the main index, type the following query into Google: site:name.com/&; for the overall index, use site:name.com. Subtracting the main index from the overall index gives the number of pages in the supplemental results.

Duplicate Title and Description
This is a classic mistake that should be avoided, since a correctly filled title and description are among Google's priority ranking factors.

Too Much Advertising
There should not be more advertising than main content, and ad blocks should not interfere with the perception of the basic information. For a large amount of advertising, Google can lower a site's ranking.

Inconvenient Structure
Poor usability, confusing navigation, and an uninformative first screen are also factors that can bring the Panda filter down on you.

You can find duplicate titles and meta tags in GWT:
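Duplicate titles can also be found without GWT if you already have the pages' HTML on hand. A hedged sketch (the URLs in the usage comment are placeholders, and `duplicate_titles` is an illustrative name):

```python
# Group pages by their <title> text and report titles used more than once.
from collections import defaultdict
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Accumulate the text inside the first <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def extract_title(html: str) -> str:
    parser = TitleParser()
    parser.feed(html)
    return parser.title.strip()

def duplicate_titles(pages: dict) -> dict:
    """pages: {url: html}. Return {title: [urls]} for titles used more than once."""
    by_title = defaultdict(list)
    for url, html in pages.items():
        by_title[extract_title(html)].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Example: two category pages sharing one title would be reported together:
#   duplicate_titles({"/tv": html_a, "/audio": html_b, "/blog": html_c})
```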
Keyword Stuffing
Oversaturating the text, the title, and the keywords meta tag with keywords is a direct path to the filter. In addition, frequent highlighting with the strong tag can also count toward sanctions.

If during site analysis you find these errors, the Panda filter has most likely been applied to the site. Correct the errors, and after some time Google should return the traffic.
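A quick way to eyeball stuffing is to compute what share of a page's words a single keyword occupies. A hedged sketch; where to draw the line is the reader's judgment call, not a figure from Google:

```python
# Fraction of a text's words taken up by one keyword (case-insensitive).
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return keyword occurrences divided by total word count."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

print(keyword_density("Buy phones. Cheap phones. Best phones here.", "phones"))  # 3 of 7 words
```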
Today, Google aggressively fights unnatural link profiles, and loss of traffic due to link-profile sanctions is very typical. There are two types of sanctions: automatic (Penguin) and manual. Automatic sanctions are imposed during updates of the Penguin algorithm; if an update date coincides with a drop in traffic, we can say with high probability that the filter has been applied to the site. The dates of all the updates can be looked up online. The reasons for sanctions are the same in both cases:

- purchased links;
- keyword anchors prevailing in the anchor list;
- sharp jumps in link mass (both growth and decline);
- sitewide links from the site template (especially the footer);
- low-quality donor sites with spammed links and low traffic;
- off-topic donor sites.

To remove manual sanctions, it is enough to clean up the link profile and submit the site for reconsideration. If the links no longer violate Google's guidelines, the sanctions will be lifted. The automatic filter can be removed only after the next update of the algorithm.

A drop can also be associated with changes in algorithms or increased competition, so follow industry news and monitor the search results.

We have now examined the possible causes of traffic drops. To sum up: if your site's traffic has dipped, first check it for manual actions and Penguin, i.e., the obvious possible causes. If there are neither manual actions nor Penguin on the site, proceed to analyzing technical errors and then to reviewing the site's internal optimization. Always analyze the site comprehensively. There may be several causes at once, but a full analysis will lead you to success.
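When cleaning up a link profile before a reconsideration request, links you cannot get removed can be listed in a disavow file for Google's Disavow Links tool. A hypothetical example (all domains and the URL are placeholders):

```text
# Hypothetical disavow.txt for Google's Disavow Links tool.
# Lines starting with "#" are comments; "domain:" disavows every link
# from that site, while a bare URL disavows just that one page.
domain:spammy-directory.example
domain:link-farm.example
http://blog.example/low-quality-post.html
```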