Panda

Google rolls out algorithm updates almost monthly (and those are just the officially confirmed ones!). Not all of them have a significant impact on search results, so in this article we will help you figure out which algorithms have made significant changes to ranking in recent years.

Perhaps one of them has affected, or will affect, your site's rankings and traffic. After reading this article, you will understand which aspects of content, on-page, and off-page optimization deserve close attention, and how to adapt your site to the new requirements for successful ranking on Google.

The Google Panda algorithm detects pages with non-unique, keyword-stuffed, spammy, or automatically generated content. It can also affect sites that duplicate the same information across many of their own pages, as well as sites with thin content. Such pages or sites are generally downgraded in Google's rankings.

Initially, Panda was not part of the core algorithm but worked as a Google filter: it affected a certain share of sites with each update. In January 2016, however, Panda was officially incorporated into the core ranking algorithm. This does not mean the algorithm now works literally in real time; there are still updates that affect a portion of the search results. They have simply become so frequent that Google no longer announces them.

On the one hand, this increases the chance that sites in the risk zone will be demoted by Panda faster. On the other hand, it allows site owners previously affected by the algorithm to recover their rankings and traffic much sooner.

What does the Panda algorithm penalize?

  • Non-unique content (plagiarism)
  • Duplicate content across different pages of the same site
  • Automatically generated content
  • Keyword-stuffed content
  • Spammy user-generated content (for example, in comments)
  • Thin content (for example, too little text relative to ad units on the page)
  • Bad user experience
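To make the keyword-stuffing point above concrete, here is a minimal sketch of how one might measure keyword density on a page. The function name and the idea of using a simple word-share metric are our own illustration, not anything Google publishes; real ranking signals are far more sophisticated.

```python
import re


def keyword_density(text: str, keyword: str) -> float:
    """Share of words in `text` that are exactly `keyword` (case-insensitive).

    A crude proxy for keyword stuffing: a very high share of one phrase
    repeated throughout a page is the kind of pattern Panda targets.
    """
    words = re.findall(r"[a-zA-Z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)
```

For example, `keyword_density("buy cheap shoes buy shoes online buy shoes now", "shoes")` returns `1/3`: one word in three is the same keyword, which would read as spammy to a human just as it does to this metric.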

How to protect your site?

  • Check the site for uniqueness of content
    Even if you write all of your site's content yourself, do not neglect to check it for uniqueness periodically, as we described in detail in this article: How to check the uniqueness of content – an overview of services.
    Online stores should take care to make the information in their product cards at least partially unique. You may not be able to rewrite a product's technical specification, but taking unique photos and videos of the product and encouraging users to leave genuine reviews is entirely feasible.
  • Check the site for duplicate content
    This is one of the most common reasons for downgrades. Due to the peculiarities of different CMSs, the same content may be accessible to search engines at several different URLs, which leads to duplicate pages in the index. Unfortunately, the site owner may not even suspect such pitfalls exist in the structure of the resource. Specialized software such as Screaming Frog SEO Spider will help you find these pages: crawl your site and pay attention to pages with duplicate title meta tags and H1 tags, as they are likely duplicate-content candidates. Take measures to eliminate them or hide them from search engine robots using the rel=canonical tag, a 301 redirect, or the noindex meta tag.
  • Check the ratio of content on the page to outgoing links
    Evaluate your pages in terms of outbound links. Your project may have pages with a large number of outgoing links. In that case, it is very important to add more unique content to those pages to avoid drawing Panda's attention.
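The duplicate-detection step above (grouping pages by identical title and H1) can be sketched in a few lines with Python's standard-library HTML parser. The class and function names here are our own, and this is a simplification of what a crawler like Screaming Frog does; it assumes you already have the HTML of each page in hand.

```python
from collections import defaultdict
from html.parser import HTMLParser


class TitleH1Extractor(HTMLParser):
    """Collects the <title> and <h1> text from an HTML document."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1 = ""
        self._current = None  # tag we are currently inside, if any

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1") and self._current is None:
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1 += data


def find_duplicate_candidates(pages: dict) -> list:
    """Group URLs whose pages share the same <title> and <h1> text.

    `pages` maps URL -> raw HTML. Returns groups of two or more URLs,
    which are candidates for canonicalization, 301 redirects, or noindex.
    """
    groups = defaultdict(list)
    for url, html in pages.items():
        parser = TitleH1Extractor()
        parser.feed(html)
        groups[(parser.title.strip(), parser.h1.strip())].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]
```

A typical hit would be a sortable category page: `/shoes` and `/shoes?sort=price` serving the same title and H1 would land in one group, telling you to point the parameterized URL at the clean one with rel=canonical.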
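The last check, the ratio of content to outgoing links, can also be estimated programmatically. The sketch below uses crude regular expressions rather than a real HTML parser, and the function name and the notion of "words per external link" are our own illustration; there is no official threshold, so treat the number only as a relative signal across your own pages.

```python
import re


def words_per_outbound_link(html: str, own_domain: str) -> float:
    """Rough ratio of visible word count to external links on a page.

    A low ratio (many outbound links, little text) marks a page that
    would benefit from more unique content.
    """
    # Collect absolute hrefs and keep only those pointing off-site.
    hrefs = re.findall(r'href=["\'](https?://[^"\']+)', html)
    external = [h for h in hrefs if own_domain not in h]
    # Strip tags for a crude visible-word count.
    text = re.sub(r"<[^>]+>", " ", html)
    words = re.findall(r"\w+", text)
    if not external:
        return float("inf")  # no outbound links at all
    return len(words) / len(external)
```

Pages scoring far below the site-wide average are the ones worth expanding with additional unique text.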