Monday, February 23, 2015

Semrush: Analyzing keywords on websites with duplicate content

By Adrián Coutin

Google's algorithm updates, especially Google Panda, have increased the need to control the quantity and quality of the content on a website.

Google Webmaster Tools is useful for monitoring the content Google has processed. However, Semrush provides important data that allows us to analyze websites affected by Google Panda and helps us work on resolving the penalties generated by this algorithm.

Semrush

Semrush is a useful tool for SEO and SEM professionals around the world. It helps us identify the main keywords of our niche market, the SEO situation of our competitors, and other relevant data. All this information is segmented by the countries where Google offers its services.

Geographic segmentation is a particularly important feature of Semrush because of the knowledge we can obtain about our SERP positions in specific markets and about the relevant keywords for specific countries. These data are quite hard to obtain from Google itself. More details about this are available on Semrush.

Semrush's main domain report can be seen in the graphic below. In this case, the report shows the main data for the Portuguese version of a domain on Google Brazil.


  
In this panel we can see a summary of the main data Semrush provides for a domain in relation to traffic: organic search, paid search, backlinks, and the keywords that send traffic, whether from organic results or from paid campaigns.
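
If you prefer to work with the raw numbers behind this panel, the same kind of organic keyword data can be pulled programmatically. The Python sketch below is only an illustration: it assumes you have a Semrush API key, and the endpoint and parameter names follow the publicly documented domain_organic report, so check the current API documentation before relying on it. The domain is a placeholder.

    # Minimal sketch (assumptions noted above): request the organic keywords
    # of a domain for the Google Brazil database from the Semrush API.
    import csv
    import io
    import urllib.parse
    import urllib.request

    API_KEY = "YOUR_SEMRUSH_API_KEY"   # placeholder, not a real key
    DOMAIN = "example.com.br"          # placeholder domain under analysis

    params = {
        "type": "domain_organic",          # organic positions report
        "key": API_KEY,
        "domain": DOMAIN,
        "database": "br",                  # Google Brazil
        "display_limit": 50,
        "export_columns": "Ph,Po,Nq,Tr",   # phrase, position, volume, traffic %
    }
    url = "https://api.semrush.com/?" + urllib.parse.urlencode(params)

    with urllib.request.urlopen(url) as response:
        raw = response.read().decode("utf-8")

    # The report comes back as semicolon-separated text with a header row;
    # print each keyword row so it can be inspected or saved.
    for row in csv.DictReader(io.StringIO(raw), delimiter=";"):
        print(dict(row))

An export like this makes it easy to keep a monthly snapshot of the keywords per country, which is exactly what we need in order to follow a Panda problem over time.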

Keywords - Google Panda

For the purposes of this post I will concentrate on the traffic and keyword reports for a website penalized by various versions of Google Panda over an extended period of time.

Google Panda penalties are connected with duplicate content, which generates a large number of keywords that position the similar content of various web pages in the same or very similar search results.

Suppose we search for a flight to Rome using the keywords "flight to Rome" and find 3 or 4 pages of the same website in the top positions. The objective of Google Panda is to avoid this kind of result, because it amounts to a manipulation of search engine results.

For all these reasons we can conclude that websites with duplicate content generate a large number of keywords, and when Google Panda detects duplicate content on a website, it affects the positions of its landing pages in search results.
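
Purely as an illustration of the problem (this is not how Google Panda itself works), a very simple way to flag exact duplicates on your own site is to hash a normalized version of each page's main text; a real audit would also use near-duplicate techniques such as shingling. The URLs and texts below are invented.

    # Illustrative sketch: group pages whose normalized text is identical.
    import hashlib
    import re
    from collections import defaultdict

    def normalize(text):
        # Lower-case and collapse whitespace so trivial differences
        # (spacing, capitalization) do not hide the duplication.
        return re.sub(r"\s+", " ", text.lower()).strip()

    def fingerprint(text):
        # Identical normalized content yields the same digest.
        return hashlib.sha1(normalize(text).encode("utf-8")).hexdigest()

    # In practice these texts would come from a crawl of the site.
    pages = {
        "/voos/roma": "Voos baratos para Roma. Reserve agora.",
        "/passagens/roma": "Voos baratos para Roma.  Reserve agora.",
        "/voos/lisboa": "Voos baratos para Lisboa. Reserve agora.",
    }

    groups = defaultdict(list)
    for url, text in pages.items():
        groups[fingerprint(text)].append(url)

    for urls in groups.values():
        if len(urls) > 1:
            print("Possible duplicate content:", urls)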

How does Google treat websites penalized by Panda in its search results? Do they lose keywords from the index altogether, or do those keywords lose their competitive positions?

Semrush helps us answer these questions.

The graphic below shows the behavior of the number of keywords in positions 1 to 20 on Google Brazil.


Duplicate content was generated and detected by Google in September 2013. This content began to be removed in March 2014, with a consequent reduction of keywords in the Google index. Many of these keywords coming from duplicate content, with little impact on traffic volume, were located in positions 10 through 20.

From May 2014, marked with a black box in the graphic, Semrush reported a fall in the number of keywords in the Google index, which confirmed that the duplicate content had been deleted from Google.
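
If you export the Semrush organic positions report for the domain month by month, this fall can be verified with a few lines of code. The sketch below is only a rough illustration: the file names, the semicolon delimiter and the "Position" column header are assumptions that have to be adjusted to match the actual exports.

    # Count, for each monthly export, how many keywords rank in positions 1-20.
    import csv
    import glob

    def count_top20(path):
        with open(path, newline="", encoding="utf-8") as f:
            reader = csv.DictReader(f, delimiter=";")
            return sum(1 for row in reader if 1 <= int(row["Position"]) <= 20)

    # Hypothetical files, e.g. organic_2014-03.csv, organic_2014-04.csv, ...
    for path in sorted(glob.glob("organic_*.csv")):
        print(path, count_top20(path))

A table like the one this prints, one line per month, makes the drop after the removal of duplicate content very easy to see.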

At this point it is important to highlight the fundamental role of Google Webmaster Tools in checking the reduction of content in the Google index, so I always recommend that tool.

Keyword reduction - Traffic

Did the reduction in keywords affect traffic?

Yes. Reducing duplicate content, and hence keywords, resulted in an increase in traffic due to the elimination or reduction of the Google Panda penalty.


The graphic below shows the traffic reported by Semrush:



The falls in traffic connected to the duplicate content start in February 2014, and traffic keeps falling until May 2014. In contrast, the keywords graph shows that in March 2014 the domain reached its highest number of keywords in the Google index.

Therefore, reducing duplicate content, and with it the keywords that bring little traffic or hold low positions, favors the positioning of the keywords that do generate traffic.
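
One practical way to see this relationship is to put the two Semrush series side by side, month by month. The sketch below only illustrates the idea; the figures are invented placeholders, not the real data from the graphics above.

    # Print keywords in the index and estimated organic traffic side by side.
    # All numbers below are invented placeholders for illustration only.
    keywords_by_month = {
        "2014-02": 12000, "2014-03": 13500, "2014-04": 11000,
        "2014-05": 8000, "2014-06": 7500, "2014-07": 7200,
    }
    traffic_by_month = {
        "2014-02": 9000, "2014-03": 8000, "2014-04": 7000,
        "2014-05": 6500, "2014-06": 9500, "2014-07": 11000,
    }

    print("month      keywords   traffic")
    for month in sorted(keywords_by_month):
        print(f"{month}    {keywords_by_month[month]:>8}  {traffic_by_month[month]:>8}")

When the keyword count falls while traffic recovers, as in the case described here, that is a good sign that the removed keywords were the low-value ones created by the duplicate content.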

Semrush - Google Panda

Based on the above results, I can draw the following conclusions on the use of Semrush to evaluate websites affected by Google Panda:

  • Semrush facilitates the detection of increases and reductions of keywords in the Google index by language and country; this information is not readily available from other sources.
  • Using Semrush it was possible to verify the elimination of duplicate content and its impact on the Google index for Google Brazil (Portuguese language).
  • Deleting the low-impact keywords created by duplicate content helps achieve better optimization for the keywords with higher demand and more traffic.
  • Semrush gives us the opportunity to detect the increase in traffic associated with the elimination of duplicate content, and thus with the resolution of the Google Panda penalty.
  • Regardless of the use of Semrush, I should point out that Google Webmaster Tools is key to monitoring the elimination of duplicate content and other actions related to the performance of websites on Google.
