What has changed in SEO for Google in 2012?


Over the last 12 months (November 2011 to November 2012) there have been significant changes at Google that have affected the work of SEO specialists.

The words webmasters now hear most often are reputation and trust. We are talking about the reliability of a site, its design, and its external links, not about reputation management as an SEO service. Of course Google, like other search engines, has talked about site quality and punished bad SEO from the very beginning, but it has never been better at identifying violations than it is now. Google has grown sharp teeth.

Google has learned to speak
Previously, Google gave the webmaster almost no details when it considered a site spammy, but the situation has changed. In April 2012 the search engine broadened the range of warnings it sends through Webmaster Tools. In January and February 2012 alone, Google sent more than 700,000 messages via Webmaster Tools.

Google introduced Penguin on April 24, 2012. Below is an illustration of how Penguin looks in an analytics report. According to Google, sites filtered by the new algorithm must make a serious effort to remove all artificial links, regardless of their age. In October the search engine introduced the link disavow tool to help webmasters with this task, though in practice it tells the search engine more about your links than it helps you quickly get rid of them.

Even after the disavow tool has been used, Google is in no hurry to restore a site's rankings. Google first waits to re-crawl all of the URLs the webmaster listed, and only then takes action. For poor-quality pages, weeks or months may pass between crawler visits.
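For reference, the disavow tool accepts a plain-text file: lines starting with `#` are comments, `domain:` lines disavow every link from an entire domain, and all other lines are individual URLs. A minimal sketch (the domains and paths below are placeholders):

```text
# Directory sites we contacted with no response
domain:spammy-directory.example

# A single paid link we could not get removed
http://link-seller.example/pages/our-anchor-text.html
```

One file per site; uploading a new file replaces the previous one, so the list should be kept complete rather than incremental.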

It should be noted that sometimes Google ignores spam links. A site may have such links yet only fall under Penguin once a certain threshold is exceeded. Moreover, Penguin does not protect a site from manual sanctions: these can be applied even to sites that have already been filtered by the algorithm.


Google loves Panda, an algorithm that punishes webmasters for poor-quality resources. Since November 18, 2011, Google has updated Panda 13 times. The algorithm penalizes sites by dropping them in the search results. To deal with it, you need to replace low-quality texts with well-written ones and to differentiate pages that are duplicates or very similar in content. A typical example: a company with a large number of offices creates a page for each office, where each page differs from the others only in its contact details and the rest of the text is nearly identical.

Reward quality

It often seems that Google spends all its time identifying low-quality resources, so it is particularly pleasing to note the quality-rewarding changes to the search algorithms in June and July 2012.


Back in April 2012, Matt Cutts said: "In the next few days we will launch new algorithm changes to combat spam. The changes will reduce the rankings of sites that, in our opinion, violate Google's quality guidelines. These sites do not look like obvious spam at first glance, but closer analysis allows us to conclude that they use black-hat SEO techniques and try to manipulate the search results." The changes were expected to affect 3.1% of queries.


In March of that year, Matt Cutts spoke about upcoming updates designed to combat over-optimization: "We're trying to make GoogleBot smarter. We are looking for those who abuse optimization, using too many keywords on a page or too many links, but we will go much further than you expect," he said.

The SEO community does not know exactly what Google means by over-optimization, but there are some assumptions, such as too many keywords.

In November Matt Cutts talked about internal cross-links, explaining how Google sees them and takes their keywords into account. Perhaps cross-linking is also part of the over-optimization algorithm. Below is an illustration of the general idea: one occurrence is good, two is better, and three or four is already one too many; after that, each additional occurrence matters less and less, until at some point Google begins to suspect over-optimization.

Because you cannot know the exact number of keyword mentions and links needed for effective promotion in Google, the best help is simply to be natural and not try to outsmart Google. Notably, Google will not penalize your site for links from your own blog or affiliated sites. But even if Google's over-optimization algorithms do not notice your site, a manual review can catch it at any time.

Exact Match Domain

In September 2012, Google announced that it would fight exact-match domains. This is clearly not related to Panda or Penguin: Google will deal with sites that rank well only because their domain name matches the query keywords, not because of their content or external links.

Many ads at the top of the page

Sites that place a lot of static ads at the top of the page, forcing the reader to scroll down to see the main content, can be penalized. This does not affect a large share of sites: Google notes that the change touches only about 1% of resources.

Links from infographics and guest blogs

There is no precise information, but in July Matt Cutts mentioned that sites that abuse infographic links can be punished. This concerns spammy links embedded in infographics, which effectively deceive users.
In October 2012, Matt Cutts made a similar warning about guest blogging (probably referring to spammy links in guest posts). Guest posts give excellent results in terms of promotion, but black-hat guest blogging can lead to tragic consequences.

Pirate update

This update punishes resources that republish too much content and accumulate DMCA requests (i.e., host content that violates copyright).

Google Caffeine

The Caffeine infrastructure was launched two years ago. Panda, in effect, gave Caffeine its dark-roast flavor, and this year the link and excessive-advertising algorithms strengthened it further, combined with improved indexing and a more powerful search-engine database.


Freshness

A year ago, Google released an update called Freshness, affecting about 35% of search results. The algorithm has since been improved: it updates more often and surfaces more recent information about current events and the most discussed topics.

Indexing AJAX and JavaScript

Google can now index AJAX and JavaScript. On the one hand, this allows it to read dynamically generated text; on the other, it eliminates the ability to hide information or links from the search engine using scripts.
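For context, the AJAX crawling scheme Google supported at the time (since deprecated) let a site opt in by using "hashbang" URLs, which the crawler mapped to a server-renderable equivalent. A sketch of the mapping, with `example.com` as a placeholder:

```text
URL as the user sees it:     http://example.com/products#!category=shoes
URL the crawler requests:    http://example.com/products?_escaped_fragment_=category=shoes
```

The server was expected to return an HTML snapshot of the dynamic page at the `_escaped_fragment_` URL, so the crawler could index content that would otherwise only exist after client-side script execution.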

Indexing closed links

Researchers have observed Google indexing pages and links that were supposed to be closed to indexing.
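To make the observation concrete, these are the standard ways of "closing" a page or a link (`example.com` is a placeholder); the reports above suggest such hints are not always honored:

```html
<!-- In the page <head>: ask search engines not to index this page
     and not to follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- On an individual link: ask search engines not to pass weight through it -->
<a href="http://example.com/untrusted" rel="nofollow">untrusted link</a>
```

A page can also be blocked via a `Disallow` rule in robots.txt, which prevents crawling but, notably, does not guarantee the URL stays out of the index.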

Automatic canonicalization of URLs

Canonicalization means choosing the most appropriate URL (out of several different URLs pointing to the same page) when optimizing a page for search engines. Google said it will now identify the main URL itself; however, URL canonicalization is still available to webmasters.
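The webmaster-side mechanism is the `rel="canonical"` link element, placed in the `<head>` of each duplicate URL to point at the preferred one (`example.com` and the query parameters are placeholders):

```html
<!-- Served on http://example.com/shoes?sort=price&sessionid=123 -->
<link rel="canonical" href="http://example.com/shoes">
```

This consolidates ranking signals from the parameterized variants onto the single canonical URL instead of splitting them across duplicates.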

Parked domains

Last December, Google added a parked-domain classifier, excluding such domains from the results. It also improved its ability to detect domains with duplicate content.

Variety of domains in the results

In September, Google introduced an update designed to increase the diversity of domains in the search results. Sometimes a Google results page is dominated by a single site; the update addresses this problem.

Markup and the stability of the SERP

In conclusion, we would like to focus on two trends that continue to develop but are becoming more controversial: markup tags, and the stability (or lack thereof) of the SERP.

Google takes tags into account in the search results, and it makes sense to use them for optimization. Here we are talking about machine-readable markup: HTML elements and attributes. Machine-readable markup helps search engines perceive, categorize, and display information. A high-quality appearance in the search results can significantly increase traffic to any site.
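As a sketch of what such machine-readable markup looked like at the time, here is schema.org microdata for a product with an aggregate rating, the kind of markup that could earn a rich snippet (star rating) in the results. All names and values are illustrative:

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.2</span>/5
    based on <span itemprop="reviewCount">37</span> reviews
  </div>
</div>
```

The `itemscope`/`itemtype` attributes declare the entity type and `itemprop` labels its properties, which is what lets the search engine categorize the content rather than merely read it.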

As for the stability of the SERP, Google has made a number of changes over the last year. For example, it eliminated the left-hand panel and removed free product search listings. More and more queries now show local results, which is good for some businesses but very bad for others. For some queries only seven organic results are shown, for others nine. The results may contain multiple links to the same site. And no one has cancelled advertising. The author believes that Google wants to simplify search for the average user while maximizing its own opportunities for advertising revenue. Moreover, Google wants to personalize search results, with a focus on local businesses and on what your friends search for.
