In the last few months, Google has rolled out several major and minor algorithmic updates, and the two most prominent ones you should understand are Phantom and Panda 4.2.
Google Phantom
While Google Phantom is not as massive as many past Google updates, it still affects a good percentage of search results. Reports suggest that even some market giants have suffered a significant negative impact from this update. Google Phantom primarily targets the following (a rough audit sketch follows the list):
  • Thin or low-quality content
  • Duplicate content
  • Ad-heavy pages
  • Irrelevant and old comments
  • Low user engagement
  • Fonts that are hard to read and understand
  • Popups
  • Confusing navigation
  • Click-bait articles
  • Entire websites (due to many low-quality pages)
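If you want a rough do-it-yourself check for the first two items on that list, the sketch below flags pages that fall under an assumed word-count threshold and groups exact duplicates by hashing their body text. The threshold, the sample page list, and the audit_pages helper are illustrative assumptions only, not anything Google has published.

```python
# Illustrative audit for thin and duplicate content.
# The 300-word cut-off is an assumption, not a published Google figure.
import hashlib
from collections import defaultdict

THIN_WORD_COUNT = 300  # assumed cut-off for a "thin" page

def audit_pages(pages):
    """pages: dict mapping URL -> extracted body text of that page."""
    # Flag pages whose body text falls under the assumed word-count threshold.
    thin = [url for url, text in pages.items() if len(text.split()) < THIN_WORD_COUNT]

    # Group URLs whose normalised body text is identical (exact duplicates).
    groups = defaultdict(list)
    for url, text in pages.items():
        digest = hashlib.sha256(" ".join(text.split()).lower().encode("utf-8")).hexdigest()
        groups[digest].append(url)
    duplicates = [urls for urls in groups.values() if len(urls) > 1]

    return {"thin_pages": thin, "duplicate_groups": duplicates}

if __name__ == "__main__":
    sample = {
        "/pr/widget-launch": "Acme launches its new widget today in three colours.",
        "/pr/widget-launch-copy": "Acme launches its new widget today in three colours.",
    }
    print(audit_pages(sample))
```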
Google Panda 4.2
Google Panda was first released in February 2011, followed by Panda 4.0 on May 20, 2014, Panda 4.1 on September 25, 2014, and now Panda 4.2, released on July 18, 2015. Panda 4.2 mainly focuses on boosting sites with quality content while pushing down sites with low-quality content. Reports suggest that about 2-3% of search queries have been affected by this update. Those who follow Google's update patterns believe it is more a 'refresh' than a new update. Sites that made the required changes after being hit by previous Panda updates saw improvements in their rankings.
Google has released another Panda update. Although it is being called a 'slow rollout', it is expected to hit 3-5% of search queries. It specifically targets thin or poor content that ranks well in Google search results. The update started earlier this week and will continue rolling out into next week.
Opportunity for some, but a problem for others! Websites hit by previous versions of Panda stand a fair chance of getting back in the business if they have rectified the concerns pointed out. On the other hand, the update may also affect new websites that were not hit by Panda previously. If you notice an unexpected drop in traffic, this new Panda update is most likely the reason behind it.
About that Number
Panda 4.0 was Google's last numbered update, so this one is Panda 4.1.
Google's newest algorithm update is out. The update, being called Pigeon (which is not its official name), rolled out a couple of days ago. Unlike other updates, this one is aimed at refining local search results. Although there are no reports yet on how many queries are affected, observations by SEO experts suggest it is a significant update. It has impacted both Google web search and Google Maps.
The websites that have gained the most from this update are the ones that provide local directory listings. Quite recently, one such popular site, Yelp, complained that Google was manipulating its search results by showing its own local listings ahead of Yelp pages, even when users were specifically looking for Yelp reviews of a particular restaurant or hotel. In one example, Google showed the official site of Gary Danko (a restaurant based in San Francisco) when a user entered the query 'Gary Danko yelp'. Along with the restaurant's official site, the search engine displayed its Google+ page and links to associated content such as reviews. Now that Pigeon has rolled out, this problem has been fixed. It seems that not only Yelp but other directory sites like TripAdvisor, OpenTable, and Urbanspoon are also benefitting from this update.
The release of Panda 4.0 came as a shock and caught many websites off guard. The update had its biggest impact on press release websites, and several major names fell victim to it and lost substantial traffic. Even names like PRLog, BusinessWire, and PRWeb faced a quick and substantial drop in online visibility. With that, it became apparent which sites were winners and which were losers. Now the question is how to recover and undo the negative impact of this Google algorithm. But first we need to ask: what was it about press release websites that caught the attention of this algorithm? Was there something about such sites that put them on Google Panda's radar? Let's figure it out. Before answering that, let us briefly discuss what this algorithm is all about and what it entailed.
What is Google Panda 4.0?
Google's purpose in introducing this update was to make sure that readers came across only useful, informative, quality, and unique content. It was meant to target websites that provided poor content and keep them from appearing in top results. The first version of the update rolled out in February 2011 and has been refreshed regularly ever since. The newest version, Panda 4.0, came out on May 20, 2014. Each release hammered low-quality sites, making them suffer in terms of traffic and sales by dropping their rankings lower and lower.
The task of filtering out low-quality sites and separating them from genuinely useful ones is informed by Google's Quality Raters. The Quality Raters determine a site's quality based on answers gathered directly from readers; one such question is 'do you think the website supplies accurate information?'. They work from a framework and a set of questions that describe what 'quality' means in the eyes of this leading search engine, and websites that performed poorly against these quality questions were pushed out of the top rankings in search results. If one were to sum up the essence of those questions (and, by extension, the Panda algorithm) in a few words, the characteristics Google looks for in a quality web page are truthfulness, value, originality, and user orientation. These are the elements Google Panda uses to decide which pages deserve high rankings and which will sink. What set this update apart from other algorithms is that it judged sites directly on user experience, deciding which ones deserved attention largely on the basis of user feedback.
Why did Google Panda specifically target press release websites?
Google has never hidden how it feels about these websites, especially the ones that supply no actual or original information and still use press releases for link building. Matt Cutts also stated that links in press releases do not contribute much from an SEO standpoint. In fact, Google's Webmaster Guidelines asked webmasters to make sure that links in press releases do not pass PageRank. After the introduction of Panda 4.0, major names that publish press releases lost huge amounts of traffic. So what about all the companies that have been using PR wire services?
Companies that have been releasing press releases without violating Google's guidelines have nothing to worry about. Only sites that do not publish any authentic, real, and useful information were targeted; the ones that published the same information with slightly changed words and inserted keywords faced the negative outcomes. Google has always worked on refining its search results and directing readers to organic, natural destinations where helpful content is available. That is why everyone who tried to game the Panda algorithm was washed out of the search index, and why sites still relying on manipulative link building and redundant, lightly reworded information fell drastically in the rankings.
How to recover from Google Panda, and what lessons can be learned from it?
Improve the quality of content
It is about time that PR websites became extremely cautious before publishing anything. There is simply no room for redundant information, let alone duplicated content. One way a site can regain its status is by integrating graphical presentation into press releases. Instead of relying on text alone, inserting diagrams, graphics, and pie charts will aid user understanding and explain the information better. Not only will this give an impression of authority on the subject, it will also avoid reproducing the same old data.
Claim your authority on the topic!
Distribute all the content topically. This is the first thing many websites did after being hit by Panda: spread the content into sub-sections depending on which niche it belongs to. It is not the whole solution, but it is definitely part of it. Websites that have topical authority over their content tend to gain better online visibility than generic, all-inclusive sites that cover various topics without any organization. Most PR websites do not write content on their own, but that does not mean such a site cannot earn authority. Most PR websites maintain some degree of editorial control before information is published. To project an authoritative outlook on a topic, emphasize that editorial process and make it noticeable to users. Showing that the site reviews each piece of information, rather than posting whatever is submitted, demonstrates that it analyzes content before supplying it to the end user.
Winning people's trust on credit card usage
One of the questions asked of users, to which Google Panda gave a lot of weight, is whether they feel confident and secure using their credit card on the website. This is one aspect where e-commerce websites naturally perform better, and many leading PR websites failed to earn a good answer to this question. Thankfully, there are some ways to fix that. Surveys suggest that people are far more willing to use their credit card on websites served over HTTPS, with the lock icon in the browser signalling a safe transaction. If a press release site has a large network of pages and offers e-commerce functions, it is a good idea to make it HTTPS enabled. The next tip is to make the most of security seals such as McAfee's, which serve as proof that a website is safe for sharing credit card information.
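For anyone who wants to confirm that the HTTPS advice above is actually in effect on their site, here is a minimal sketch assuming the Python requests library is installed; the example domain and the forces_https helper are placeholders for illustration, not part of any Google guideline.

```python
# Minimal sketch: check that a plain-HTTP request ends up on an HTTPS page.
# Assumes the third-party `requests` library is installed; the domain is a placeholder.
import requests

def forces_https(domain: str) -> bool:
    """Return True if http://<domain> ultimately redirects to an https:// URL."""
    response = requests.get(f"http://{domain}", allow_redirects=True, timeout=10)
    return response.url.startswith("https://")

if __name__ == "__main__":
    domain = "example.com"  # placeholder; replace with the site being audited
    print(f"{domain} forces HTTPS: {forces_https(domain)}")
```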
One more way to earn credibility and trust is to keep the design up to date. Design is a major consideration, and a site that looks outdated often gives the impression of being insecure. Many leading websites failed to win people's confidence in making online transactions simply because they lacked a clean, responsible-looking design.