Basic SEO Course – Chapter 7

Without a ranking algorithm it would be impossible to find what a user is looking for among the hundreds of billions of web pages in Google's index. Every day millions of new pages appear, filled with new information, news, reports, products, and services. Because Google's algorithm builds the ranked list that end-users see for a given search query, it must keep adding new capabilities: to fight new levels of spam, to stay smart, and to identify the most relevant information quickly.

From the early days until now, the algorithm has changed continuously to improve its quality and capability, so that end-users get what they are looking for. Showing the best results is the core service of Google's search engine, and the entire business model depends on this quality. At the same time, webmasters keep trying to force their rankings up with techniques that are simply bad things to do, in plain words spam, black-hat tricks, or back-door entries. To identify these bad actors, Google rolls out new spam filters whenever they are required.

Broadly, the reasons behind Google's algorithm changes and updates fall into three categories:
1. Fighting spam
2. Improving the quality of the search result
3. Innovating new techniques

Fighting spam
When the web revolution started and the world was introduced to interactive websites loaded with text, images, and videos, we began using web directories and search engines to find information on the internet. Every website owner wanted to be at the top of these search engines, so they started testing how pages rank. In the early days the main ranking factor was how many times a term appeared on a webpage, a simple word-count calculation. This let site owners pump extra, unneeded text onto their pages, which was clearly spamming, and search engines responded with spam-filter algorithms. The next level was the number of websites and pages linking to a page (back-links); again, site owners started creating artificial links, and two-way, three-way, and link-exchange schemes became popular ways to inflate the number of linking pages. To fight back, Google had to improve the algorithm to judge the quality of the linking page, the growth rate of links, and the content-to-link ratio; it was altogether a new level of spam fighting. All of this was done to improve quality and filter spam.
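To see why word count alone failed as a ranking signal, here is a small, purely illustrative Python sketch (the page texts and function names are made-up examples, not anything Google actually used): a page stuffed with the keyword beats an honest page under this naive metric.

```python
# Toy illustration (not Google's actual algorithm): rank pages purely by how
# often the query term appears, as early engines roughly did.
import re

def keyword_count_score(page_text: str, query: str) -> int:
    """Count occurrences of the query term in the page text."""
    return len(re.findall(re.escape(query.lower()), page_text.lower()))

pages = {
    "honest-page": "We sell red shoes in many sizes and colours.",
    "stuffed-page": "red shoes red shoes red shoes buy red shoes cheap red shoes",
}

# The keyword-stuffed page wins under this naive metric, which is exactly
# why raw word count stopped being a trustworthy ranking signal.
ranked = sorted(pages, key=lambda p: keyword_count_score(pages[p], "red shoes"), reverse=True)
print(ranked)  # ['stuffed-page', 'honest-page']
```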

Improving the quality of the search result
Search results must show what was asked for. Google's algorithm keeps adding new capabilities to improve:
a. How quickly it can search and produce results
b. How it reads the meaning of the searched words
c. How it understands the intent of a search
d. How it identifies the location of the user
e. How it reads the search settings of an individual user

This is an endless process; Google keeps working on it because its business model is built on this search-result quality. With millions of new pages appearing every day, there must always be new ways to read and filter them faster.

Innovating new techniques
Technology improves every day: we have smartphones more capable than desktops or laptops, our lifestyles are changing, our purchase habits are changing, we use technology to search maps, we keep track of the places we visit, and we change phones more often. We follow our friends online, track news online, and research educational topics with the help of the internet. This ever-changing lifestyle demands new ways to deliver smart results on different platforms, not only the desktop but also car dashboards, information kiosks, smart mobiles, tablets, public information displays, and digital billboards. Hence Google makes changes to match this demand, and sometimes to produce new innovations.

List of Google’s important updates
Even though Google changes its ranking algorithm almost every day, most changes are small tweaks that are not visible in the SERPs. Some updates, however, change the world for many: plenty of online businesses boom after these major updates, while others bite the dust because of penalties. Below we list some of the most important updates of recent years, well known to the SEO community.

The first one is Panda. It was introduced to identify duplicate, thin, keyword-stuffed, and low-quality user-generated pages. Panda was launched as a filter on February 24, 2011, and was initially not part of Google's core ranking algorithm; in January 2016 it was officially incorporated into it. Panda updates roll out more frequently now, so you can expect penalties and recoveries to take effect faster. Panda assigns a quality score to each webpage, and this score is used in the final ranking.
You should check your website for duplicate content using the available tools, and take the help of expert SEOs to recover if the site is facing ranking issues.
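As a starting point before calling in the experts, you can run rough checks of your own. The following Python sketch compares two pages with a simple word-shingle (Jaccard) similarity; this is not how Panda scores pages, and the sample texts and the 0.5 threshold are arbitrary examples, but it can flag pages on your site that look suspiciously alike.

```python
# Rough duplicate-content check using word shingles and Jaccard similarity.
# Only a quick flag for near-duplicate pages on your own site.

def shingles(text: str, size: int = 3) -> set:
    """Break text into overlapping word n-grams (shingles)."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(max(len(words) - size + 1, 1))}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

page_a = "Our blue widget is durable, affordable and ships worldwide within two days."
page_b = "Our blue widget is durable, affordable and ships worldwide within three days."

score = similarity(page_a, page_b)
print(f"similarity: {score:.2f}")
if score > 0.5:  # arbitrary example threshold, tune it for your own site
    print("These pages look like near-duplicates; consider merging or rewriting one.")
```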

Penguin was introduced on April 24, 2012, to fight spammy linking techniques: it checks the quality of the pages linking to you and discounts the value of irrelevant (off-topic) links. The core objective is to down-rank pages whose back-links look manipulative. Penguin works in real time and has been part of the core ranking algorithm since 2016.
You should review your link profile and keep track of all inbound links so you know what is changing; even though you cannot control who links to you, you can inform search engines about unwanted links through your Google webmaster (Search Console) account.
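Once you have listed the links you believe are manipulative and could not get removed, you can ask Google to ignore them with a disavow file uploaded through Search Console. The sketch below builds such a file; the plain-text format (one URL or one "domain:" line per entry, with "#" for comments) is the documented one, while the domains and URLs here are made-up examples from a hypothetical link-profile review.

```python
# Minimal sketch of building a disavow file for Google's disavow-links tool.
# Entries below are fictional examples of links judged manipulative.

bad_domains = ["spammy-link-farm.example", "paid-links.example"]
bad_urls = ["http://blog.example/comment-spam-page.html"]

lines = ["# Disavow file generated after a manual link-profile review"]
lines += [f"domain:{d}" for d in bad_domains]   # disavow every link from these domains
lines += bad_urls                               # disavow individual URLs only

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
```

Use the disavow tool carefully: it should only contain links you are confident are harmful and that you could not get removed at the source.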

Hummingbird was built to understand the meaning of a search: what the end-user is really asking for. It tries to understand the intent behind a query and shows pages that answer it, not necessarily pages with an exact keyword match. The answer given on the page is what earns importance, not repeated keywords. It was introduced on August 22, 2013.

Pigeon weighs on-page and off-page factors, and the user's location plays a very important role; it ties the local algorithm more closely to the core algorithm to identify the best results. Pigeon was launched on July 24, 2014 (US) and on December 22, 2014 (UK, Canada, Australia). You should review all on-page quality checks and off-page linking methods to improve on this parameter.

The Mobile update was launched on April 21, 2015. A page that lacks mobile usability is downgraded in ranking, so webmasters must make sure a webpage is compatible across platforms to pass this mobile test. Pages without mobile competence can be ranked very low or filtered out of the SERPs entirely.
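A quick first check you can script yourself is whether a page even declares a responsive viewport; this is only one signal, and Google's own Mobile-Friendly Test remains the authoritative check. The HTML snippet in this Python sketch is a made-up example.

```python
# Very rough first check for mobile readiness: does the page declare a
# responsive viewport meta tag? One signal only, not a full mobile audit.
import re

def has_viewport_meta(html: str) -> bool:
    """Look for a <meta name="viewport" ...> tag in the page source."""
    return bool(re.search(r'<meta[^>]+name=["\']viewport["\']', html, re.IGNORECASE))

sample_html = """
<html><head>
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head><body>Hello</body></html>
"""

print("viewport meta found" if has_viewport_meta(sample_html) else "no viewport meta tag")
```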

The RankBrain update improves Google's Hummingbird algorithm; it is a machine-learning system developed by Google to understand the intent behind the searched words. It was introduced on October 26, 2015. Exactly how RankBrain works is not public knowledge, but its job is to identify the pages most relevant to the meaning of a search query. According to Google, it is the third most important ranking signal.

The Possum update targets location-specific search results; it was launched on September 1, 2016. The search query and the searcher's location play the key roles, and businesses with locations near the searcher can see improved rankings. Apart from location, the core ranking algorithms continue to play their part alongside Possum.

Fred is the most recent of Google's confirmed updates, downgrading mostly blogs with thin, ad-centric content. It was launched on March 8, 2017. Webmasters must review Google's search-quality guidelines and follow them to make sure their websites are not filtered by Fred. An ideal webpage should not carry poor-quality text written only for search engines; write your content for users, with real answers.
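One crude way to spot candidate "thin" pages on your own site is to strip the markup and count the visible words, as the sketch below does; the 300-word threshold is an arbitrary illustration, not a Google rule, so treat the output only as a prompt for human review.

```python
# Crude thin-content flag: strip HTML tags and count the visible words.
# Pages with very little text and a lot of markup or ads often look "thin".
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect a running word count from all text nodes in the page."""
    def __init__(self):
        super().__init__()
        self.words = 0
    def handle_data(self, data):
        self.words += len(data.split())

def visible_word_count(html: str) -> int:
    parser = TextExtractor()
    parser.feed(html)
    return parser.words

page = "<html><body><h1>Buy now!</h1><p>Great deals. Click the ads below.</p></body></html>"
count = visible_word_count(page)
print(f"{count} visible words")
if count < 300:  # arbitrary illustrative threshold
    print("This page may be too thin; add genuinely useful content for users.")
```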

Assignment: Create a new blog about Google's algorithm updates. List them all on the home page and give each update its own page, with details of the official explanation, a list of tools that help identify ranking problems, and links to downloadable tools that can help detect each algorithm filter. Make sure you have supporting original images; do not copy images from other websites, create original content. Mail the link of this new blog to your course coordinator.