This is part II of this blog series.
In the first part of this blog series, we examined the role and methods of the Google quality raters. This is the team of people employed to conduct real searches and report back on the quality of the search results displayed by Google.
One of the main goals of the Google quality raters is to help the algorithm understand the context and intent behind every search. However, regardless of context, there are a number of individual factors that can instantly earn a page the lowest-quality rating.
Before a rater delves deeper into the content or the context of the search, some pages will automatically earn the lowest-quality rating for overall ‘page quality’.
Google quality raters are told to issue the lowest rating to any page that is not mobile-friendly. A page counts as not mobile-friendly if it becomes unusable on a mobile device.
Google makes no secret of their continued emphasis on the mobile experience. Not displaying correctly on mobile is now a cardinal sin. You can read how to make your mobile experience fully aligned with Google’s preferences here.
The next category of pages to be deemed lowest quality are pages that are potentially harmful, dangerous, or deceptive. The search rater guidelines are blunt about these pages:
‘One final note: if the purpose of the page is harmful, then expertise doesn’t matter. It should be rated Lowest!’
Any publisher or content provider will welcome the efforts to clean up deceptive or fraudulent sites, or pages that damage the reputation of online media. Websites that contribute to ad fraud and harm advertiser trust will also not be missed by publishers.
By adding this human element, pages that could previously have ranked highly by meeting the algorithm’s expectations for ‘quality’ content will be excluded from future search results.
The next category of pages that Google quality raters are keen to limit exposure to are pages that may have substantial content but are actively promoting incorrect or high-risk information. These pages first need to qualify as ‘Your Money or Your Life’ (YMYL) pages. Because these pages can have direct consequences for users, Google applies its strictest criteria to them. You can read more about YMYL pages in the Moz guide to the search rater update.
High-risk content is defined as:
In this example, we can see that whilst the domain may have a high level of authority and the page is clearly a forum for shared experience, Google evaluates the page from the perspective of a user who has arrived via search.
The page is rated lowest quality because users could receive the page in a search while looking for serious medical advice.
Anecdotal, personal experience can still be classed as high-quality content, but as the page covers a YMYL topic, Google automatically assigns the lowest quality rating. This is because the content contradicts well-established expert consensus.
There is a class of pages that may superficially address relevant topics but don’t offer any meaningful content. These are pages where quality has been ignored but which still appear in some search results.
Google quality raters have been given objective standards for this category of page, so it is unmistakable.
Pages can also earn an instant low-quality rating due to the layout, organization, or structure of their content. Lowest-quality pages have little main content, or their content is copied or auto-generated.
As Google puts it, pages where ‘the main content is obstructed or inaccessible’, pages with inadequate information about the website or creator, and finally unmaintained websites and hacked, defaced, or spammed pages will all be marked as lowest quality.
The primary purpose of a page should be to help a user reach their goal. If the adverts on a site overshadow this or impair the user experience, the page is rated as the lowest quality.
This example demonstrates how a page can just serve the purpose of delivering an ad with no user value. But, this is an extreme example. For publishers wondering if their ad set up is going to affect their rating, Google outlines some core rules for combining advertising with user content.
Ads should be clearly labeled and should never disguise themselves as the content of the page. Heavy monetization from ads that distract from the content suggests the page is not trustworthy.
The page design should not make it difficult to distinguish the main content from ads. Users should not have to fight to view the content because it is buried under a long list of ads, nor should they mistake the ads for content.
Finally, whilst the search rater guidelines focus on the content of the page, Google recently filed a patent for an external link scoring system that would feed back into the quality rating system.
The system pulls scores from the multiple links that point to a site, affecting the rating of the page based on a quality score determined by a ‘ranking engine’. In its patent application, Google summarized the process:
‘A link quality score is determined for the site using the number of resources in each resource quality group. If the link quality score is below a threshold link quality score, the site is classified as a low-quality site.’
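The mechanism the patent describes can be sketched in a few lines: linking resources are sorted into quality groups, a weighted score is computed from the number of resources in each group, and the site is classified as low quality if that score falls below a threshold. The group names, weights, and threshold below are illustrative assumptions, not values from Google’s filing.

```python
from collections import Counter

# Assumed quality groups for linking resources, each with an illustrative weight.
GROUP_WEIGHTS = {"vital": 1.0, "good": 0.7, "ok": 0.4, "low": 0.0}
THRESHOLD = 0.5  # assumed threshold link quality score

def link_quality_score(linking_resource_groups):
    """Weighted score from the number of linking resources in each group."""
    counts = Counter(linking_resource_groups)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return sum(GROUP_WEIGHTS[g] * n for g, n in counts.items()) / total

def classify_site(linking_resource_groups):
    """Classify the site as low quality if the score is below the threshold."""
    score = link_quality_score(linking_resource_groups)
    return "low-quality" if score < THRESHOLD else "not low-quality"

# A site linked to mostly by low-quality resources scores 0.28 and is flagged.
print(classify_site(["good", "good", "low", "low", "low"]))  # → low-quality
```

The point of the sketch is simply that a site owner’s rating can move with the quality of pages they don’t control: the score depends entirely on who links in, not on the page itself.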
In addition to the human rating system, there are also automated processes that will affect page quality. This also links the rating to the quality of external sites, beyond the control of the site owner.
As this is an internal process that does not directly affect rankings, page owners will not receive any indication if their page has received the lowest quality rating.
For site owners who think the latest core algorithm update will impact their pages, Google’s John Mueller suggests making content more relevant. The topic of improving the rating score was addressed in a Google Webmaster Central hangout, where Mueller explained:
‘It’s not [something] where we’d say “well, something is broken and you just need to fix these five lines and then it’ll be back to normal.” But essentially it’s a matter of how can you show that you’re relevant for these kinds of queries. These are changes that take quite a bit of time for our algorithms to figure out.’
By marking pages as low quality, Google aims to tune the search results more finely to user intent and user experience. Publishers creating quality content should be able to maintain high rankings with minimal, or minor, tweaks to their processes. Improvements to the search process should result in more relevant traffic and less traffic that has to search again to find the content it requires. Note that these ratings do not currently affect rankings directly; they are only used to develop future changes to how results are displayed, likely rolled out in future core updates.
They should give publishers a more user-focused perspective when it comes to optimizing their pages to be as relevant to their audience as possible. The more informed publishers are, the better equipped they are to continue reaching their audience with their content.
Whilst these search rater guidelines are clear and openly available to publishers, they also start the process of Google exercising some editorial control over the content displayed. It makes sense to prevent harmful pages from displaying, but there is an argument that the neutrality of the Internet’s search and display system is being eroded by subjective decisions from one of the most powerful players in the ecosystem.
In the next post in this series, we’re looking upwards to see when and why Google applies the highest quality rating. Stay tuned.