A new version of the Google Webmaster Guidelines was released a few days ago.
Some of the guidelines have been modified, some have been removed, and some are entirely new. Even an experienced eye can easily miss these changes.
According to Google Search Console Help, following the Webmaster Guidelines will help Google find, index, and rank your site. The Webmaster Guidelines are Google’s set of suggested practices for creating and maintaining Google-friendly websites. The document is updated from time to time, but the latest update stands out because it includes modifications, additions, and even removals compared to the previous version.
You should keep a constant eye on changes to the Google Webmaster Guidelines to make sure you are always playing within the current rules. Moreover, if you notice that another website is breaking the rules or abusing the guidelines, Google even suggests that you report that site. You can do it at https://www.google.com/webmasters/tools/spamreport.
Firstly, the new version of the Google Webmaster Guidelines document consists of a General Guidelines section and a Quality Guidelines section.
Secondly, only the General Guidelines section has been updated; the Quality Guidelines section remains unaltered.
Furthermore, the General Guidelines section is now divided into three parts: how Google finds your pages, how Google understands your pages, and how visitors use your pages.
Here is the breakdown of all changes that are made:
The first guideline in the How Google Finds Pages section now reads:
“Ensure that all pages on the site can be reached by a link from another findable page. The referring link should include either text or, for images, an alt attribute, that is relevant to the target page.”
The old version:
“Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.”
A static link is no longer the only way to make your site readable for search engine spiders and screen readers, so the phrase “static text link” has been replaced with the more appropriate term “findable”. Also, when you link via images, you should now use an alt attribute on that image that is relevant to the page being linked to.
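If you want a quick way to audit your own pages against this rule, a small script can flag links that carry neither anchor text nor an image with an alt attribute. The sketch below uses only the Python standard library; the sample markup and the check itself are illustrative, not an official Google tool.

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Flags <a> links that carry neither anchor text nor an <img> with alt text."""
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.current_href = None
        self.has_text_or_alt = False
        self.problems = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a":
            self.in_link = True
            self.current_href = attrs.get("href") or "(no href)"
            self.has_text_or_alt = False
        elif tag == "img" and self.in_link and (attrs.get("alt") or "").strip():
            self.has_text_or_alt = True

    def handle_data(self, data):
        if self.in_link and data.strip():
            self.has_text_or_alt = True

    def handle_endtag(self, tag):
        if tag == "a" and self.in_link:
            if not self.has_text_or_alt:
                self.problems.append(self.current_href)
            self.in_link = False

# Made-up markup: the first link would be flagged, the second would not.
sample = '<a href="/pricing"><img src="btn.png"></a> <a href="/about">About us</a>'
auditor = LinkAuditor()
auditor.feed(sample)
print("Links without text or alt:", auditor.problems)
```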
The second guideline now reads:
“Provide a sitemap file with links that point to the important pages on your site. Also provide a page with a human-readable list of links to these pages (sometimes called a site index or site map page).”
The old version:
“Offer a sitemap to your users with links that point to the important parts of your site. If the site map has an extremely large number of links, you may want to break the site map into multiple pages.”
Site maps help your visitors explore your website, but Google takes improving user experience one step further by recommending that you also create a human-readable sitemap page in addition to the sitemap file.
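Both deliverables can be produced from the same list of URLs. Here is a minimal sketch in Python that writes a basic sitemap.xml and a human-readable sitemap.html side by side; the example.com URLs are placeholders for your own important pages.

```python
# Placeholder list of the pages you consider important on your site.
important_pages = [
    "https://www.example.com/",
    "https://www.example.com/services",
    "https://www.example.com/contact",
]

# XML sitemap file for search engines.
xml_entries = "\n".join(f"  <url><loc>{url}</loc></url>" for url in important_pages)
sitemap_xml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{xml_entries}\n"
    "</urlset>\n"
)

# Human-readable site map page for visitors.
html_links = "\n".join(f'  <li><a href="{url}">{url}</a></li>' for url in important_pages)
sitemap_html = f"<h1>Site map</h1>\n<ul>\n{html_links}\n</ul>\n"

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap_xml)
with open("sitemap.html", "w", encoding="utf-8") as f:
    f.write(sitemap_html)
print("Wrote sitemap.xml and sitemap.html")
```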
The third guideline in the new version of the document:
“Limit the number of links on a page to a reasonable number (a few thousand at most).”
In the previous version, Google stated that you should keep the links on a given page to a reasonable number; now that number is more precisely defined as a few thousand at most.
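If you are not sure where a given page stands, counting its links takes only a few lines. The following sketch fetches a page with the Python standard library and counts its anchor tags; the example.com URL is a placeholder.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCounter(HTMLParser):
    """Counts <a> tags that actually have an href attribute."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and dict(attrs).get("href"):
            self.count += 1

url = "https://www.example.com/"  # placeholder: use one of your own pages
html = urlopen(url).read().decode("utf-8", errors="replace")
counter = LinkCounter()
counter.feed(html)
print(f"{url} contains {counter.count} links")
```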
The fourth guideline has remained the same:
“Make sure that your web server correctly supports the If-Modified-Since HTTP header. This feature directs your web server to tell Google if your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead.”
This header lets Googlebot ask whether your pages have changed since it last crawled your site. If your server doesn’t correctly support the If-Modified-Since HTTP header, crawlers have to download your pages again and compare them to what was last indexed to see if there have been any changes, which can use a lot of your bandwidth.
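You can test this yourself without any special tools. The sketch below, using only the Python standard library, requests a page twice: the second request sends the If-Modified-Since header with the date from the first response, and a 304 Not Modified answer means the header is supported. The example.com URL is a placeholder for a page on your own site.

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

url = "https://www.example.com/"  # placeholder: use a page on your own site

# First request: note the Last-Modified date the server reports.
first = urlopen(url)
last_modified = first.headers.get("Last-Modified")
print("Last-Modified:", last_modified)

if last_modified:
    # Second request: ask for the page only if it changed since that date.
    conditional = Request(url, headers={"If-Modified-Since": last_modified})
    try:
        second = urlopen(conditional)
        print("Got", second.status, "- the server sent the full page again")
    except HTTPError as err:
        if err.code == 304:
            print("Got 304 Not Modified - If-Modified-Since is supported")
        else:
            raise
else:
    print("No Last-Modified header; the server may not support conditional requests")
```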
The fifth guideline is only partly altered:
“Use the robots.txt file on your web server to manage your crawling budget by preventing crawling of infinite spaces such as search result pages. Keep your robots.txt file up to date. Learn how to manage crawling with the robots.txt file. Test the coverage and syntax of your robots.txt file using the robots.txt testing tool.”
The old version:
“Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it’s current for your site so that you don’t accidentally block the Googlebot crawler. Visit Google Webmasters to learn how to instruct robots when they visit your site. You can test your robots.txt file to make sure you’re using it correctly with the robots.txt analysis tool available in Google Search Console.”
Besides the obvious change that Google is now recommending using robots.txt for crawl budget purposes, it is interesting that the part related to ensuring you aren’t accidentally blocking Googlebot is removed.
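A simple way to sanity-check your robots.txt against this guideline is to test a few representative URLs with Python’s built-in robots.txt parser: your important pages should stay crawlable for Googlebot, while infinite spaces such as internal search result pages should be blocked. The domain and paths in the sketch below are purely illustrative.

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain and paths; adapt them to your own site structure.
rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()

expectations = {
    "https://www.example.com/": True,                   # important page: keep crawlable
    "https://www.example.com/blog/some-post": True,     # important page: keep crawlable
    "https://www.example.com/search?q=widgets": False,  # infinite search results: block
}

for url, should_be_allowed in expectations.items():
    allowed = rp.can_fetch("Googlebot", url)
    verdict = "OK" if allowed == should_be_allowed else "REVIEW robots.txt"
    print(f"{verdict}: Googlebot {'can' if allowed else 'cannot'} fetch {url}")
```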
The first and second guidelines in the Help Google understand your pages section:
“Create a useful, information-rich site, and write pages that clearly and accurately describe your content.”
“Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.”
Make your website stand out by creating relevant, high-quality content. Target the keywords and terms related to your business and the main theme of your website, and include them in your website content.
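A rough way to verify this is to fetch a page and confirm that your target terms actually appear in its text. The sketch below does exactly that with the standard library; the URL and the list of terms are made-up examples, and the tag stripping is deliberately crude.

```python
import re
from urllib.request import urlopen

url = "https://www.example.com/services"  # placeholder page
target_terms = ["kitchen remodeling", "free estimate"]  # made-up search terms

html = urlopen(url).read().decode("utf-8", errors="replace")
text = re.sub(r"<[^>]+>", " ", html).lower()  # crude tag stripping, good enough here

for term in target_terms:
    status = "FOUND" if term.lower() in text else "MISSING"
    print(f"{status}: {term!r}")
```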
The third guideline has a minor change compared to the previous version:
“Ensure that your <title> elements and alt attributes are descriptive, specific, and accurate.”
The term “specific” is added to the new version of this guideline.
The fourth guideline in this section is part of an old guideline we have already mentioned:
“Design your site to have a clear conceptual page hierarchy.”
The old version:
“Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.”
The fifth guideline has remained the same, but with a slightly different wording:
“Follow our recommended best practices for images, video, and structured data.”
This guideline formerly stated:
“Review our recommended best practices for images, video, and structured data.”
The sixth guideline also keeps its meaning but is worded differently:
“When using a content management system (for example, Wix or WordPress), make sure that it creates pages and links that search engines can crawl.”
The old version:
“If your company buys a content management system, make sure that the system creates pages and links that search engines can crawl.”
The seventh guideline underlines the importance of CSS and JavaScript when it comes to page rendering:
“To help Google fully understand your site’s contents, allow all site assets that would significantly affect page rendering to be crawled: for example, CSS and JavaScript files that affect the understanding of the pages. The Google indexing system renders a web page as the user would see it, including images, CSS, and JavaScript files. To see which page assets that Googlebot cannot crawl, or to debug directives in your robots.txt file, use the blocked resources report in Search Console and the Fetch as Google and robots.txt Tester tools.”
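To check this on your own pages, you can collect the CSS and JavaScript files a page references and test each one against your robots.txt, as in the sketch below. It uses only the Python standard library; the example.com page is a placeholder, and the blocked resources report in Search Console remains the authoritative check.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen
from urllib.robotparser import RobotFileParser

page = "https://www.example.com/"  # placeholder: use a page on your own site

class AssetCollector(HTMLParser):
    """Collects the stylesheet and script URLs a page references."""
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and "stylesheet" in (attrs.get("rel") or "") and attrs.get("href"):
            self.assets.append(attrs["href"])
        elif tag == "script" and attrs.get("src"):
            self.assets.append(attrs["src"])

collector = AssetCollector()
collector.feed(urlopen(page).read().decode("utf-8", errors="replace"))

rp = RobotFileParser(urljoin(page, "/robots.txt"))
rp.read()

for asset in collector.assets:
    asset_url = urljoin(page, asset)
    if not rp.can_fetch("Googlebot", asset_url):
        print("Blocked for Googlebot:", asset_url)
```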
The eighth guideline adds some points about URL parameters:
“Allow search bots to crawl your site without session IDs or URL parameters that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.”
The ninth guideline is an entirely new addition to Google webmaster guidelines document:
“Make your site’s important content visible by default. Google is able to crawl HTML content hidden inside navigational elements such as tabs or expanding sections, however we consider this content less accessible to users, and believe that you should make your most important information visible in the default page view.”
The message is clear: content that is visible by default is the content that counts, and it matters more than content that is less accessible to users.
The tenth guideline is very interesting because, for the first time, Google uses the term nofollow in the Webmaster Guidelines:
“Make a reasonable effort to ensure that advertisement links on your pages do not affect search engine rankings. For example, use robots.txt or rel=”nofollow” to prevent advertisement links from being followed by a crawler.”
The previous version:
“Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google’s AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.”
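If you serve paid links from your own templates, a quick audit script can flag advertisement links that are missing rel="nofollow". In the sketch below, the list of ad domains and the sample markup are purely hypothetical; adapt them to wherever your advertisement links actually point.

```python
from html.parser import HTMLParser

# Hypothetical domains where your paid/advertisement links point.
AD_DOMAINS = ("sponsor-example.com", "adnetwork-example.net")

class AdLinkChecker(HTMLParser):
    """Flags advertisement links that are missing rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.missing_nofollow = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        rel = (attrs.get("rel") or "").lower()
        if any(domain in href for domain in AD_DOMAINS) and "nofollow" not in rel:
            self.missing_nofollow.append(href)

# Made-up markup: this ad link would be flagged because it lacks rel="nofollow".
sample = '<a href="https://sponsor-example.com/deal">Visit our sponsor</a>'
checker = AdLinkChecker()
checker.feed(sample)
print('Ad links missing rel="nofollow":', checker.missing_nofollow)
```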
The first guideline in the next section, which covers helping visitors use your pages, explicitly tells you to use a few words of alt text:
“Try to use text instead of images to display important names, content, or links. If you must use images for textual content, use the alt attribute to include a few words of descriptive text.”
The previous version:
“Try to use text instead of images to display important names, content, or links. The Google crawler doesn’t recognize text contained in images. If you must use images for textual content, consider using the alt attribute to include a few words of descriptive text.”
The second guideline is also slightly changed:
“Ensure that all links go to live web pages. Use valid HTML.”
The previous version was worded a little softer:
“Check for broken links and correct HTML.”
With this change, Google wants to emphasize the importance of actually using valid links and HTML, rather than merely checking for problems.
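Checking for dead links is easy to script. The sketch below sends a lightweight HEAD request to each URL and reports anything that does not come back with a 200 status; the links listed are placeholders, and some servers only respond properly to GET.

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

# Placeholder links; in practice, feed in the links extracted from your own pages.
links = [
    "https://www.example.com/",
    "https://www.example.com/this-page-does-not-exist",
]

for link in links:
    try:
        status = urlopen(Request(link, method="HEAD"), timeout=10).status
        print(f"{status}  {link}")
    except HTTPError as err:
        print(f"{err.code}  BROKEN  {link}")
    except URLError as err:
        print(f"UNREACHABLE  {link}  ({err.reason})")
```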
The new version of the third guideline:
“Optimize your page loading times. Fast sites make users happy and improve the overall quality of the web (especially for those users with slow Internet connections). Google recommends that you use tools like PageSpeed Insights and Webpagetest.org to test the performance of your page.”
The old version:
“Monitor your site’s performance and optimize load times. Google’s goal is to provide users with the most relevant results and a great user experience. Fast sites increase user satisfaction and improve the overall quality of the web (especially for those users with slow Internet connections), and we hope that as webmasters improve their sites, the overall speed of the web will improve.”
Notice that in the new version Google explicitly recommends that webmasters improve page loading times and use specific tools to test page performance.
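PageSpeed Insights and Webpagetest.org give the full picture, but even a crude timing of the raw HTML download can reveal an obviously slow page. The sketch below measures only that download time, not full rendering, and the URL is a placeholder.

```python
import time
from urllib.request import urlopen

url = "https://www.example.com/"  # placeholder: use your own page

start = time.perf_counter()
body = urlopen(url, timeout=30).read()  # raw HTML only; no images, CSS, or JavaScript
elapsed = time.perf_counter() - start

print(f"Downloaded {len(body)} bytes in {elapsed:.2f} seconds")
```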
The fourth guideline is brand new:
“Design your site for all device types and sizes, including desktops, tablets, and smartphones. Use the mobile friendly testing tool to test how well your pages work on mobile devices, and get feedback on what needs to be fixed.”
The importance of mobile-friendly sites for SEO has already been mentioned many times, but thanks to this guideline it is now official as well.
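The mobile friendly testing tool is the definitive check, but one quick local signal is whether a page declares a viewport meta tag at all. The sketch below looks for that tag; a missing tag is a strong hint the page was never adapted for small screens, while its presence alone proves nothing. The URL is a placeholder.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

url = "https://www.example.com/"  # placeholder: use your own page

class ViewportFinder(HTMLParser):
    """Remembers the content of the viewport meta tag, if any."""
    def __init__(self):
        super().__init__()
        self.viewport = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "viewport":
            self.viewport = attrs.get("content") or ""

finder = ViewportFinder()
finder.feed(urlopen(url).read().decode("utf-8", errors="replace"))
print("Viewport meta tag:", finder.viewport if finder.viewport is not None else "MISSING")
```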
The fifth guideline is almost completely the same as the old version:
“Ensure that your site appears correctly in different browsers.”
This is the old one:
“Test your site to make sure it appears correctly in different browsers.”
This is an entirely new Webmaster rule:
“If possible, secure your site’s connections with HTTPS. Encrypting interactions between the user and your website is a good practice for communication on the web.”
This guideline is straight to the point, so you should definitely make your website secure using HTTPS as soon as possible.
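Once HTTPS is in place, it is worth confirming that plain HTTP requests are actually redirected to it. The short sketch below follows redirects from an http:// URL and reports where it ends up; the domain is a placeholder for your own.

```python
from urllib.request import urlopen

http_url = "http://www.example.com/"  # placeholder: your own domain over plain HTTP

# urlopen follows redirects automatically; the final URL shows where we ended up.
final_url = urlopen(http_url, timeout=10).geturl()

if final_url.startswith("https://"):
    print("OK: plain HTTP redirects to", final_url)
else:
    print("WARNING: the site is still served over", final_url)
```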
The final guideline covered in the document addresses accessibility:
“Ensure that your pages are useful for readers with visual impairments, for example, by testing usability with a screen-reader.”
Getting SEO right is a big job. It’s also profitable and critical to your survival. Getting it wrong is deadly. If your SEO isn’t following the rules, it’ll not only cost you your hard-earned rankings, it could get your site removed from Google’s index entirely.
Want to improve targeted traffic to your site? Beat your competition in the local market? Build brand awareness?
Give us a call or shoot us an email. We’ll provide a complimentary website SEO audit and strategy session to help you get clear on how your site is performing in the search engines and what, if anything, you can do to improve your results.