Google Webmaster Guidelines: All you have to Know & Understand

SEO professionals use Google’s Webmaster Guidelines to choose tactics and strategies that align with Google’s expectations. Here’s what you need to know.

Google Webmaster Guidelines are essential to achieving sustainable SEO results using strategies that are in line with Google’s expectations.

If you don’t follow the webmaster guidelines, you can expect to experience either an algorithmic devaluation or an outright manual penalty.

In the most severe cases, you can expect to be banned from Google’s SERPs completely.

Understanding the guidelines in detail is the simplest way to avoid penalties and future damage to your website. Of course, there is always more to learn about how to interpret them.

For the most part, the guidelines are cut-and-dried.

But you do need to revisit the guidelines regularly because Google updates them. Google doesn’t necessarily announce these changes, so you must stay on top of them yourself.

One of the recent changes to the guidelines was Google adding a line about coding your sites in valid HTML and CSS and validating them with the W3C.

This doesn’t mean validation will directly help your rankings (though it can help from a user experience perspective, thanks to better cross-browser compatibility).

This is a different type of guide than most.

The objective of this guide is to offer possible solutions to common problems, so you are equipped with actionable guidance you can use on the next website problem you face.

What Are the Google Webmaster Guidelines?

The guidelines are separated into four categories:

  • Webmaster guidelines.
  • General guidelines.
  • Content-specific guidelines.
  • Quality guidelines.

Webmaster guidelines are the general best practices that help you develop your website so that it appears more easily in Google Search.

The other guidelines cover practices that affect how your site appears in search engines.

General guidelines are the best practices that will help your website look its best in the Google SERPs (search engine results pages).

Content-specific guidelines are more specific to the different types of content on your site, such as images and video.

Quality guidelines cover the techniques that are banned and can get your pages removed from the SERPs.

If that were not enough, using these techniques can also cause a manual action to be raised against your website.

These guidelines are focused on making sure that you do not write spammy content and that you write content for humans rather than search engine spiders.

Creating websites that adhere to Google’s Webmaster Guidelines is a challenge.

But by understanding them fully, you have cleared the first hurdle.

The next hurdle is to apply them to your websites in a way that makes those sites compliant. But, as with any SEO challenge, practice makes perfect!

Make Sure Your Keywords Are Relevant

Your site should be easy to find in search engines. One way to determine keyword relevance is to look at the top-ranking sites and understand the keywords they use in their page titles and meta descriptions.

Another way is to do a competitor analysis of the top-ranking sites in your business category. Several tools can help you recognize how sites use keywords on-page.
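As a rough illustration, you can pull page titles and meta descriptions out of competitor HTML with a short script. This is a minimal sketch using Python’s standard-library HTML parser; the sample page snippet is hypothetical:

```python
from html.parser import HTMLParser

class TitleMetaParser(HTMLParser):
    """Extracts the <title> text and the meta description from an HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            attr_map = dict(attrs)
            if attr_map.get("name", "").lower() == "description":
                self.description = attr_map.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical competitor page snippet for illustration
sample_html = """
<html><head>
<title>Blue Widgets | Example Widget Shop</title>
<meta name="description" content="Buy blue widgets online.">
</head><body></body></html>
"""

parser = TitleMetaParser()
parser.feed(sample_html)
print(parser.title)        # the competitor's page title
print(parser.description)  # the competitor's meta description
```

In practice you would fetch the competitors’ live pages and run each through the parser, then compare the keywords they emphasize against your own.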

Common Issue

You have a site that has zero keywords. The client has given you a list of keywords, and you want to optimize the site for them. The problem is, the copy contains nothing but branded keywords, and zero thought has been given to website optimization.


The solution here is not simple. You would need to perform keyword research and in-depth competitor analysis to find the sweet spot of keyword optimization for your target market.

Make Sure That Pages on Your Site Can Be Reached by a Link from another Findable Page

Google recommends that every page on your site have at least one link from another page. Links make up the World Wide Web, so it makes sense that links are your primary means of navigation. They can come from your navigation menu, breadcrumbs, or contextual links.

Links should also be crawlable. Making your links crawlable ensures a great user experience and that Google can easily crawl and understand your site. Avoid generic anchor text; use keyword phrases that describe the destination page.
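For reference, search engines can reliably follow only real anchor tags with an href attribute; navigation driven purely by JavaScript handlers may not be followed. A quick illustration (the URLs and function name are placeholders):

```html
<!-- Crawlable: a real <a> tag with an href and descriptive anchor text -->
<a href="/guides/keyword-research">Keyword research guide</a>

<!-- Not reliably crawlable: no href, navigation handled by JavaScript -->
<span onclick="goTo('/guides/keyword-research')">Keyword research guide</span>
```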

A siloed website architecture is best because it reinforces the topical relevance of the pages on your site and arranges them in a hierarchical structure that Google can understand.

Common Issue

You run into a site that has orphaned pages everywhere, including within the sitemap.


Make sure every page on your site is linked to from at least one other page on the site. If a page should not be part of your site, either delete it entirely or noindex it.
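If you choose the noindex route, the standard mechanism is a robots meta tag in the page’s head (or the equivalent X-Robots-Tag HTTP header):

```html
<!-- Tells search engines not to index this page -->
<meta name="robots" content="noindex">
```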

Limit the Quantity of Links on a Page to a Reasonable Number

In the past, Google was on record saying that you shouldn’t use more than 100 links per page.

It’s better to have links that are useful to the user, rather than sticking to a specific quantity. In fact, sticking to specific quantities can be harmful if they negatively impact the user experience.

Google’s guidelines now state that you can have a few thousand links (at most) on a page. It’s not unreasonable to assume that Google uses link quantity as a spam signal.

If you want to link over and over again, do so at your peril. That said, John Mueller has stated that Google doesn’t care about the number of internal links, and you can do what you want.

Common Issue

You have a site that has more than 10,000 links per page. This is going to introduce problems when it comes to Google crawling your site.


The fix depends on the scope and type of site you have. If your site needs it, reduce the number of links per page to less than a few thousand.
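To audit this, you can count the anchor tags on a page with a few lines of code. Here is a minimal sketch using Python’s standard library; the sample HTML fragment is hypothetical:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a> tags that carry an href attribute (i.e., real links)."""
    def __init__(self):
        super().__init__()
        self.link_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.link_count += 1

# Hypothetical page fragment for illustration
sample_html = (
    '<nav><a href="/">Home</a><a href="/blog">Blog</a></nav>'
    '<a name="anchor">No href, not counted</a>'
)

counter = LinkCounter()
counter.feed(sample_html)
print(counter.link_count)  # 2
```

Run this against each rendered page; any page whose count climbs into the thousands is a candidate for pruning.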

Use the Robots.txt File to Manage Your Crawl Budget

Crawl budget optimization is an important part of making sure that Google can crawl your site efficiently and easily.

This process makes it easier and more efficient for Google to crawl your site. You optimize your crawl budget in two ways: through the links on your site and through robots.txt.

The method that primarily concerns us at this step is using robots.txt for crawl budget optimization. This guide from Google tells you everything you need to know about the robots.txt file and how it can impact crawling.
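Python’s standard library also ships a robots.txt parser you can use to check how a given set of directives affects crawling. A small sketch with illustrative rules (the paths are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt rules, not from a real site
rules = [
    "User-agent: *",
    "Disallow: /private/",
    "Disallow: /cart/",
]

parser = RobotFileParser()
parser.parse(rules)

# Check whether specific paths are crawlable under these rules
print(parser.can_fetch("*", "/private/checkout"))  # False: blocked path
print(parser.can_fetch("*", "/blog/seo-tips"))     # True: crawlable
```

A check like this is handy before deploying robots.txt changes, so you don’t accidentally block sections you want crawled.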

Common Issue

You run into a site that has the following line in its robots.txt file:

Disallow: /

This means that the robots.txt file is disallowing crawling from the top of the site down.


Delete that line.

Common Issue

You run into a site whose robots.txt has no Sitemap directive. Declaring your sitemap location in robots.txt is considered an SEO best practice.


Make sure you add in a directive declaring the location of your sitemap file, such as the following:
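A minimal example of such a declaration (example.com is a placeholder for your own domain):

```text
# robots.txt: declare the absolute URL of the sitemap file
User-agent: *
Sitemap: https://www.example.com/sitemap.xml
```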


Create a Useful, Information-Rich Site & Write Pages That Clearly and Accurately Describe Your Content

As their guidelines state, Google prefers information-rich sites. This is dictated by industry, so a competition analysis is critical to finding sites that are considered “information-rich.”

This “information-rich” requirement varies from industry to industry, which is why such a competition analysis is required.

The competition analysis should reveal:

  • What other sites are writing about.
  • How they are writing about those topics.
  • How their sites are structured, among other attributes.

With this data, you will be able to create a site that meets these guidelines.

Common Issue

You have a site that is full of thin, short content that is not valuable.

Let’s be clear here, though – word count is not the be-all, end-all factor for content. It’s about content quality, depth, and breadth.

Back to our site – you discover that it’s full of thin content.


A comprehensive content strategy will be necessary in order to overcome this site’s content weaknesses.

Think About the Words Users Would Type to Find Your Pages

When performing keyword research, it is critical to ensure that you figure out how users search for your site. If you don’t know the words that users are searching for, then all of the keyword research in the world is for naught.

This is where effective keyword research comes into play.

When you do effective keyword research, you must consider things like your potential client’s intent when searching for a phrase.

For example, someone earlier in the buying funnel is more likely to be in research mode. They would not search for the same keywords as someone at the end of the buying funnel (i.e., someone who is just about to buy).

In addition, you must also consider your potential client’s mindset – what are they thinking when they are searching for these keywords?

Once you have concluded the keyword research phase of your project, then you must perform on-page optimization. This on-page optimization process usually includes making sure that every page on your site mentions the targeted keyword phrase of that page.

You cannot do SEO without effective keyword research and targeting; without them, you are not really doing SEO.

Common issue

You run into a site that has nothing but branded keyword phrases and hasn’t done much to differentiate itself in the marketplace.

Through your research, you find that they have not updated their blog with a variety of keyword topics, and have instead concentrated only on branded posts.


The solution to this is quite simple: use targeted keyword phrases with broader topical relevance to inform your content, rather than relying on branded keywords.

This goes back to the fundamentals of SEO, or SEO 101: include the keywords that your users would type to find those pages, and make sure your site includes those words on its pages.

This is, in fact, part of Google’s general guidelines for helping users understand your pages.

Design Your Site to Have a Clear Conceptual Page Hierarchy

What is a clear conceptual page hierarchy? This means that your site is organized by topical relevance.

The main topics of your site are arranged as top-level pages, with subtopics arranged underneath them. These are called SEO silos, and they are a great way to organize the pages on your site by topic.

The deeper and clearer the conceptual page hierarchy, the better. This tells Google that your site covers the topic in depth.

There are two schools of thought on this guideline. One believes that you should never stray from a flat architecture, meaning no page should be more than three clicks from the homepage.

The other school of thought involves siloing: creating a clear conceptual page hierarchy that dives deep into the breadth and depth of your topic.

Ideally, you should create a website architecture that makes sense for your topic. SEO siloing helps accomplish this by making your site as in-depth about your topic as possible.

SEO siloing also presents a cohesive organization of topical pages and discussions. Because of this, and because siloing has been observed to create great results, my recommendation is that most sites pursuing topically dense subjects create a silo architecture appropriate to their topic.

Common Issue

You run into a site that has pages strewn all around, without much thought given to organization, linking, or other aspects of website architecture. The pages are haphazardly put together, with little organizational flow.


You can fix this by creating a siloed website architecture in line with what your competitors are doing. The thinking is that this architecture will help reinforce your topical focus and, in turn, improve your rankings through entity relationships between your pages and topical reinforcement.

This topical reinforcement then creates greater relevance for your keyword phrases.

Ensure All Website Assets Are Fully Crawlable & Indexable

You may be thinking: why shouldn’t all assets be fully crawlable and indexable?

Well, there are some situations where blocking CSS (Cascading Style Sheets) and JS (JavaScript) files is acceptable.

  • First, you may block them because they have issues playing nicely with each other on the server.
  • Second, you may block them because of some other conflict.

Either way, Google has guidelines for this situation, too.

Google’s guidelines on this topic state:

“To help Google fully understand your site’s contents, allow all site assets that would significantly affect page rendering to be crawled: for example, CSS and JavaScript files that affect the understanding of the pages. The Google indexing system renders a web page as the user would see it, including images, CSS, and JavaScript files. To see which page assets that Googlebot cannot crawl, use the URL Inspection tool, to debug directives in your robots.txt file, use the robots.txt Tester tool.”

This is important. You don’t want to block CSS and JavaScript.

All elements are critical to ensure that Google fully understands the context of your page.

Many site owners block CSS and JavaScript via robots.txt. Sometimes this is because of conflicts with other site files. Other times, the files cause more problems than not when fully rendered.

If your site files present issues when rendered, it may be time for a full website revamp.

Common Issue

You come across a site that has CSS and JavaScript blocked within robots.txt.


Unblock CSS and JavaScript in robots.txt. And if those files are causing that much conflict (and a mess in general), clean them up; you want as clean an online presence as possible.

Make Your Site’s Important Content Visible by Default

Google’s guidelines talk about making sure that your site’s most important content is visible by default. This means that you don’t want buttons, tabs, and other navigational elements to be necessary to reveal this content.

Google also explains that they “consider this content less accessible to users, and believe that you should make your most important information visible in the default page view.”

Tabbed content – yes, this falls under content that is less accessible to users.


Consider this: you have a tabbed block of content on your page. Only the first tab is fully visible to users until they click the tab at the top to reveal the second one, and so on.

From Google’s perspective, this kind of content is less accessible.

While this may be a fairly small consideration, you don’t want to do this egregiously – especially on your homepage.

Make sure that all tabbed content is fully visible.

Common Issue

You get a website assigned to you that has tabbed content. What do you do?


Recommend that the client create a version of the content that is fully visible.

For example, turn tabbed content into a series of paragraphs running down the page.

Google’s Webmaster Guidelines Are Really Guidelines

Google’s Webmaster Guidelines are just that: guidelines, not necessarily hard rules.

But watch out: if you violate them egregiously, you could be banned from the SERPs outright. I prefer remaining on Google’s good side. Manual actions are ugly.

Don’t say we didn’t warn you.

Of course, penalties can range from algorithmic devaluations to outright manual actions. It all depends on the severity of the violation of that guideline.

And, pages and folders can be devalued when it comes to Penguin issues. Don’t forget that real-time Penguin is inherently very granular in this regard.

But it’s important to note that not all guideline violations result in penalties. Some cause issues with crawling and indexing, which can also impact your rankings. Others, such as spammy links pointing to your site, can result in major manual actions.

When a manual action hits, it’s important to remain calm. You have likely brought this on yourself through link spam or another type of spam on your site.

The best thing you can do now is investigate and work with Google to remove the manual action.

In general, if you have spent a lot of time getting into trouble, expect to spend a similar amount of time getting out of it before you are back in Google’s good graces.

Other times, the site is so bad that the only solution is to nuke it and start over.

Armed with this knowledge, you should be able to identify whether certain techniques have caused you to get into some serious trouble.

As an aside, it is definitely worth following Google’s guidelines from the start.

While results come more slowly than with other, more aggressive methods, we highly recommend this approach. It helps you maintain a more stable online presence, and you won’t have to suffer through a manual action or algorithmic devaluation.

The choice is yours.
