Google’s Webmaster Guidelines As An SEO Guide
If you want Google to be able to locate, index, and rank your site properly, stick to Google’s general SEO guidelines described below. You can carry out the following search engine optimization tactics yourself, or you can hire Faulkner Marketing SEO to perform these services for your business.
If you want to rank accurately and avoid penalties, we recommend sticking closely to the summary of Google’s Quality Guidelines you’ll find here. Engaging in deceptive practices that violate the guidelines can lead to manual or automated penalties, up to and including total removal from Google’s index. Sites judged to be engaging in spam may be dropped entirely from the search results at google.com as well as the company’s other search properties.
Locating Your Pages
Every page on your site should be reachable from a link on another page which Google’s bots can find. Links need to include text that is relevant to the page they point to. In the case of image links, you should use an alt attribute to provide relevant text.
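For example, a crawlable text link and an equivalent image link might look like the following sketch (the URLs and filenames are placeholders, not real pages):

```html
<!-- Text link: the anchor text describes the page it points to -->
<a href="/pricing/">View our pricing plans</a>

<!-- Image link: the alt attribute supplies the equivalent relevant text -->
<a href="/pricing/">
  <img src="/images/pricing-banner.png" alt="View our pricing plans">
</a>
```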
Build a sitemap that’s linked to all of your site’s important pages. You should provide this in the form of both an internal sitemap file and a visitor-viewable page (e.g. a site index page).
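The internal sitemap file is typically an XML document following the sitemaps.org protocol. A minimal sketch, with placeholder URLs and dates, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```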
Do not overload any one page with links. Keep the total to a reasonable number – more than a few thousand on a single page is problematic.
Your server should be configured to support the If-Modified-Since HTTP header. This header lets Google’s bots ask whether a page has changed since it was last crawled; if it hasn’t, your server can answer with a lightweight 304 Not Modified response instead of resending the whole page, saving you both overhead and bandwidth.
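The decision your server makes is simple: compare the timestamp the bot sends against the page’s last-modified time. A minimal Python sketch of that logic (the function name and dates are illustrative, not part of any real framework):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def conditional_status(last_modified, if_modified_since):
    """Return 304 if the page hasn't changed since the client's timestamp, else 200."""
    if if_modified_since:
        try:
            client_time = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            # Malformed header: ignore it and serve the full page
            return 200
        if last_modified <= client_time:
            return 304  # Not Modified: headers only, no body resent
    return 200  # Serve the full page

# Example: page last changed on 1 March 2024 (placeholder date)
page_mtime = datetime(2024, 3, 1, tzinfo=timezone.utc)
print(conditional_status(page_mtime, "Wed, 05 Jun 2024 00:00:00 GMT"))  # 304
print(conditional_status(page_mtime, "Mon, 01 Jan 2024 00:00:00 GMT"))  # 200
```

In practice your web server (Apache, nginx, etc.) handles this automatically for static files; the sketch just shows what the exchange accomplishes.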
You can exercise fine control over the way your site is crawled using your server’s robots.txt file. Exclude potentially-infinite pages such as search results. Make sure your robots.txt file is always up to date. There are tools available to test your robots.txt file to ensure its syntax is correct and its coverage is complete.
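A short robots.txt sketch illustrating these points – the paths and sitemap URL here are hypothetical and should be replaced with your site’s own:

```
# Apply to all crawlers
User-agent: *
# Exclude potentially-infinite pages such as internal search results
Disallow: /search/
Disallow: /cart/

# Point crawlers at the sitemap file
Sitemap: https://www.example.com/sitemap.xml
```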
Making Your Site Easier To Find
- Use Google Search Console to submit a sitemap. This tells Google about your site’s structure and helps to ensure that your pages are fully crawled.
- Notify other sites that should know about your pages once they are online.
Making Your Pages Easier To Understand
- Your site should be useful and full of information. Your pages should be accurately described by the HTML information and metadata that go with them.
- Use logical keywords – those phrases and words that Google users might enter into the search engine when looking for your site – on pages where they are relevant.
- Make titles, headers, and alt attributes as accurate, specific, and descriptive as possible.
- Build your site with a consistent hierarchy of pages.
- Follow Google’s recommendations for incorporating images, videos, and structured data into your site.
- If you’re using a content management system (e.g. WordPress), verify that it’s creating pages and links that are crawlable.
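Structured data is usually added as a JSON-LD block in a page’s head. A minimal sketch using a schema.org LocalBusiness type – the business name, URL, and phone number are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "url": "https://www.example.com/",
  "telephone": "+1-555-555-0100"
}
</script>
```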
SEO: Best Practices
Don’t track search bots with session IDs or URL parameters when they crawl your site. These techniques deliver useful information when applied to human visitors, but bots behave very differently: every session-tagged address looks like a new URL, producing duplicate URLs (multiple addresses pointing to the same page). Bots that can’t consolidate those duplicates may miss parts of your site.
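One common way to help crawlers consolidate duplicate URLs – for example, addresses that differ only in session or tracking parameters – is a canonical tag in each page’s head. The URL below is a placeholder:

```html
<!-- Tells crawlers which address is the authoritative version of this page -->
<link rel="canonical" href="https://www.example.com/products/widget/">
```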
All of the important content on your site should be visible by default. While Google still crawls hidden content contained in the HTML for tabs and expanding sections, it gives that content less weight. In the search engine’s eyes, the information visible in your default page view takes priority.
You should take measures to prevent your site’s ad links from having any effect on your search engine rankings. You can, for instance, stop crawlers from following ad links by adding rel=”nofollow” to them individually or by disallowing the ad-serving URLs in your robots.txt file.
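The per-link approach looks like this sketch – the ad URL is a placeholder:

```html
<!-- rel="nofollow" tells crawlers not to pass ranking credit through this link -->
<a href="https://ads.example.com/click?id=123" rel="nofollow">Sponsored link</a>
```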
Avoid placing important information – names, content, links, etc. – in images when it can be conveyed through text. If you find it unavoidable to include text in images, make sure the images have alt attributes with robust descriptions.
Use valid HTML throughout your site and verify that all of your links are working.
Do everything you can to keep your page loading times to a minimum. Fast-loading pages reduce visitor frustration, and not all internet connections are created equal: visitors on slower connections will especially appreciate optimized loading times. You can test your pages’ performance with tools such as webpagetest.org or PageSpeed Insights.
Your site should be fully usable for visitors regardless of their device. The share of traffic now coming from mobile devices (e.g. smartphones and tablets) alongside desktops is too large to ignore. Fortunately, you can use a variety of tools to evaluate your site’s performance on mobile devices. Make fixes where necessary and improvements where possible.
Check your site’s compatibility with multiple browsers.
Use a secure connection (HTTPS) for your site whenever possible. Encrypted web communication is fast becoming a standard practice, especially where users are expected to submit information of their own.
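Once HTTPS is set up, plain-HTTP requests should be redirected to the secure version. A sketch of how this might look in an nginx server block – the domain is a placeholder, and your server software’s syntax may differ:

```nginx
server {
    listen 80;
    server_name www.example.com;
    # Permanently redirect all plain-HTTP requests to the HTTPS version
    return 301 https://www.example.com$request_uri;
}
```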
Run your site through a screen reader to test its usability for visually impaired visitors.
Google’s Guidelines For Website & SEO Quality
The following general guidelines describe some of the common types of manipulative and deceptive behavior encountered in SEO. They are not exhaustive; there may be other forms of deceptive behavior that will draw Google’s wrath. Do not assume that a given technique is safe (or approved of by Google) simply because it’s not listed here. “Playing it safe” by adhering to the spirit of the basic principles listed below will lead to a better user experience. This translates into stronger rankings than those received by sites that are constantly focused on gaming the system.
You can file a spam report with Google if you have reason to believe that another site is abusing the search engine’s quality guidelines. Google doesn’t like to engage in case-by-case spam-busting; it prefers to develop automated solutions which can be applied broadly. Direct action in response to every report is not guaranteed, but reports of issues that severely degrade user experience can result in penalties up to and including complete de-indexing of the offending site. Note this is the atom bomb of spam report responses; there are many other ways Google could take action which are much more subtle.
- Users should be the primary audience for pages, not bots.
- Users must not be deceived.
- Don’t play tricks in an effort to earn higher rankings. As a basic ethics check, consider whether or not you’d feel comfortable describing your actions to a Google employee or a direct competitor. Another good hypothetical to consider is whether or not you would take a given action if there weren’t such things as search engines.
- Users value uniqueness. Concentrate on the distinguishing features that separate your site from others.