Major search engines provide SEO guidelines and best practices for webmasters to improve their websites. The leading search engines by search market share include Google at 91.94%, Bing at 2.86%, Yahoo at 1.5%, Baidu at 1.37% and Yandex at 1.1%.
In this article we have combined the best practices recommended by the major search engines. These guidelines help SEO experts rank their websites higher. Follow them to stay on the safe side and keep your website's SEO 100% white hat.
General SEO Guidelines
All major search engines publish guidelines for webmasters. These guidelines explain how search engines crawl, index and rank your website.
All webmasters, designers, developers and SEO experts should read them carefully. Search engines also declare certain practices to be violations of their guidelines. Using these practices on your website may lead to several issues, including:
- Your ranking may be demoted by a search engine algorithm.
- Your site may be hit by a manual spam action.
- Your site may be penalized or removed entirely from search results.
Search engines hate spam. If a website is hit by a manual spam action, it will no longer appear in search results, so pay very close attention to these guidelines.
Prioritize your goal
The goal of a website is very important. Remember, a search engine ranking is not the goal in itself. Whether you want to sell a product or a service on your website, what you really need is people, real humans. Create web pages for your target customers, never for search engines.
Make sure that your website stands out. Make it engaging, valuable and unique, and never try to deceive your users with tricks aimed at gaining search rankings.
How to help Search Engines find your website?
First of all, we need to tell search engines about our new website. There are a couple of ways to do this, and I recommend implementing both of the following methods:
- Go to the webmaster tools of each search engine and ask it to crawl your pages.
- Create a social media presence for your website on popular social platforms.
Make sure the pages you create on these social platforms include your website's address.
How to help Search Engines find your pages?
Every step in search engine optimization is important, and each step should be completed properly.
Sitemap
Make sure that your CMS creates a valid sitemap. All major search engines accept XML (Extensible Markup Language) sitemaps. A sitemap lists all the pages available on your website, including the last-modified date of each page.
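A minimal XML sitemap following the sitemaps.org protocol might look like this (the URLs and dates are placeholders for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <lastmod>2021-05-20</lastmod>
  </url>
</urlset>
```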
It is also recommended that you create a human-readable sitemap page on your website, containing links to all of your pages. It helps visitors navigate your website easily.
Orphaned Pages
Every page on your website should be reachable through an HTML link; a page that cannot be reached through a link from your website is called an orphaned page. Use anchor text that is relevant to the target page, and if you are linking through an image, use a proper alt attribute. The recommended practice is to use the HTML anchor tag with the href attribute.
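For example, a crawlable text link and an image link with an alt attribute might look like this (the URL and file name are placeholders):

```html
<!-- Text link with anchor text relevant to the target page -->
<a href="https://www.example.com/acer-laptops">Acer laptop reviews</a>

<!-- Image link: the alt text stands in for the anchor text -->
<a href="https://www.example.com/acer-laptops">
  <img src="acer-laptop.jpg" alt="Acer laptop">
</a>
```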
Number of Links
Use a reasonable number of links on a page, depending on the length of the text on that page. While an excessive number of links does not send a negative signal to search engines by itself, it does affect user experience. Visitors to your website should feel comfortable with the text-to-link ratio.
Robots.txt
Make sure that you have a robots.txt file. When search engines crawl your website, they check your robots.txt file first. Here you can set specific instructions for different search engine crawlers, telling them which pages should be crawled and which should not. As a best practice, for example, do not allow search engines to crawl unnecessary pages such as your website's internal search results page.
Keep your robots.txt file up to date. You can check its coverage and syntax with Google's robots.txt Tester.
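A simple robots.txt sketch along these lines, blocking an internal search page while allowing everything else (the path and sitemap URL are examples; adjust them for your own site):

```text
User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```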
If-Modified-Since HTTP Header
When choosing your web server, verify that it supports the If-Modified-Since HTTP header. This feature helps search engines determine whether a web page has been updated. It can save crawl budget for search engines and save overhead and bandwidth for your website as well.
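The exchange looks roughly like this: the crawler sends a conditional request, and if the page is unchanged the server answers with a bodiless 304 instead of the full content (headers simplified for illustration):

```http
GET /about HTTP/1.1
Host: www.example.com
If-Modified-Since: Sat, 29 May 2021 19:43:31 GMT

HTTP/1.1 304 Not Modified
```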
Let Search Engines understand your pages
Content is King
Think about your visitors: what information do they need from you? Always try to create information-rich, useful pages that provide accurate and clear content.
Key phrases
What would users type to search for your pages? Use those relevant words (key phrases) within the content of your web page.
Title Element
Accuracy is very important. Ensure that the title element of each page is accurate, specific and descriptive.
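For example, prefer a specific, descriptive title over a generic one (the product name here is made up):

```html
<head>
  <!-- Specific and descriptive -->
  <title>Acer Aspire 5 Review: Specs, Price and Benchmarks</title>
  <!-- Avoid generic titles such as <title>Home</title> -->
</head>
```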
Page Hierarchy
Website hierarchy is important: a clear, conceptual page hierarchy encourages users to keep browsing and dig deeper into your website.
Images
Images help create a great user experience on your website.
Use the following best practices for the images on your website:
- Relevant Images: Use images close to the relevant topic. Make sure each image makes sense and helps users understand your topic better.
- Important information inside images: Never place important headings or text content, such as page headings, inside your images.
- Optimize Images: High-quality images are very important for users; sharp images make more sense than blurry or pixelated ones. Keep image sizes in kilobytes rather than megabytes, as heavy images increase page load time.
- Image Names: Always name your images carefully. If the picture is of a laptop, name it "acer-laptop.jpg" rather than "IMG000233.jpg". Names should be accurate and relevant.
- Image Alt Tags: Be descriptive about your images. For the laptop example above, use an alt tag like "Acer laptop", "Silver Acer laptop" or "Silver Acer laptop placed on a table".
- Responsive Images: Make sure images look good on all devices and screen sizes. Use proper responsive image techniques to provide the best user experience on every device.
- Structured Data: Remember to use proper structured data for your images. It helps search engines understand them much better.
- Image Sitemap: Always use a dedicated image sitemap so that all of your images get indexed.
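Several of the points above can be combined in a single img element. Here is a sketch using srcset for responsiveness, a descriptive file name and descriptive alt text (the file names are examples):

```html
<img src="acer-laptop-800.jpg"
     srcset="acer-laptop-400.jpg 400w,
             acer-laptop-800.jpg 800w,
             acer-laptop-1600.jpg 1600w"
     sizes="(max-width: 600px) 400px, 800px"
     alt="Silver Acer laptop placed on a table">
```

The browser picks the smallest candidate that satisfies the sizes hint, which keeps downloads in kilobytes on small screens.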
Video
Videos make things much easier for users and create better user experiences. Search engines have specific guidelines for videos on your web pages as well.
- Public Page: Your video should be placed on a public page that search engines can access, so the video can be indexed.
- Dedicated Page: Ensure that each of your videos has its own dedicated page that prominently features the video's topic.
- Third-Party Video: You may use videos from third-party platforms like YouTube, Vimeo or Facebook. Search engines may index these videos both from your website and from the third-party platforms.
- Structured Data: Always use proper structured data for your videos.
- Video Sitemap: A dedicated video sitemap is very important for all of your videos to be indexed properly.
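A video sitemap entry can be sketched as follows, using the sitemap video extension namespace (all URLs, titles and descriptions are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://www.example.com/laptop-review-video</loc>
    <video:video>
      <video:thumbnail_loc>https://www.example.com/thumbs/laptop.jpg</video:thumbnail_loc>
      <video:title>Acer Laptop Review</video:title>
      <video:description>A hands-on review of the Acer laptop.</video:description>
      <video:content_loc>https://www.example.com/videos/laptop.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
```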
Structured Data
Search bots work hard to understand the data on your pages. Structured data helps search engines understand the content of your page through specific standards. Structured data is usually placed in a <script> tag in the head or body section of the page.
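A minimal JSON-LD example using the schema.org vocabulary, placed in a script tag as described above (the headline, author and date are placeholder values):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "SEO Guidelines and Best Practices",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2021-06-01"
}
</script>
```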
Crawlable Pages
Make sure all the pages you create are crawlable by search engines by default, especially when you are using a content management system (CMS) like WordPress, Wix or Shopify.
Site Assets
Many assets on your website affect how search engines understand your pages, including images, videos, CSS and JavaScript files. Search bots want to see how your pages will display for users. Google provides a URL Inspection tool where you can see how the Google spider reads your page.
URL Parameters
Make sure that search bots can access your page URLs without any additional parameters. URLs that differ only in parameters may count as duplicates, and search bots may not be able to eliminate the extra parameters on their own.
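One common way to handle parameterized duplicates is a canonical link in the page head, pointing every parameter variant at the clean URL (the URLs here are examples):

```html
<!-- Served on https://www.example.com/laptops?sort=price&page=2 -->
<head>
  <link rel="canonical" href="https://www.example.com/laptops">
</head>
```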
Visible Content
The recommended practice is not to place important content inside expanding sections or tabs, which are usually managed by JavaScript or jQuery. Search bots consider such content less accessible to users, so never hide important content in these types of elements.
Sponsored Advertisement
If you are using paid advertisements, these advertisement links can cause issues for your SEO efforts. It is recommended that you spend some time making sure these links are not being indexed. You may use rel="sponsored", rel="nofollow" or your robots.txt file to handle these links.
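For example, an advertisement link can be marked so that it passes no ranking signal (the URL is a placeholder):

```html
<!-- Paid placement: mark it as sponsored (rel="nofollow" also works) -->
<a href="https://advertiser.example.com/offer" rel="sponsored">Special offer</a>
```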
Think of your website visitors
- Valid HTML: When we talk about user experience, make sure there are no errors on your website and that every page has valid HTML. Use the W3C Markup Validator to check validity. Beyond SEO guidelines, keeping your website's markup clean is simply good practice.
- Website Speed: Everyone likes fast websites. Improved speed makes users happy and ultimately improves the overall quality of the website. You can check your website's speed with WebPageTest or Google's PageSpeed.
- Responsive Websites: Smartphones now account for a large share of searches, and users love to search from their mobile devices. Make sure your website provides the best user experience on all screen sizes and device types, including smartphones, tablets and desktops.
- Browser Friendly: There are many browsers, including Chrome, Firefox, Internet Explorer and Safari. Ensure that your website performs well in all of them.
- HTTPS: Use proper HTTPS encryption for your website. It is good practice and creates a better user experience.
Good practices as per SEO guidelines
- Monitor Website: Keep a close eye on your website. Monitor it for any hacking attacks and remove any injected auto-generated content as soon as possible.
- User Generated Spam: Make sure no user-generated spam remains on your website. Prevent it where you can and remove it as soon as possible.
Things to Avoid
- Cloaking: Showing a page one way to search engines and another way to users, for example serving users a page with images while serving search engines a text-only version of the same page.
- Doorway pages: Pages created to rank for similar search queries; they can lead to multiple near-identical pages in search results. Stay away from them.
- Scraped content: Content taken (scraped) from other websites to bulk up your own, without any added value or original content.
- Sneaky redirects: Redirects can be useful in many cases, but using them to show different pages to users and to search engines is a bad signal.
- Hidden text/links: For example, black text on a black background. It is invisible to your users, but search engines will still read it as text. Hidden text and links are a violation of SEO guidelines and may get your website penalized.
- Irrelevant keywords: The classic example is "keyword stuffing": adding keywords that do not fit the content of the page.
- Participating in link schemes: Buying links for your website, selling links from it, and so on. Any kind of link scheme participation is against SEO guidelines and may put your rankings at risk.
- Abusing structured data markup
- Automatically generated content
- Sending automated queries to search engines
- Affiliate Programs: without adding sufficient value
- Creating pages with very little or no original content
- Malicious Behavior: Installing Trojans, viruses and other badware
Conclusion
I am a freelance SEO consultant for small businesses, working with 100% white hat SEO. All of the SEO guidelines mentioned above are practiced in my SEO packages. Let me know in the comments below about your personal experiences and best practices.