
In July of this year, Bing updated its Webmaster Guidelines; the previous revision was several years ago. The guidelines explain how Bing finds, indexes, and evaluates websites. By following them, you can optimize your site to rank for relevant queries in the Bing search engine.
In this article, we will look at how to reach the top of the Bing search results, and at what you should avoid so you do not trip spam filters and say goodbye to the top once and for all.
How Bing finds pages on your site
- Sitemap. A sitemap is an essential file that lets a search engine discover your URLs and content. It also tells the crawler which pages you consider important on the site. Bing strongly recommends using a sitemap file.
Make your sitemap accessible to Bing:
- Upload the sitemap through Bing Webmaster Tools;
- Specify the path to it in your robots.txt file, for example:
Sitemap: http://example.com/sitemap_location.xml
Once the sitemap is submitted, Bing will crawl it regularly. Bing recommends resubmitting it only when the site has changed significantly.
- Use the Bing URL Submission or Content Submission API to get your site's updated content indexed almost instantly, as in the sketch below. If a submission fails, Bing recommends submitting the updated URLs through Bing Webmaster Tools or adding them to your sitemap.
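For illustration, here is a minimal sketch of a batch request to the URL Submission API. The endpoint and body shape follow Bing's published API, but YOUR_API_KEY and the URLs are placeholders; verify the exact endpoint and your daily quota against the current Bing Webmaster documentation.

```http
POST /webmaster/api.svc/json/SubmitUrlBatch?apikey=YOUR_API_KEY HTTP/1.1
Host: ssl.bing.com
Content-Type: application/json; charset=utf-8

{
  "siteUrl": "http://example.com",
  "urlList": [
    "http://example.com/new-page",
    "http://example.com/updated-article"
  ]
}
```

The API key is generated in the API access settings of Bing Webmaster Tools.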
- Links. Links are treated as a measure of a site's popularity: the more links point to a site, the more popular it is considered. To get more sites linking to yours, create high-quality, unique content. The Bing crawler (Bingbot) examines all the links on your site to discover new pages.
- Crawlable links are <a> tags with an href attribute. A link must contain either anchor text or an image whose alt attribute is relevant to the target page;
- Limit the number of links on a page; no more than a few thousand links per page is recommended;
- Sponsored links and links to paid resources must carry the attribute rel="nofollow", rel="sponsored", or rel="ugc" (see the example after this list);
- Bing values naturally earned links, that is, links from reliable and relevant sites that bring real users to yours.
Bing fights artificial links, such as purchased links and links that drive bot traffic rather than real users. Any scheme that unnaturally inflates the number of inbound links may result in your site being penalized and excluded from the Bing index.
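As an illustration of the link rules above, here is a short HTML sketch; the URLs and anchor texts are made up:

```html
<!-- Crawlable editorial link: an <a> tag with href and descriptive anchor text -->
<a href="https://example.com/sitemap-guide">Guide to sitemaps</a>

<!-- Paid placement: marked so crawlers do not treat it as an editorial endorsement -->
<a href="https://example.com/partner-offer" rel="sponsored">Partner offer</a>

<!-- User-generated link, e.g. from a comment section -->
<a href="https://example.com/user-profile" rel="ugc nofollow">User profile</a>
```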
- Limit the number of URLs. Bing recommends keeping the number of URLs on your site to a reasonable level, although it does not give a specific figure. The main goal here is to avoid duplicate content.
To prevent duplicates, it is recommended to follow these rules:
- When different URLs serve the same content, point them to one preferred URL with rel="canonical" (see the snippet after this list).
- Configure site settings and URLs to improve crawl efficiency and reduce the number of variations of the same URL pointing to the same content.
- Avoid using different URLs for the same page on the mobile and desktop versions of the site.
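A minimal sketch of the rel="canonical" approach, assuming a hypothetical product page whose filtered views all serve the same content:

```html
<!-- Placed in the <head> of http://example.com/product?color=red,
     http://example.com/product?sort=price, and similar variants -->
<link rel="canonical" href="http://example.com/product" />
```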
- Use redirects. When moving content permanently, for example from HTTP to HTTPS, use a 301 redirect. If content is moved temporarily, i.e. for less than one day, use a 302. Bing asks that you use redirects correctly and not substitute the rel="canonical" attribute for a redirect when content has actually moved. A server-side sketch follows.
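What this looks like in practice depends on your server; below is a hedged nginx sketch (example.com and the paths are placeholders):

```nginx
server {
    listen 80;
    server_name example.com;

    # Temporary move (302): content relocated for less than a day
    location = /old-page {
        return 302 https://example.com/temporary-page;
    }

    # Permanent move (301): send all remaining HTTP traffic to HTTPS
    location / {
        return 301 https://example.com$request_uri;
    }
}
```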
- Let Bing crawl more. The Crawl Control feature in Bing Webmaster Tools lets you manage how Bingbot crawls your site: in the settings you can choose when and at what rate it is crawled.
- JavaScript. Bing can render JavaScript in most cases; however, it cannot process it at scale on every page while keeping the number of HTTP requests down, so Bing recommends dynamic rendering for large JavaScript-heavy sites, as sketched below.
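One common way to implement dynamic rendering is to route crawler user agents to a prerendering service while regular visitors get the JavaScript app. This nginx sketch assumes a prerender service already running on local port 3000; it is an illustration, not Bing's prescribed setup:

```nginx
# Crawler detection by User-Agent (simplified to Bingbot here)
map $http_user_agent $is_bot {
    default   0;
    ~*bingbot 1;
}

upstream prerender {
    server 127.0.0.1:3000;  # assumed prerendering service
}

server {
    listen 80;
    root /var/www/app;

    location / {
        # Crawlers receive prerendered static HTML
        if ($is_bot) {
            proxy_pass http://prerender;
        }
        # Users receive the normal single-page app
        try_files $uri /index.html;
    }
}
```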
- Removing content. Return a 404 ("Not Found") response for content that no longer exists. To speed up removal, use the Content Removal and page removal tools in Bing Webmaster Tools.
- Robots.txt. This file tells search engines which pages and files they may crawl. It is recommended to block pages that should not appear in search results, for example the login page or the shopping cart; a minimal sketch follows.
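A minimal robots.txt sketch blocking a hypothetical login page and shopping cart while declaring the sitemap (paths are placeholders):

```
User-agent: *
Disallow: /login
Disallow: /cart

Sitemap: http://example.com/sitemap_location.xml
```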
- Save resources. Use GZIP HTTP compression so that pages are delivered in compressed form; this speeds up page loading. An example configuration is sketched below.
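On nginx, for example, compression can be switched on with a few directives; this is a generic sketch, not a Bing-specific requirement (HTML responses are compressed by default once gzip is on):

```nginx
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;
gzip_min_length 1024;   # skip tiny responses where compression adds overhead
```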
How Bing evaluates content on site pages
Bing looks for content that is clear, meaningful, and engaging, written primarily for humans rather than for search engine crawlers. If you create quality content, Bing is more likely to index it and show it in search results.
- Content. Sites with little useful content that mainly carry affiliate links and display ads rarely make it into Bing's search results, so content is critical to the site. Therefore:
- Update content for users regularly and create new content.
- Publish a sufficient amount of content on each page.
- Work on the uniqueness of your content.
- Use unique videos and images that match the theme of the pages, and optimize them.
- SafeSearch. This option determines whether explicit images and videos are shown or hidden. Bing asks for help in understanding which of your content is adult:
- Using the meta tag <meta name="rating" content="adult"> on adult pages;
- Hosting adult images and videos under a URL path that marks them as adult, such as http://www.example.com/adult/image.png.
- Make your content accessible. Avoid placing content inside Flash or JavaScript, which keep search engine crawlers from reaching it, and make content easy to navigate.
- HTML tags. Pay special attention to the <title> and <meta name="description"> tags and what they contain. The content of these tags must be specific, accurate, and relevant to the page; an example is sketched below.
Bing recommends checking your pages with the Bing SEO Analyzer or the Markup Validation Service.
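A hedged sketch of specific, relevant head tags for a hypothetical page (title and description text are made up):

```html
<head>
  <title>Sitemap Setup for Bing: A Step-by-Step Guide</title>
  <meta name="description"
        content="How to create a sitemap, submit it in Bing Webmaster Tools,
                 and reference it from robots.txt." />
</head>
```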
- Microsoft Edge. Make sure your web pages display correctly in Microsoft Edge.
- Additional resources (CSS, JavaScript). Do not block your stylesheets and scripts in robots.txt, so crawlers can render pages fully. Keep dynamic resource loading such as AJAX within reason to limit HTTP requests on large websites.
- Use semantic markup. Bing recommends Schema.org markup in JSON-LD or Microdata format; RDFa and OpenGraph can also be used. A JSON-LD sketch is shown below.
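A minimal Schema.org sketch in JSON-LD for a hypothetical article page (all values are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How Bing Finds Pages on Your Site",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2021-07-15"
}
</script>
```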
Use the SEO Reports section in Bing Webmaster Tools to monitor errors and recommendations.
How Bing ranks your content
Bing search results are generated by an algorithm that matches the search query a user enters with the content in Bing's index. Bing develops and continually improves its algorithms to provide the most complete, relevant, and useful set of search results possible.
Below is a general overview of the main metrics Bing uses to rank content.
- Relevance. Shows how well the content matches the intent of the search query.
- Quality and credibility. The quality of a web page is determined by factors such as the reputation of the author or site and the level of discourse. Bing may also downgrade a page if its content contains insults or derogatory statements.
- User engagement. Bing takes into account how users interact with search results.
- Freshness. Bing prefers fresher content: a page with up-to-date, relevant data will rank higher.
- Location. Bing takes the user's location into account.
- Page load time. The faster the page loads, the better.
What Bing Penalizes: Abuse and Examples of What to Avoid
Search engine optimization is essential for promoting websites. Above, we discussed the parameters by which Bing ranks sites in search results. However, following the above does not guarantee a place at the top, and abusing some of these techniques can draw penalties from Bing.
Below we look at what a site can be penalized for.
- Cloaking. This is the practice of showing one version of a page to the search robot and a completely different one to the user. The content shown to users must match the content seen by the robot.
- Link schemes, link buying, and link spam. Many schemes can inflate the number of links pointing to your site, but they do not provide quality links and will harm it. Manipulating inbound links to artificially increase their count can get your site excluded from the Bing index.
- Social media schemes. The same applies here as to link schemes.
- Duplicate content. The same content served on multiple URLs can cause Bing to stop trusting those URLs. Use rel="canonical" in such cases.
- Copied content. Content copied from other, more authoritative websites may be treated as copyright infringement.
- Keyword stuffing. Overloading pages with key phrases invites an over-optimization filter.
- Automatically generated content. Content generated by a robot without human involvement may draw a penalty.
- Thin affiliate pages. Sites that merely relay another site's offers without giving the user any useful information of their own.
- Malicious software. Check your site regularly for viruses and malware.
- Misleading markup. Penalties are also possible when structured markup does not match the visible content.
Sites that engage in such practices are considered low quality. As a result, these resources are subject to ranking penalties and may not be selected for indexing.
Users can also report abuse of any of these methods themselves through Bing's feedback link.
Almost all of the above recommendations work the same way in other search engines, so following the guidelines of even one engine goes a long way toward promoting a website as a whole.
To be sure that search robots and users alike will love your site, use the Search Promotion service.