How to use the robots file to enhance the weight of the home page




So when a site has too many 404 pages and still lets search engines index them, the site's weight leaks onto those 404 pages; they should therefore be blocked.

Add the rule: Disallow: /404.html
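In the robots file itself, this rule sits inside a user-agent group; a minimal sketch (the catch-all User-agent line is an assumption, since the original only quotes the Disallow rule):

```text
# Keep the 404 error page out of search engine crawls
# (the catch-all user-agent group is assumed, not from the original)
User-agent: *
Disallow: /404.html
```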

Every site has some 404 error pages; they exist so that users who land on a broken URL are guided to the right page instead of being lost. A large site is bound to have many error URLs, which means many similar 404 pages, such as 贵族宝贝daochengrc贵族宝贝/404.html and 贵族宝贝yongjiangrc贵族宝贝/404.html.

The crawl rules in the site's robots file have been refined step by step: spiders are already barred from the site's images, from member-privacy pages (résumés), from useless pages (old promotion pages), and from CSS files. Even so, some pages that spiders do not need to crawl remain open; these pages are purely user-facing, carry no meaning for search engines, and only divide up the weight of other pages.


Take the pages linked at the bottom of the site, such as "Market Cooperation", "Website Statement", and "Payment Methods": they exist for customers, and almost no user reaches the site by searching for them. The same holds for some of the navigation and content pages displayed across the site; they likewise disperse weight.

Add the rules:



These pages all sit under the same /main directory. Apart from a few pages kept open to spiders, the rest of the directory can be blocked. Pages to retain: "About Us" (main/aboutus.asp) and "Friendly Links" (main/friendlink.asp). The "Standard Tariff" and "Payment" pages under the enterprise member center likewise do not need to be open to search engines.
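Putting those decisions into robots rules might look like the sketch below. Only the two paths named in the text are used; the member-center page paths are not given, so no rules are invented for them. The Allow lines are listed before the Disallow line because some parsers apply the first rule that matches, and note that not every crawler honors Allow at all:

```text
User-agent: *
# Keep "About Us" and "Friendly Links" crawlable
Allow: /main/aboutus.asp
Allow: /main/friendlink.asp
# Block everything else under /main/
Disallow: /main/
```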


2. Some navigation pages

1. 404 pages


The robots file is the first file a search engine reads when it visits a website: it tells the crawler which parts of the site may be crawled and which may not.
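The effect of such rules can be checked with Python's standard urllib.robotparser; a minimal sketch, assuming the example rules discussed above (all paths are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt combining the rules discussed in the article.
# Allow lines come first: Python's parser applies the first matching rule.
robots_txt = """\
User-agent: *
Allow: /main/aboutus.asp
Allow: /main/friendlink.asp
Disallow: /404.html
Disallow: /main/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())  # parse the rules without fetching anything

# The 404 page and most /main/ pages are closed to crawlers...
blocked_404 = parser.can_fetch("*", "/404.html")       # False
blocked_main = parser.can_fetch("*", "/main/old.asp")  # False
# ...while the retained pages stay crawlable.
allowed_about = parser.can_fetch("*", "/main/aboutus.asp")  # True

print(blocked_404, blocked_main, allowed_about)
```

A crawler that respects these rules will skip the error and utility pages, so the crawl budget and link weight concentrate on the pages you actually want ranked.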