Try the full version of PR-CY: find errors on internal pages and fix them with the help of the service's tips.
Description:
It is impossible to know exactly how many pages of a site Google has indexed: the search engine does not expose its index on a per-URL basis.
An approximate number of pages in the search results is shown by the site: operator, which this check relies on. The figure can be inflated by pages that are blocked from indexing in robots.txt but still appear in the results because of external links pointing to them.
The “Indexing Status” section in Google Search Console shows a somewhat more accurate number, but that data can also be distorted by applied filters.
Updated 04/23/2017 12:03
For the promotion of commercial sites, confidential exchange of information between the server and visitors is important. It increases the loyalty and trust of potential customers, and it affects conversion and growth of positions in the search results for almost all queries.
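One common way to enforce encrypted connections is to redirect all plain-HTTP traffic to HTTPS at the web server level. A minimal sketch for nginx, assuming a hypothetical domain example.com and placeholder certificate paths:

```nginx
# Redirect all plain-HTTP requests to the HTTPS version of the site.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://example.com$request_uri;
}

server {
    listen 443 ssl;
    server_name example.com;
    ssl_certificate     /etc/ssl/certs/example.com.pem;   # placeholder paths
    ssl_certificate_key /etc/ssl/private/example.com.key;
    # ... the rest of the site configuration ...
}
```

The 301 status tells search engines the move is permanent, so link weight is passed to the HTTPS version.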
Updated 04/23/2017 12:03
Description:
Technically, domains with www and without www are two different resources: search engines index and rank them separately, and links to them carry separate weight. This can lead to duplicate content in the index and diluted link weight.
The problem is solved with a 301 redirect and by specifying the main mirror to the search engines. From a promotion point of view, a domain without www is preferable: it is not a third-level domain, and its URLs are always shorter.
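The 301 redirect described above can be set up at the web server level. A minimal sketch for nginx, assuming a hypothetical domain example.com with the bare domain chosen as the main mirror:

```nginx
# 301-redirect the www mirror to the bare domain, preserving the request path.
server {
    listen 80;
    server_name www.example.com;
    return 301 http://example.com$request_uri;
}
```

On Apache the same effect is usually achieved with a RewriteRule in .htaccess; the key point in either case is the permanent (301) status code.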
Updated 04/23/2017 12:03
Description:
A robots.txt file is a list of restrictions for search robots (bots) that visit a site and crawl its content. Before crawling and indexing a site, robots request the robots.txt file and read the rules in it.
The robots.txt file is located in the root directory of the site. It must be available at the URL: pr-cy.io/robots.txt
There are several reasons to use a robots.txt file on your site: for example, to keep service and duplicate pages out of the index and to point crawlers to the sitemap location.
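A minimal robots.txt illustrating these uses might look like the following; the disallowed paths and the domain are placeholders, not recommendations for any particular site:

```
User-agent: *
Disallow: /admin/
Disallow: /search/

Sitemap: https://example.com/sitemap.xml
```

The `User-agent: *` group applies to all robots; the Sitemap line tells crawlers where to find the list of pages to index.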
Updated 04/23/2017 12:03
At least one sitemap has been found and is available.
Description:
A Sitemap file lists the pages of the site that should be indexed. With it you can tell search engines which URLs to crawl, when each page was last modified, how often it changes, and its relative priority.
Additional Information:
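A minimal sitemap.xml in the standard Sitemaps protocol format, with a placeholder domain, might look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2017-04-23</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Only `<loc>` is required; `<lastmod>`, `<changefreq>`, and `<priority>` are optional hints that search engines may take into account.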
Updated 04/23/2017 12:03
Page load speed directly affects behavioral factors: the faster a page loads, the fewer bounces. The Google robot also visits slow sites less often, so their pages are crawled and re-indexed more rarely, which hurts the effectiveness of promotion. Load speed is one of the main factors in Google ranking.
Updated 04/23/2017 12:03
Found 103 errors and 2 warnings.
Description:
Error-free code is code that conforms to the W3C standards. Pages with valid code render correctly in the browser, which means better behavioral factors and higher positions in the search results.
Additional Information:
Updated 04/23/2017 12:03
Description:
When a non-existent page is requested, the server should return a 404 ("page not found") status code.
If the server is configured incorrectly and returns a 200 status instead, search engines treat the page as existing and may index all such error pages.
Configure the site so that requests for non-existent pages return a 404 "page not found" or a 410 "page deleted" response code.
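The difference between a correct 404 and a "soft 404" (an error page served with status 200) can be illustrated with a small self-contained sketch. The server below is hypothetical, built only to demonstrate the status codes; a real site would return these codes from its own web server:

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical minimal server: only "/" exists; every other path
# gets a real 404 status instead of a "soft 404" (200 with an error page).
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/":
            status, body = 200, b"home"
        else:
            status, body = 404, b"page not found"
        self.send_response(status)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

def status_of(path):
    """Return the HTTP status code the server sends for `path`."""
    try:
        with urllib.request.urlopen(f"http://127.0.0.1:{port}{path}") as resp:
            return resp.status
    except urllib.error.HTTPError as exc:
        return exc.code  # urllib raises on 4xx/5xx; the code is what we need

root_status = status_of("/")                 # existing page -> 200
missing_status = status_of("/no-such-page")  # missing page  -> 404
server.shutdown()
```

Running the same kind of request against a misconfigured site would show 200 for the missing page, which is exactly the problem this check detects.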
Updated 04/23/2017 12:03
Description:
When a non-existent page is requested, the server displays the standard 404 error page. For users’ convenience, we recommend creating a custom 404 page and adding a link back to the site on it.
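In nginx, a custom error page can be served while keeping the correct 404 status; a minimal sketch, assuming the custom page lives at the hypothetical path /404.html:

```nginx
# Serve a custom page for missing URLs while still returning status 404.
error_page 404 /404.html;

location = /404.html {
    internal;  # the error page cannot be requested directly
}
```

The important detail is that `error_page` changes only the body shown to the visitor; the 404 status code itself is preserved for search robots.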
Updated 04/23/2017 12:03
Audit of website pages: