Description:
It is impossible to know exactly how many pages of a site Google has indexed correctly: the search engine does not expose its index as a browsable database of URLs.
An approximate number of pages in the search results is given by the site: operator, which this check relies on. The count can be distorted by pages that are blocked from indexing in robots.txt but still appear in the results because of external links pointing to them.
The “Indexing Status” section in Google Search Console shows a somewhat more accurate number, but that data can also be distorted by the filters applied.
Updated 11/24/2020 17:13
Google scans websites to find infected resources, phishing pages, and other issues that degrade the search results and user experience. With this information, the search engine warns users about unsafe sites. If a site is deemed dangerous, Google might downgrade or remove it.
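You can query the same threat data yourself. Below is a minimal sketch using Google's Safe Browsing Lookup API (v4); the API key and target URL are placeholders you must supply, and the request body follows the public threatMatches:find schema.

```python
# Check a URL against Google's Safe Browsing Lookup API (v4).
# API_KEY and TARGET_URL are hypothetical placeholders.
import json
import urllib.request

API_KEY = "YOUR_API_KEY"
TARGET_URL = "https://example.com"

body = {
    "client": {"clientId": "site-audit-sketch", "clientVersion": "1.0"},
    "threatInfo": {
        "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
        "platformTypes": ["ANY_PLATFORM"],
        "threatEntryTypes": ["URL"],
        "threatEntries": [{"url": TARGET_URL}],
    },
}

req = urllib.request.Request(
    f"https://safebrowsing.googleapis.com/v4/threatMatches:find?key={API_KEY}",
    data=json.dumps(body).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

# An empty response object means no threats were matched.
print(result.get("matches", "no threats found"))
```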
Updated 11/24/2020 17:11
We found 1 site with the same IP address.
The check lists the IP addresses the site has ever been hosted on, as well as other sites that share the same IP address.
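A small sketch of the lookup behind this check: resolve the domain's current IP address, then do a reverse (PTR) lookup on it. Note that finding all sites on a shared IP requires a reverse-IP database; plain DNS usually returns only one PTR hostname. The domain is a placeholder.

```python
import socket

domain = "example.com"  # placeholder domain
ip = socket.gethostbyname(domain)
print(f"{domain} resolves to {ip}")

try:
    hostname, aliases, _ = socket.gethostbyaddr(ip)
    print(f"PTR record: {hostname} (aliases: {aliases})")
except socket.herror:
    print("no reverse DNS record for this IP")
```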
Updated 11/24/2020 17:12
For promoting commercial sites, the confidentiality of information exchanged between the server and visitors is important. Encryption increases potential customers' loyalty to the resource, raises the level of trust, and affects conversion and ranking growth for almost all queries.
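A minimal sketch of verifying that a site serves a valid TLS certificate, using only the standard library; the hostname is a placeholder. A failed handshake here means browsers will warn visitors away.

```python
import socket
import ssl

hostname = "example.com"
context = ssl.create_default_context()  # verifies the chain and hostname

with socket.create_connection((hostname, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()

print("issuer:", dict(x[0] for x in cert["issuer"]))
print("expires:", cert["notAfter"])
```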
Updated 11/24/2020 17:12
Description:
Technically, a domain with www and one without it are two different resources; search engines index and rank them separately, and links to them carry separate weight. This can lead to duplicate content and link weight split between the two mirrors.
The problem is solved with a 301 redirect and by indicating the main mirror to search engines. From a website promotion point of view, a domain without www is better: it is not a third-level domain, and its URLs will always be shorter.
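A minimal sketch of gluing the mirrors together with a 301 redirect, using only the standard library. In production this is normally done in the web server configuration (nginx, Apache) rather than in application code; MAIN_HOST is a placeholder.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

MAIN_HOST = "example.com"  # the chosen main mirror (placeholder)

class RedirectWWW(BaseHTTPRequestHandler):
    def do_GET(self):
        host = self.headers.get("Host", "")
        if host.startswith("www."):
            # Permanent redirect to the same path on the non-www mirror
            self.send_response(301)
            self.send_header("Location", f"https://{MAIN_HOST}{self.path}")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"main mirror")

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectWWW).serve_forever()
```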
Updated 11/24/2020 17:12
Description:
A robots.txt file is a list of restrictions for search robots (bots) that visit a site and crawl its content. Before crawling and indexing a site, robots request the robots.txt file and read the rules in it.
The robots.txt file is located in the root directory of the site. It must be available at the URL: pr-cy.io/robots.txt
There are several reasons to use a robots.txt file on your site: it lets you keep service and duplicate pages out of the crawl, point robots to the sitemap file, and save crawl budget for the pages that matter.
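A sketch of what crawlers do with the file, using the standard library parser; the URLs are placeholders.

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetches and parses the file

# Check whether a given bot may crawl a given URL
print(rp.can_fetch("Googlebot", "https://example.com/admin/"))
print(rp.can_fetch("*", "https://example.com/blog/post-1"))
```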
Updated 11/24/2020 17:12
Description:
A Sitemap file is a file with information about the pages of the site to be indexed. With this file you can tell search engines which URLs to crawl, when each page was last modified, and how often it changes.
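A minimal sketch of generating a sitemap.xml following the sitemaps.org protocol; the URL list and lastmod dates are placeholders.

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
pages = [
    ("https://example.com/", "2020-11-24"),
    ("https://example.com/about", "2020-11-20"),
]

urlset = ET.Element("urlset", xmlns=NS)
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8",
                             xml_declaration=True)
```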
Updated 11/24/2020 17:12
Successful resource request.
For search bots to index a page successfully, the server's HTTP response code must be 200.
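A quick sketch of this check: request the page and confirm the server answers with HTTP 200. The URL is a placeholder.

```python
import urllib.request

req = urllib.request.Request("https://example.com/",
                             headers={"User-Agent": "audit-sketch"})
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # should be 200 for indexable pages
```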
Updated 11/24/2020 17:11
Loading speed directly affects behavioral factors: the faster a page loads, the fewer bounces. The Google bot visits slow sites less often, which hurts promotion: such sites are crawled and indexed more rarely. Loading speed is one of the main ranking factors in Google.
Updated 11/24/2020 17:11
Updated 11/24/2020 17:12
Found 54 errors and 71 warnings.
Description:
Error-free code is code that conforms to W3C standards. Pages with valid code are displayed correctly in the browser, which improves behavioral factors and helps them take higher positions in the search results.
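A hedged sketch of running such a validation yourself through the W3C Nu HTML Checker's JSON interface (validator.w3.org/nu). The page URL is a placeholder, and the assumption that warnings arrive as "info" messages with subType "warning" follows the checker's documented output format.

```python
import json
import urllib.request

page = urllib.request.urlopen("https://example.com/").read()

req = urllib.request.Request(
    "https://validator.w3.org/nu/?out=json",
    data=page,
    headers={"Content-Type": "text/html; charset=utf-8",
             "User-Agent": "audit-sketch"},
)
with urllib.request.urlopen(req) as resp:
    messages = json.load(resp)["messages"]

errors = [m for m in messages if m["type"] == "error"]
warnings = [m for m in messages if m.get("subType") == "warning"]
print(f"{len(errors)} errors, {len(warnings)} warnings")
```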
Updated 11/24/2020 17:12
Description:
When a non-existent page is requested, the server should return a 404 "page not found" error.
If the server is set up incorrectly and returns a 200 response code instead, the page appears to exist, and search engines may index all of the site's error pages.
Configure the site so that requests for non-existent pages return a 404 "page not found" or a 410 "page deleted" response code.
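A sketch of this check: request a deliberately non-existent URL and confirm the response code is 404 or 410 rather than 200. The domain and the random path are placeholders.

```python
import urllib.error
import urllib.request

probe = "https://example.com/definitely-not-a-real-page-9f8e7d"
try:
    resp = urllib.request.urlopen(probe)
    code = resp.status          # a 200 here signals a misconfiguration
except urllib.error.HTTPError as e:
    code = e.code               # urllib raises on 4xx/5xx responses

print("OK" if code in (404, 410) else f"misconfigured: got {code}")
```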
Updated 11/24/2020 17:12
Description:
When a non-existent page is requested, the server displays a standard 404 error page. For users' convenience, we recommend creating a custom 404 page and adding a link back to the site on it.
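A minimal sketch of serving a custom 404 page with a link back to the site, using only the standard library; paths and markup are placeholders. The key point is that the branded page still carries the correct 404 status code.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

NOT_FOUND_HTML = b"""<html><body>
<h1>Page not found</h1>
<p><a href="/">Back to the home page</a></p>
</body></html>"""

class CustomNotFound(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/":
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"home page")
        else:
            self.send_response(404)  # keep the correct status code
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(NOT_FOUND_HTML)

if __name__ == "__main__":
    HTTPServer(("", 8080), CustomNotFound).serve_forever()
```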
Updated 11/24/2020 17:12
Audit of website pages:
Description:
The optimal page size is considered to be up to 100 KB after compression. Delete unnecessary elements and use gzip compression to reduce the size.
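A sketch of measuring a page's HTML size before and after gzip compression, to compare against the ~100 KB guideline; the URL is a placeholder.

```python
import gzip
import urllib.request

html = urllib.request.urlopen("https://example.com/").read()
raw_kb = len(html) / 1024
gzip_kb = len(gzip.compress(html)) / 1024
print(f"raw: {raw_kb:.1f} KB, gzipped: {gzip_kb:.1f} KB")
```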
Updated 11/24/2020 17:12
Description:
The service measures the site's speed and analyzes the stages of the loading process. The stages correspond to LCP, CLS, FID, and other metrics included in Google's Core Web Vitals: content rendering, response time to the first user interaction, and layout shifts caused by loading elements. By optimizing these stages, you can speed up loading and make your site more user-friendly.
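You can pull the same Core Web Vitals field data from Google's PageSpeed Insights API (v5). A hedged sketch follows; the exact metric keys in the response are assumptions based on the public API documentation, and the URL is a placeholder.

```python
import json
import urllib.parse
import urllib.request

target = urllib.parse.quote("https://example.com/", safe="")
api = ("https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
       f"?url={target}&strategy=mobile")

with urllib.request.urlopen(api) as resp:
    data = json.load(resp)

metrics = data.get("loadingExperience", {}).get("metrics", {})
for name in ("LARGEST_CONTENTFUL_PAINT_MS",
             "CUMULATIVE_LAYOUT_SHIFT_SCORE",
             "FIRST_INPUT_DELAY_MS"):
    if name in metrics:
        print(name, metrics[name]["percentile"], metrics[name]["category"])
```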
Updated 11/24/2020 17:12
Description:
Google is moving sites to mobile-first indexing, that is, it primarily evaluates the mobile version of the site. Loading speed should meet the standards on any device. The service analyzes the loading stages and compares the indicators against the Core Web Vitals thresholds, so the webmaster can work on each stage and thereby improve overall speed.
Updated 11/24/2020 17:12
We count all the elements of the page: images, videos, scripts, etc. For a page to load quickly, Google recommends that the total weight of its elements not exceed 1600 KB. Optimize resource sizes: use text compression, minify HTML, JS, and CSS, use WebP instead of JPEG, and enable data caching.
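A rough sketch of this count: parse the page for resource URLs and sum their transfer sizes against the 1600 KB guideline. It only looks at img/script/link tags and ignores dynamically loaded resources; the base URL is a placeholder.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
import urllib.request

BASE = "https://example.com/"  # placeholder

class ResourceCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.urls = []
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script") and attrs.get("src"):
            self.urls.append(urljoin(BASE, attrs["src"]))
        elif tag == "link" and attrs.get("href"):
            self.urls.append(urljoin(BASE, attrs["href"]))

html = urllib.request.urlopen(BASE).read().decode("utf-8", "replace")
collector = ResourceCollector()
collector.feed(html)

total = len(html.encode())
for url in collector.urls:
    try:
        total += len(urllib.request.urlopen(url).read())
    except Exception:
        pass  # skip unreachable resources in this rough sketch

print(f"total weight: {total / 1024:.0f} KB (guideline: 1600 KB)")
```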
Description:
Plug-ins help the browser process special content such as Flash, Silverlight, or Java. Most mobile devices do not support plug-ins, and plug-ins are a frequent source of errors and security vulnerabilities in the browsers that do support them. For this reason, many browsers restrict or disable them.
Updated 11/24/2020 17:12
Description:
No viewport is specified for the pages. This means that mobile devices will try to display them as on a PC, scaling them down in proportion to the screen size. Specify the viewport tag so your site displays correctly on all devices.
The viewport determines how a web page is displayed on a mobile device. If it is not specified, the page width is assumed to be the PC standard and the page is shrunk to fit the screen. With the viewport, you can control the page width and its scaling on different devices.
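A small sketch of this check: fetch the page and look for a viewport meta tag; the URL is a placeholder. A common value is "width=device-width, initial-scale=1".

```python
from html.parser import HTMLParser
import urllib.request

class ViewportFinder(HTMLParser):
    viewport = None
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "viewport":
            self.viewport = attrs.get("content")

html = urllib.request.urlopen("https://example.com/").read().decode(
    "utf-8", "replace")
finder = ViewportFinder()
finder.feed(html)
print(finder.viewport or "no viewport meta tag found")
```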
Updated 11/24/2020 17:12
Many web servers can compress files in GZIP format before sending them, using a built-in routine or third-party modules. This allows the resources needed to display the website to load faster.
Compressing resources with gzip or deflate reduces the amount of data transferred over the network.
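A sketch of checking whether the server actually compresses responses: advertise gzip support and inspect the Content-Encoding header (urllib does not decompress automatically, so the header is visible as sent). The URL is a placeholder.

```python
import urllib.request

req = urllib.request.Request(
    "https://example.com/",
    headers={"Accept-Encoding": "gzip", "User-Agent": "audit-sketch"},
)
with urllib.request.urlopen(req) as resp:
    encoding = resp.headers.get("Content-Encoding")

print(encoding or "server did not compress the response")
```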
Updated 11/24/2020 17:12
Try to reduce image sizes to a minimum; this will speed up resource loading. The right format and compression can reduce image size. Perform basic and advanced optimization on all images. Basic optimization includes cropping unnecessary margins, reducing color depth to the minimum acceptable value, removing comments, and saving images in an appropriate format. It can be done in any image editor.
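A hedged sketch of basic image optimization with the Pillow library (an external dependency): re-save a JPEG with a lower quality setting and optimization enabled, and convert to WebP, which is usually smaller still. File names are placeholders.

```python
from PIL import Image

img = Image.open("photo.jpg")
img.save("photo-optimized.jpg", "JPEG", quality=80, optimize=True)

# WebP typically beats JPEG at the same visual quality
img.save("photo.webp", "WEBP", quality=80)
```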
Users of PCs and mobile devices are used to scrolling websites vertically, not horizontally. If viewing the content requires horizontal scrolling or zooming out, it causes inconvenience.
When developing a mobile site with a meta viewport tag, position the content so that it fits into the specified viewport. For example, if an image is wider than the viewport, horizontal scrolling may be required. To avoid this, adjust the content so that it fits entirely.
Updated 11/24/2020 17:12
Website design for mobile phones solves two problems: it gives users a comfortable view of the site on any device and has a positive effect on the site's search ranking.
Check that your site displays correctly on mobile devices.
Updated 11/24/2020 17:12
One of the most common problems with reading sites on mobile devices is a font size that is too small. Having to constantly zoom the page to read small text is very annoying. Even if a site has a mobile version or an adaptive design, poor readability due to small fonts is not uncommon.
Use legible font sizes to make your site more convenient.
Updated 11/24/2020 17:12
Thanks to the cache, users revisiting your website spend less time loading pages. Caching headers should be applied to all cacheable static resources.
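A sketch of inspecting the caching headers on a static resource; the URL is a placeholder.

```python
import urllib.request

req = urllib.request.Request("https://example.com/static/style.css",
                             headers={"User-Agent": "audit-sketch"})
with urllib.request.urlopen(req) as resp:
    for header in ("Cache-Control", "Expires", "ETag", "Last-Modified"):
        print(header, ":", resp.headers.get(header))
```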
Updated 11/24/2020 17:12
Description:
The resource size can be reduced by removing unnecessary page elements such as extra spaces, line breaks, and indentation. By minifying HTML, CSS, and JavaScript, you can speed up page loading, parsing, and rendering.
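A deliberately naive sketch of whitespace minification for CSS; real projects should use a dedicated minifier, since regexes like these can break edge cases (for example, whitespace inside quoted strings).

```python
import re

def naive_minify_css(css: str) -> str:
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # strip comments
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)     # trim around syntax
    return css.strip()

print(naive_minify_css("body {\n  color: #333;  /* text */\n}"))
# prints: body{color:#333;}
```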
Updated 11/24/2020 17:12