- 34 successful tests
- 1 error
- 7 warnings
Create a project and optimize the website
Try the full version of PR-CY: find errors on internal pages and fix them with the help of the service's tips.
- Overview of your overall SEO health
- Comparison with competitors
- Daily rank tracking
- SSL certificate will expire in 24 days (01/10/2023)
- SSL certificate renewed (01/02/2023)
There is no way to know exactly how many pages of a site Google has indexed: the search engine does not publish its index as a per-URL database.
An approximate count of pages in the search results is given by the `site:` operator, which this check relies on. The number can be distorted by pages that are blocked in robots.txt but still appear in the search results because of external links pointing to them.
The "Indexing" report in Google Search Console shows a somewhat more accurate number, but that data can also be distorted by applied filters.
Google scans websites to find infected resources, phishing pages, and other issues that degrade the search results and user experience. With this information, the search engine warns users about unsafe sites. If a site is deemed dangerous, Google might downgrade or remove it.
Statistics systems
Statistics systems installed on a site track traffic, bounces, viewing depth, and many other metrics. They help measure the effectiveness of promotion and advertising campaigns.
We found 1 site with the same tag.
The test shows active and previously removed statistics counters, as well as the sites related to them. This information can be useful if a competitor has projects unknown to you whose statistics are managed from a single account: you can discover them this way.
It also helps you notice if something goes wrong with your own counters.
Web studios sometimes install counters on clients' sites and manage them from the same account as their own site's counter. By analyzing the studio's website, this test can reveal who its clients are.
Don't forget to renew your domain name; it is best to enable auto-renewal with your registrar. Once the registration period ends, you risk losing access to the domain.
For promoting commercial sites, confidentiality of the information exchanged between the server and visitors is important. It increases the loyalty of potential customers to the resource, raises the level of trust, and affects conversion and ranking growth for almost all queries.
Technically, domains with www and without www are two different resources; search engines index and rank them separately, and links will have different weights. This can lead to:
- A drop in positions in the search results;
- A filter, because a search engine may mistake one site for a duplicate of the other.
The problem is solved with a 301 redirect and by indicating the main mirror to the search engines. From a promotion standpoint, the domain without www is preferable: it is not a third-level domain, and it is always shorter.
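As a minimal sketch of such a redirect for nginx (the domain name is a placeholder; adjust it and the listen ports to your setup):

```nginx
# Redirect the www mirror to the main domain with a permanent 301
server {
    listen 80;
    server_name www.example.com;
    return 301 https://example.com$request_uri;
}
```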
Due to incorrect encoding, site content may be displayed incorrectly. Besides annoying visitors, the site may not be indexed or may fall under a search engine filter. We recommend using UTF-8 encoding so that the text on the site's pages displays correctly. Some CMSs, such as WordPress, write files in this encoding, and AJAX also works only with UTF-8.
Don't forget to include the encoding in your meta tags: <meta charset="UTF-8" />
A robots.txt file is a list of restrictions for search robots, or bots, that visit a site and crawl its content. Before crawling and indexing your site, robots check the robots.txt file for rules.
The robots.txt file is located in the root directory of the site. It must be available at the URL: pr-cy.io/robots.txt
There are several reasons to use a robots.txt file on your site:
- to remove duplicate content;
- to hide unwanted information;
- to limit the indexing speed.
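A minimal robots.txt covering these three uses might look as follows (the paths and domain are placeholders; note that Google ignores the Crawl-delay directive):

```
User-agent: *
Disallow: /search/       # hide duplicate content (internal search results)
Disallow: /admin/        # hide unwanted information
Crawl-delay: 5           # limit crawl speed (not honored by Google)
Sitemap: https://example.com/sitemap.xml
```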
At least one sitemap has been found and is available.
A Sitemap is a file with information about the pages of the site to be indexed. With this file you can:
- tell search engines which pages of your site should be indexed;
- tell them how often the information on the pages is updated;
- indicate which pages are most important to index.
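A minimal sitemap.xml in the Sitemaps protocol format illustrates all three points (the URL and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>          <!-- page to index -->
    <lastmod>2023-01-02</lastmod>            <!-- when it last changed -->
    <changefreq>weekly</changefreq>          <!-- how often it updates -->
    <priority>1.0</priority>                 <!-- relative importance -->
  </url>
</urlset>
```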
- http://winitpro.ru — 301 Moved Permanently
- https://winitpro.ru/ — 200 OK
- Successful resource request.
For successful indexing of the page by search bots, the HTTP response code of the server must be 200.
- [Checking the server response of the internal pages of the site](/tools/http-response/)
- List of status codes
Page load speed directly affects behavioral factors: the faster the page loads, the fewer bounces. The Google robot visits slow sites less often, which hurts the effectiveness of promotion: such sites are indexed less frequently. Load speed is one of the main ranking factors in Google.
We found 4 sites with the same IP address.
IP addresses ever detected for the site, as well as other sites sharing the same IP address.
Found 9 errors and 24 warnings.
Error-free code is code that conforms to the W3C standards. Pages with valid code display correctly in the browser, which means they have good behavioral factors and take higher positions in the search results.
- W3C validator — checks pages for code errors.
When a non-existent page is requested, the server should return a 404 ("page not found") error.
If the server is set up incorrectly and returns a 200 code instead, the page is treated as existing. In this case, search engines may index all of the site's error pages.
Set up the site so that requests for non-existent pages return a 404 ("page not found") or 410 ("page deleted") response code.
When a non-existent page is requested, the server shows a standard 404 error page. For users' convenience, we recommend creating a custom 404 page and adding a link back to the site on it.
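For nginx, a custom error page that still returns the correct status codes can be sketched like this (file name and path are placeholders):

```nginx
# Serve a custom page while keeping the 404 status code
error_page 404 /404.html;
location = /404.html {
    internal;            # the page cannot be requested directly
}

# Return 410 for content that was deliberately deleted
location /old-article/ {
    return 410;
}
```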
Sign up to see the full report.
Website scan not started.
Audit of website pages:
- page response errors;
- broken links;
- meta tag problems.
Windows для системных администраторов | WinITPro.ru
Length: 51 characters. Found: 1 tag.
A webpage title is the headline of the page in search results and one of the main relevance signals considered by search engines. The title should contain keywords but does not have to contain the website name, since the crawler (search robot) already knows the domain.
Заметки системного администратора. Рабочие практические инструкции по Windows Server, VMWare, Linux. Советы и множество руководств по администрированию. Заметки
Length: 160 characters. Found: 1 tag.
Description is a meta tag used to describe a page to search engine crawlers; users do not see it. It should describe the content of the page accurately, because search engines often use the Description text to compose a snippet. Place the keywords closer to the beginning, and do not repeat the text in other parts of the page. In the page code, the tag is placed between <head> and </head>. The optimal Description length is 150-300 characters for Google and up to 160 characters for Yandex. It does not influence SEO directly, but CTR depends on a good description.
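In the page code the tag looks like this (the text is an illustrative placeholder):

```html
<head>
  <meta name="description"
        content="System administrator notes: practical guides for Windows Server, VMWare, and Linux." />
</head>
```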
The H1-H6 headings define the structure of the page's content. Highlight them in the layout to help the reader navigate the text. Headings are important for search engine promotion because search engines use them to determine what is on the page and how relevant it is. Arrange headings according to the hierarchy and do not place links inside them.
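A correct hierarchy nests each level under the previous one, with a single H1 per page (the headings here are placeholders):

```html
<h1>Page topic — one H1 per page</h1>
  <h2>First section</h2>
    <h3>Subsection of the first section</h3>
  <h2>Second section</h2>
```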
For search engines, content relevance is more important than text length. Choose the amount of text depending on the topic and purpose, and look to competitors' materials for guidance. The optimal text length is 1000-2000 words for two or three promoted keywords/phrases.
Keyword density is one of the qualitative indicators of a text: it shows how frequently words are repeated in a document, calculated as the share of keyword occurrences in the total word count. For example, a 500-word text in which a keyword appears 20 times has a density of 4%.
A keyword density of 8% is considered high. Such texts are often hard to read and look spammy, and the pages hosting them have a high bounce rate. A site with many high-density texts may receive sanctions from search engines.
Normal keyword density is 4-6%; almost all classical literature falls within this range.
The optimal page size is considered to be up to 100 KB after compression. Delete unnecessary elements and use gzip compression to reduce the size.
External links are links from your site to other sites. Try not to link to resources with incorrect information or topics unrelated to yours; choose useful and authoritative ones. Do not place too many outgoing external links, and do not put them on your homepage. Selling links negatively affects promotion.
With internal links you can redistribute weight between individual pages of the resource by linking to more significant sections or articles. This weight redistribution is called interlinking and is used as part of internal site optimization.
Internal links also influence behavioral factors: they simplify navigation and help the user reach the needed section faster.
The service searches sites for words that can be classified as pornography. Search engines fight the display of 18+ content, so they remove sites with pornographic materials from the rankings for other queries.
Even if you have not posted such materials on the site, they may appear as a result of a hacked site or in the comments.
What to do:
- do not use words, images and videos of a pornographic nature;
- check the ads on your site and switch off 18+ ads if they are shown by default;
- check users’ reviews and comments;
- if you link to another site, make sure that there is no adult content and links to such sites.
Open Graph was developed by Facebook specialists so that links to sites within the social network display nicely and informatively. Open Graph is now supported by many social networks and messengers: Facebook, Twitter, Telegram, and Skype.
Why use Open Graph?
- so the user sees relevant text and an image in the link preview;
- to improve the site's behavioral factors: a properly designed link gets more clicks;
- to make the link snippet look like a standalone post on a personal page or in a community, with no need to add a description and pictures manually.
To get an attractive site snippet, insert the Open Graph meta tags into the <head> section of the page code.
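A typical set of Open Graph tags looks like this (all URLs and text values are placeholders):

```html
<head>
  <meta property="og:type" content="article" />
  <meta property="og:title" content="Page title shown in the snippet" />
  <meta property="og:description" content="Short text shown under the link" />
  <meta property="og:image" content="https://example.com/preview.jpg" />
  <meta property="og:url" content="https://example.com/article/" />
</head>
```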
Micro-markup is the semantic markup of website pages that structures data. It is based on injecting special attributes into the HTML code of the document.
Schema.org is a widely adopted standard supported by the most popular search engines, such as Google, Yandex, Yahoo, and Bing.
Pros of micro-markup:
- The logical structure of the information on the page helps search engines retrieve and process data.
- Enhanced snippets on the search results page improve click-through rates.
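One common way to add Schema.org markup is a JSON-LD block in the page code; a minimal sketch for an article (names and dates are illustrative placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article title",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2023-01-02"
}
</script>
```

Schema.org also supports inline Microdata and RDFa attributes in the HTML; JSON-LD is simply the easiest variant to add without touching the layout.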
To make your site stand out, use a favicon: a special image that appears next to your site's address in the search results and in the browser address bar.
To make browsers show your site’s icon, put it into your site’s root folder. You can assign different icons to individual pages.
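Besides placing favicon.ico in the root folder, you can declare icons explicitly in the page head (file names here are conventional placeholders):

```html
<link rel="icon" href="/favicon.ico" sizes="32x32" />
<link rel="icon" href="/favicon.svg" type="image/svg+xml" />
```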
Search engines eagerly index tweets, including the links in them, even indirect ones (for example, through URL shorteners such as goo.gl). Twitter is crawled by fast robots. For a link from Twitter to affect a website's promotion in Google, it must be indexed by the search engine.
Twitter helps to promote the site and speeds up its indexing.
Search engines index links from Facebook. The most useful are likes of the company's own social page: the more likes the Facebook page of the promoted site gets, the more links the site will receive from the pages of social network users. The most valuable are likes from authoritative accounts and popular pages. Shares of your Facebook page are similar in effect to backlinks.
The service does not detect Facebook groups or profiles. It is important to put a link to your business page on the social network, and a link to your site should also be on that social network page.
Telegram has all the features of a social network to promote companies. It allows you to create channels analogous to groups and communities, share content on behalf of the company, communicate with brand representatives, develop the channel, and attract more subscribers.
The service searches for Telegram channels and groups links on a website. It will find them if the links are listed correctly - for example, https://t.me/prcynews. If you don't see your Telegram channel in the results, then the link may be broken.
Google PageSpeed Insights
The service measures site speed and analyzes the stages of the loading process. The stages correspond to metrics such as LCP, CLS, and FID, which are part of Google's Core Web Vitals: content rendering, response time to the first user action, and layout shift caused by loading elements. By optimizing these stages you can speed up loading and make the site more user-friendly.
Google is moving sites to mobile-first indexing, meaning it primarily evaluates the mobile version of a site. Load speed should meet the standards on any device. The service analyzes the loading stages and compares the metrics against the Core Web Vitals, so the webmaster can work on each stage and thereby improve overall speed.
Plug-ins help the browser process special content such as Flash, Silverlight, or Java. Most mobile devices do not support plug-ins, and plug-ins cause many errors and security issues in the browsers that do support them. For this reason, many browsers restrict plug-ins.
No viewport is specified. This means mobile devices will try to render the page as on a PC, scaling it down in proportion to the screen size. Specify the viewport tag so your site displays correctly on all devices.
The viewport determines how a web page is displayed on a mobile device. If it is not specified, the page width is assumed to be the PC standard and the page is shrunk to fit the screen. With the viewport you can control the page width and its scaling on different devices.
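The standard tag for a responsive page is placed in the head (this is the common default; adjust the scale values if your design requires it):

```html
<meta name="viewport" content="width=device-width, initial-scale=1" />
```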
Many web servers can compress files in GZIP format before sending them, using a built-in routine or third-party modules. This allows the resources needed to display the website to load faster.
Compressing resources with gzip or deflate reduces the amount of data transferred over the network.
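For nginx, a minimal gzip configuration might look like this (the type list and thresholds are illustrative choices, not required values):

```nginx
gzip on;
gzip_comp_level 5;           # balance between CPU cost and ratio
gzip_min_length 256;         # skip tiny files where gzip adds overhead
gzip_types text/css application/javascript application/json image/svg+xml;
```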
Try to reduce image sizes to a minimum; this will speed up resource loading. The right format and compression can reduce image size. Perform basic and advanced optimization on all images. As part of basic optimization, crop unnecessary margins, reduce the color depth to the minimum acceptable value, remove comments, and save images in an appropriate format. Basic optimization can be done with any image editing software.
Users of PCs and mobile devices are used to scrolling websites vertically, not horizontally. If viewing the content requires scrolling the site horizontally or zooming out, it causes inconvenience.
When developing a mobile site with the meta viewport tag, position the content so that it fits into the specified viewport. For example, if an image is wider than the viewport, horizontal scrolling may be required. To avoid this, adjust the content so that it fits entirely.
Website design for mobile phones solves two problems: it provides users with a comfortable viewing of the site from any device and has a positive effect on the search ranking of the site.
Check that your site displays correctly on mobile devices.
One of the most common problems with reading sites on mobile devices is a font size that is too small. Having to constantly zoom the page to read small text is very annoying. Even on sites with a mobile version or adaptive design, poor readability caused by small fonts is not uncommon.
Use legible font sizes to make your site more convenient.
Thanks to caching, users revisiting your website spend less time loading pages. Caching headers should apply to all cacheable static resources.
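A sketch of such headers for nginx (the extension list and 30-day lifetime are example choices; pick a lifetime that matches how often your static files change):

```nginx
# Let browsers cache static resources for 30 days
location ~* \.(css|js|png|jpg|svg|woff2)$ {
    expires 30d;
    add_header Cache-Control "public";
}
```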