Try the full version of PR-CY: find errors on internal pages and fix them with the help of the service's tips.
It is impossible to know exactly how many pages of a site Google has correctly indexed: the search engine does not provide a per-URL database.
The approximate number of pages in the search results is shown by the site: operator, which we rely on. The number can be distorted by pages that are blocked from indexing in robots.txt but appear in the results because of external links pointing to them.
The “Indexing Status” section in Google Search Console shows a somewhat more accurate number, but this data can also be distorted by applied filters.
Google scans websites to find infected resources, phishing pages, and other issues that degrade the search results and user experience. With this information, the search engine warns users about unsafe sites. If a site is deemed dangerous, Google might downgrade or remove it.
Site statistics systems
Statistics systems installed on a site track traffic, bounces, browsing depth, and many other metrics. They help measure the effectiveness of promotion and advertising campaigns.
For promoting commercial sites, the confidentiality of information exchanged between the server and visitors is important. It increases the loyalty and trust of potential customers and affects conversion and ranking growth for almost all queries.
Technically, domains with and without www are two different resources: search engines index and rank them separately, and links to them carry different weight. This can lead to:
The problem is solved with a 301 redirect and by telling search engines which mirror is the main one. From a promotion standpoint, a domain without www is preferable: it is not a third-level domain, and it is always shorter.
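As a sketch, such a 301 redirect can be set up in Apache's .htaccess (example.com is a placeholder for your own domain):

```apache
# .htaccess: permanently redirect www.example.com to example.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ https://%1%{REQUEST_URI} [R=301,L]
```

After the redirect is in place, search engines gradually consolidate link weight on the non-www mirror.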
Due to incorrect encoding, site content may be displayed incorrectly. Besides annoying visitors, this can prevent the site from being indexed or put it under a search engine filter. We recommend using UTF-8 encoding so the text on the site's pages displays correctly. Some CMSs, such as WordPress, write files in this encoding by default, and AJAX also supports only UTF-8.
Don't forget to declare the encoding in a meta tag: <meta charset="UTF-8" />
A robots.txt file is a list of rules for search robots (bots) that visit a site and crawl its content. Before crawling and indexing the site, robots check robots.txt for these rules.
The robots.txt file is located in the root directory of the site. It must be available at the URL: pr-cy.io/robots.txt
There are several reasons to use a robots.txt file on your site:
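A minimal robots.txt might look like this (the disallowed paths here are hypothetical examples, not rules from any particular site):

```txt
User-agent: *
Disallow: /admin/
Disallow: /search/

Sitemap: https://example.com/sitemap.xml
```

The `User-agent: *` block applies to all robots; the Sitemap line tells crawlers where to find the sitemap file.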
A Sitemap file is a file with information about the pages on the site to be indexed. With this file you can:
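A minimal XML sitemap, with placeholder URLs and dates, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Only `loc` is required; `lastmod`, `changefreq`, and `priority` are optional hints to crawlers.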
Successful resource request.
For a page to be successfully indexed by search bots, the server's HTTP response code must be 200.
Load speed directly affects behavioral factors: the faster a page loads, the fewer bounces. Googlebot visits slow sites less often, which hurts promotion: such sites are indexed more rarely. Load speed is one of the main ranking factors in Google.
Found 189 errors and 6 warnings.
Error-free code is code that conforms to the W3C standards. Pages with valid code display correctly in the browser, which means good behavioral factors and higher positions in the search results.
When a non-existent page is requested, the server should return a 404 ("page not found") error.
If the server is misconfigured and returns a 200 code instead, the page appears to exist. In that case, search engines may index all of the site's error pages.
Configure the site so that requests for non-existent pages return a 404 ("page not found") or 410 ("page deleted") response code.
When a non-existent page is requested, the server currently displays a standard 404 error page. For users' convenience, we recommend creating a custom 404 page and adding a link back to the site on it.
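On Apache, for example, a custom 404 page can be configured in .htaccess (the path /404.html is a hypothetical example):

```apache
# Serve a custom error page while still returning the 404 status code
ErrorDocument 404 /404.html
```

The server keeps returning the 404 status, so search engines do not index the page, but visitors see your custom page instead of the default one.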
Audit of website pages:
Мониторинг серверов Counter Strike 1.6 | Раскрутка cs 1.6
Length: 57 characters.
A webpage title is the headline of the page in search results and one of the main relevance indicators considered by search engines. The title must contain keywords but need not contain the website name, since the crawler (search robot) already knows the domain.
Мониторинг серверов Counter Strike 1.6 | Раскрутка cs 1.6
Length: 57 characters.
Description is a tag used to describe a page for the search engine crawler; users do not see it. It should accurately describe the page's content, because search engines often use the Description text to compose a snippet. Place keywords toward the beginning, and do not repeat the text elsewhere on the page. In the page code, the tag is placed between <head> and </head>. The optimal Description length is 150-300 characters for Google and up to 160 characters for Yandex. It does not influence SEO directly, but CTR depends on a good description.
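A hypothetical example of the tag, with wording based on the audited page's topic:

```html
<head>
  <!-- Keywords near the beginning; unique per page; ~150-300 characters -->
  <meta name="description"
        content="Counter Strike 1.6 server monitoring: live player counts, maps, and rankings for CS 1.6 servers, with tools for promoting your own server.">
</head>
```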
The H1-H6 headers define the structure of the page's content. Mark them up in the layout to help readers navigate the text. Headings matter for search engine promotion because search engines use them to determine what is on the page and how relevant it is. Arrange headings hierarchically and do not place links inside them.
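A sketch of a correct hierarchy: one H1 per page, H2/H3 nested beneath it, and no links inside the headings (the heading texts are illustrative):

```html
<h1>Counter Strike 1.6 server monitoring</h1>
  <h2>How server rankings work</h2>
    <h3>Player statistics</h3>
  <h2>Adding your server</h2>
```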
For search engines, the relevance of the content matters more than the length of the text. Choose the amount of text based on the topic and purpose, and look to competitors' materials as a guide. The optimal text length is 1000-2000 words for two or three promoted keywords/phrases.
Keyword density is one of the qualitative indicators of a text: it shows how often words repeat in a document, calculated as the proportion of repeated words to the total volume of the text.
A keyword density of 8% is considered high. Such texts are spammy and often hard to read, and the pages hosting them show a high bounce rate. A site with many high-density texts may receive sanctions from search engines.
A normal keyword density is 4-6%; almost all classical literature falls within this range.
The optimal page size is considered to be up to 100 KB after compression. Remove unnecessary elements and use gzip compression to reduce it.
63 links, of which are indexed
External links are links from your site to other sites. Try not to link to resources with inaccurate information or unrelated topics; choose useful, authoritative ones. Do not place too many outgoing external links, and do not put them on your homepage. Selling links negatively affects promotion.
101 links, of which are indexed
With internal links, you can redistribute weight between individual pages of the site by pointing to more important sections or articles. This weight redistribution is called internal linking and is part of on-site optimization.
Internal links also improve behavioral factors: they simplify navigation and help users reach the needed section faster.
Open Graph was developed by Facebook so that links to sites shared within the social network display nicely and informatively. Open Graph is now supported by many social networks and messengers, including Facebook, Twitter, Telegram, and Skype.
Why use Open Graph?
Insert the Open Graph meta tags into the <head> tag of the page code to get an attractive site snippet.
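A sketch of the basic tags, with illustrative values:

```html
<head>
  <!-- og:title, og:description, og:image, and og:url control how the
       link is previewed when shared on social networks -->
  <meta property="og:title" content="Counter Strike 1.6 server monitoring">
  <meta property="og:description" content="Live rankings of CS 1.6 servers.">
  <meta property="og:image" content="https://example.com/preview.png">
  <meta property="og:url" content="https://example.com/">
</head>
```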
Micro-markup is semantic markup of website pages that structures their data. It works by adding special attributes to the document's HTML code.
Schema.org is a globally recognized standard supported by the most popular search engines, including Google, Yandex, Yahoo, and Bing.
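One common way to add Schema.org markup is a JSON-LD block in the page code; all values below are illustrative:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "name": "Counter Strike 1.6 server monitoring",
  "url": "https://example.com/"
}
</script>
```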
Pros of micro-markup:
To make your site stand out, use a favicon: a special image that appears next to your site's address in search results and in the browser's address bar.
To make browsers show your site’s icon, put it into your site’s root folder. You can assign different icons to individual pages.
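A favicon.ico in the site root is usually found automatically; the link tags below (with hypothetical file paths) let you point to other formats or to per-page icons:

```html
<link rel="icon" href="/favicon.ico" sizes="any">
<link rel="icon" href="/icon.svg" type="image/svg+xml">
```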
Search engines readily index tweets. Links in tweets are also indexed, including indirect ones (for example, through shorteners such as goo.gl). Moreover, Twitter is crawled by fast-indexing robots. For a link from Twitter to affect website promotion in Google, it must be indexed by the search engines.
Twitter helps to promote the site and speeds up its indexing.
Search engines index links from Facebook. The most useful are likes of the company's own page: the more likes the Facebook page of the promoted site gets, the more links the site receives from users' pages. Likes from authoritative accounts and popular pages are the most valuable. Shares of your Facebook page have an effect similar to backlinks.
The system does not detect Facebook groups or profiles. It is important to specify a link to a business page on the social network, and a link to your site should also be present on that page.
Plug-ins help the browser process special content such as Flash, Silverlight, or Java. Most mobile devices do not support plug-ins, and in browsers that do, plug-ins cause many errors and security vulnerabilities. For this reason, many browsers restrict their use.
No viewport is specified for the page. This means mobile devices will try to display it as on a PC, scaling it down in proportion to the screen size. Specify the viewport tag so your site displays correctly on all devices.
The viewport determines how a web page is displayed on a mobile device. If not specified, the page width is assumed to be the PC standard and is reduced to fit on the screen. Thanks to the viewport, you can control the page width and its scaling on different devices.
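The standard viewport declaration goes in the page's <head>:

```html
<!-- Match the page width to the device width and start at 1:1 zoom -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```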
Many web servers can compress files in GZIP format before sending them, using built-in routines or third-party modules. This allows the resources needed to display the website to load faster.
Compressing resources with gzip or deflate reduces the amount of data transferred over the network.
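As a sketch, gzip compression can be enabled in nginx like this (the MIME-type list is an illustrative choice, not a requirement):

```nginx
gzip on;
gzip_comp_level 5;
gzip_min_length 256;
gzip_types text/css application/javascript application/json image/svg+xml;
# text/html is compressed by default once gzip is on
```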
Reduce image sizes to a minimum to speed up resource loading. The right format and compression settings can shrink images considerably. Perform basic and advanced optimization on all images. Basic optimization includes cropping unnecessary margins, reducing color depth to the minimum acceptable value, removing comments, and saving images in an appropriate format. It can be done in any image editing software.
Users of PCs and mobile devices are used to scrolling websites vertically, not horizontally. If viewing the content requires horizontal scrolling or zooming out, it causes inconvenience.
When developing a mobile site with a viewport meta tag, position the content so that it fits into the specified viewport. For example, if an image is wider than the viewport, horizontal scrolling may be required. To avoid this, adjust the content so it fits entirely.
Website design for mobile phones solves two problems: it lets users comfortably view the site from any device, and it has a positive effect on the site's search ranking.
Check that your site displays correctly on mobile devices.
One of the most common problems when reading sites on mobile devices is a font size that is too small. Having to constantly zoom the page to read small text is very annoying. Even on sites with a mobile version or responsive design, poor readability due to small fonts is not uncommon.
Use legible font sizes to make your site more convenient.
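A minimal CSS sketch of legible sizing (the specific values are common conventions, not fixed rules):

```css
/* 16px base with relative sizes elsewhere keeps text readable
   on small screens without zooming */
html {
  font-size: 16px;
}
body {
  font-size: 1rem;
  line-height: 1.5;
}
small {
  font-size: 0.875rem; /* avoid going much below ~12px on mobile */
}
```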
Thanks to caching, users revisiting your website spend less time loading pages. Caching headers should apply to all cacheable static resources.
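For example, in nginx, long-lived caching headers for static resources might be set like this (the one-year lifetime and file-extension list are illustrative choices):

```nginx
location ~* \.(css|js|png|jpg|svg|woff2)$ {
  expires 1y;  # sets Expires and Cache-Control: max-age
  add_header Cache-Control "public, immutable";
}
```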