
Updated 5 years ago

Domain score

  • 24 successful tests
  • 3 errors
  • 7 warnings


Latest events

  • WWW-redirect for the domain started working.
  • Sitemap found.

Site-level parameters

Google Indexed
Historical data found for this test.
Changes recorded in history over 5 months. First date: Mar 2014.


It is impossible to know exactly how many pages Google has indexed on the site: the search engine does not expose a per-URL database.

An approximate number of pages in the search results is given by the site: operator, which this check relies on. The number can be inflated by pages that are banned from indexing in robots.txt but still appear in the results because of external links pointing to them.

The “Indexing Status” section in Google Search Console shows a slightly more accurate number, but that data can also be distorted by applied filters.

Updated 07/03/2018 11:44


≈ 1 per day
Data received from a Yandex.Metrika counter

Technical SEO

The robots.txt file was found and the site is allowed for indexing, but the home page is disallowed for the following bots:

  • ahrefsbot
  • semrushbot
  • dotbot
  • aipbot
  • ia_archiver
  • and 99 more bots


    A robots.txt file is a list of restrictions for search robots (bots) that visit a site and crawl its content. Before crawling and indexing your site, robots request the robots.txt file and look for rules.

    The robots.txt file is located in the root directory of the site and must be available at the path /robots.txt.

    There are several reasons to use a robots.txt file on your site:

    1. to remove duplicate content;
    2. to hide unwanted information;
    3. to limit the crawl rate.
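    As an illustration, a minimal robots.txt covering all three cases might look like this (the paths and delay value are hypothetical):

    ```text
    # Block duplicate and service content for all bots
    User-agent: *
    Disallow: /search/        # duplicate listings
    Disallow: /admin/         # unwanted internal pages
    Crawl-delay: 10           # limit crawl rate (not honored by every bot)

    Sitemap: https://example.com/sitemap.xml
    ```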
    Updated 07/03/2018 11:44
    XML sitemap

    At least one sitemap has been found and is available.


    A Sitemap is a file with information about the pages of the site to be indexed. With this file you can:

    • tell search engines which pages of your site need to be indexed;
    • indicate how often the information on the pages is updated;
    • show which pages are most important to index.
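    For example, a sitemap entry that carries all three hints could look like this (the URL and dates are placeholders):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2018-07-03</lastmod>   <!-- when the page last changed -->
        <changefreq>daily</changefreq>  <!-- how often it is updated -->
        <priority>1.0</priority>        <!-- relative importance, 0.0-1.0 -->
      </url>
    </urlset>
    ```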


    Updated 07/03/2018 11:44
    The site is not accessible via HTTPS.

    For promoting commercial sites, confidentiality of the information exchanged between the server and visitors is important. HTTPS increases visitors' loyalty and trust in the resource and affects conversion and ranking growth for almost all queries.
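    Assuming an nginx server with a certificate already installed, a common way to enforce HTTPS is a 301 redirect in the plain-HTTP server block (the domain is a placeholder):

    ```nginx
    server {
        listen 80;
        server_name example.com www.example.com;
        # Permanently redirect all HTTP traffic to HTTPS
        return 301 https://example.com$request_uri;
    }
    ```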

    Updated 04/24/2018 16:05
    WWW redirect
    Redirection is configured.


    Technically, domains with www and without www are two different resources; search engines index and rank them separately, and links to them carry separate weight. This can lead to:

    • lower positions in search results;
    • a duplicate-content filter, because a search engine may mistake one site for a copy of the other;
    • problems with authorization on the site and other functionality that relies on cookies.

    The problem is solved by a 301 redirect and by indicating the main mirror to the search engines. From a promotion point of view, a domain without www is preferable: it is not a third-level domain, and its URLs are always shorter.
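    On nginx, such a redirect to the non-www mirror can be sketched like this (the domain is a placeholder):

    ```nginx
    server {
        listen 80;
        server_name www.example.com;
        # 301: the non-www domain is the main mirror
        return 301 http://example.com$request_uri;
    }
    ```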

    Updated 07/03/2018 11:44
    HTML loading speed
    2.23 sec - faster than 13% of tested sites.

    Loading speed directly affects behavioral factors: the faster the page loads, the fewer the bounces. The Google robot visits slow sites less often, which hurts promotion: such sites are crawled and indexed less frequently. Loading speed is one of the main Google ranking factors.

    Updated 07/03/2018 11:44
    W3C validator

    Found 2 errors and 5 warnings.


    Error-free code is code that conforms to the W3C standards. Pages with valid code are displayed correctly in the browser, which means they have good behavioral factors and rank higher in search results.


    Updated 07/03/2018 11:44
    404 page response code
    Great, received code 404.


    When a non-existent page is requested, the server should return a 404 error, i.e. "page not found."

    If the server is misconfigured and returns code 200, the page is treated as existing. In this case, search engines may index all of the site's error pages.

    Set up the site so that requests for non-existent pages return a 404 "page not found" or a 410 "page deleted" response code.
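    On nginx, for example, a custom 404 page can be served while still returning the correct status code (the path is a placeholder):

    ```nginx
    server {
        # Serve a custom page while keeping the 404 status code
        error_page 404 /404.html;
        location = /404.html {
            internal;
        }
    }
    ```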

    Updated 07/03/2018 11:44
    Link from 404 page
    Link from 404 page found.


    When a non-existent page is requested, the server displays a standard 404 error page. For users' convenience we recommend creating a custom 404 page and adding a link back to the site on it.

    Updated 07/03/2018 11:44
    Analytics systems

    Statistics systems on the site track traffic, bounces, viewing depth, and many other indicators. They help measure the effectiveness of promotion and advertising campaigns.

    Updated 07/03/2018 11:44
    Technologies used on the site


    Web servers

    JavaScript frameworks

    Programming languages


    Page Encoding

    Encoding "UTF-8"


    With an incorrect encoding, site content may be displayed incorrectly. Besides annoying visitors, such a site may be indexed poorly or fall under a search engine filter. We recommend using UTF-8 encoding so that text on the site's pages displays correctly. Some CMSs, for example WordPress, write their files in this encoding, and AJAX requests also default to UTF-8.

    Don't forget to include the encoding in your meta tags: <meta charset="UTF-8" />

    Updated 07/03/2018 11:44
    Domain age
    6 years
    Young and new domains are harder to promote in highly competitive topics. The history of the domain and site also matters: old domains with a bad history are difficult to promote, while search engines favor old, on-topic domains with a clean history (no filters, spam, black-hat SEO, etc.).
    Updated 07/03/2018 11:44
    Expiration Date
    The domain was extended until 07/11/2019

    Don't forget to renew your domain name; it is best to enable auto-renewal with your registrar. Once the registration period ends, there is a chance of losing access to the domain.

    Updated 07/03/2018 11:44
    Server location
    Search engines take into account the country where the server is located. Ideally, the server is located in the same country as your target audience.
    Updated 07/03/2018 11:44
    Data center
    Beget Ltd


    In-page SEO

    Website Audit

    Website scan not started.

    An audit of website pages finds:

    • page response errors;
    • broken links;
    • meta tag problems.
    Meta Title

    Новостной портал «Euroinfo»

    Length: 27 characters.


    A webpage title is the headline of the page in search results and one of the main relevance signals considered by search engines. The title should contain keywords but need not include the website name, since the crawler (search robot) already knows the domain.

    Updated 07/03/2018 11:44
    Meta Description

    Новостной портал Euroinfo предоставляет самые актуальные новости мира и Украины. Новости сегодня, политика, происшествия, экономика, общество, спорт.

    Length: 149 characters.


    Description is a tag used to describe a page for a search engine crawler; users do not see it. It should describe the content of the page accurately, because search engines often use the Description text to compose a snippet. Place the keywords near the beginning, and do not repeat the text in other parts of the page. In the page code, the tag is placed between <head> and </head>. The optimal Description length is 150-300 characters for Google and up to 160 characters for Yandex. It does not influence rankings directly, but CTR depends on a good description.
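    Assuming a typical page, the Title and Description tags sit together in the head like this (the text is a placeholder):

    ```html
    <head>
      <title>Page headline with the main keyword</title>
      <!-- Not shown on the page; often used for the search snippet -->
      <meta name="description"
            content="A 150-300 character summary of the page, keywords near the beginning.">
    </head>
    ```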

    Updated 07/03/2018 11:44
    H1: 1
    H2: 8
    H3: 6
    H4: 0
    H5: 0
    H6: 0

    Более восьми миллионов украинцев выбрали семейного врача
    В США создали самый быстрый в мире суперкомпьютер

    The H1-H6 headers define the structure of the page's content. Mark them up to help the reader navigate the text. Headings matter for search promotion because search engines use them to determine what is on the page and how relevant it is. Arrange headers hierarchically and do not place links inside them.
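    A sketch of a correctly nested heading hierarchy:

    ```html
    <h1>Main topic of the page (exactly one H1)</h1>
      <h2>First section</h2>
        <h3>Subsection of the first section</h3>
      <h2>Second section</h2>
    ```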

    Updated 07/03/2018 11:44
    Text length

    3,250 characters


    For search engines, the relevance of the content matters more than the length of the text. Choose the amount of text depending on the topic and purpose, and look at competitors' materials. The optimal text length is 1000-2000 words for two or three promoted keywords/phrases.

    Updated 07/03/2018 11:44
    Number of words

    302 words

    For search engines, content relevance is more important than text length. Choose the amount of text depending on the topic and purpose, focus on the materials of competitors. The optimal text length is 1000-2000 words for two or three promoted keywords/phrases.

    Updated 07/03/2018 11:44
    Keywords density (without stop words)

    Keyword density is one of the qualitative indicators of a text: it shows how frequently words repeat in a document, measured as the proportion of repetitions of a word to the total volume of the text.

    A keyword density of 8% is considered high. Such texts are often hard to read and look spammy, and the pages hosting them have a high bounce rate. A site with many high-density texts may receive sanctions from search engines.

    Normal keyword density is 4-6%; almost all classical literature falls at this level.
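    As a sketch of how the metric can be computed (this naive version ignores stop words, which a real checker would exclude):

    ```python
    import re

    def keyword_density(text, keyword):
        """Share of the text's words taken up by `keyword` (case-insensitive)."""
        words = re.findall(r"\w+", text.lower())
        return words.count(keyword.lower()) / len(words)

    sample = "seo tips and seo tools help seo beginners learn seo"
    density = keyword_density(sample, "seo")
    print(f"{density:.0%}")  # 4 of 10 words -> 40%, far above the normal 4-6%
    ```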

    HTML size
    611 KB


    The optimal page size is considered to be up to 100 KB after compression. Delete unnecessary elements and use gzip compression to reduce the size.

    Updated 07/03/2018 11:44
    External links
    2 links found.


    External links are links from your site to other sites. Try not to link to resources with inaccurate information or ones unrelated to your topic; choose useful and authoritative ones. Do not place too many outgoing external links, and do not put them on your homepage. Selling links negatively affects promotion.

    Updated 07/03/2018 11:44
    Internal Links
    127 links, of which 115 are indexed.


    Internal links let you redistribute weight between individual pages of the resource by linking to more significant sections or articles. This weight redistribution is called internal linking and is part of on-site optimization.

    Internal links also influence behavioral factors: they simplify navigation and help the user reach the right section faster.

    Updated 07/03/2018 11:44
    Open Graph markup
    Not found.


    Open Graph was developed by Facebook to make links to sites display nicely and informatively within the social network. Open Graph is now supported by many social networks and messengers, including Facebook, Twitter, Telegram, and Skype.

    Why use Open Graph?

    1. so that users see relevant text and an image in the link preview;
    2. to improve the site's behavioral factors: a properly designed link gets more conversions;
    3. to make the link snippet look like an independent post on a personal page or in a community, with no need to add a description and pictures manually.

    Insert Open Graph meta tags into the <head> of the page to get an attractive site snippet.
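    A minimal set of Open Graph tags might look like this (all values are placeholders):

    ```html
    <head>
      <meta property="og:title"       content="Page headline for the preview">
      <meta property="og:description" content="Short text shown under the headline">
      <meta property="og:image"       content="https://example.com/preview.jpg">
      <meta property="og:url"         content="https://example.com/article">
      <meta property="og:type"        content="article">
    </head>
    ```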

    Updated 07/03/2018 11:44
    Schema.org markup

    Not found.


    Micro-markup is semantic markup of website pages that structures their data by adding special attributes to the document's HTML code. Schema.org is a globally recognized standard supported by the most popular search engines, such as Google, Yandex, Yahoo, and Bing.

    Pros of micro-markup:

    1. The logical structure of the information on the page helps search engines retrieve and process data.
    2. Enhanced snippets on the search results page improve click-through rates.
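    One common way to add Schema.org markup is a JSON-LD block in the page head; a sketch for a news site (the values are illustrative only):

    ```html
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "NewsArticle",
      "headline": "Example headline",
      "datePublished": "2018-07-03",
      "author": { "@type": "Organization", "name": "Euroinfo" }
    }
    </script>
    ```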
    Updated 07/03/2018 11:44
    Favicon
    Great, the site has a favicon.


    To make your site stand out, use a favicon: a special image that appears next to your site's address in search results and in the browser's address bar.

    To make browsers show your site's icon, put it in the site's root folder. You can also assign different icons to individual pages.
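    Besides the root-folder convention, the icon can also be declared explicitly in the head (the path is a placeholder):

    ```html
    <link rel="icon" href="/favicon.ico" type="image/x-icon">
    ```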

    Updated 07/03/2018 11:44
    Twitter
    Not found.
    Historical data found for this test.


    Search engines eagerly index tweets, and links in tweets are also indexed, including indirect ones (for example, through intermediary services). Twitter is crawled by fast robots. For a link from Twitter to affect a site's promotion in Google, it must be indexed by the search engines.

    Twitter helps promote the site and speeds up its indexing.

    Updated 07/03/2018 11:44
    Facebook
    Not found.


    Search engines index links from Facebook. The most useful are likes of the company's own social page: the more likes the promoted site's Facebook page gets, the more links from users' pages the site will receive. Likes from authoritative accounts and popular pages are the most valuable. Shares of your Facebook page have an effect similar to backlinks.

    The system did not detect Facebook groups or profiles. It is important to link to the business page from the site, and the social network page should in turn link back to your site.

    Updated 07/03/2018 11:44

    Google PageSpeed Insights

    Avoid Plugins
    No plugins found.


    Plug-ins help the browser process special content such as Flash, Silverlight, or Java. Most mobile devices do not support plug-ins, and plug-ins are a major source of errors and security problems in the browsers that do support them. For this reason, many browsers restrict or disable them.

    Updated 07/03/2018 11:44
    Configure Viewport
    The site is displayed correctly on all devices.


    Without a viewport declaration, mobile devices try to display pages as on a PC, scaling them down in proportion to the screen size. Specify the viewport tag so your site displays correctly on all devices.

    The viewport determines how a web page is displayed on a mobile device. If it is not specified, the page width is assumed to be the PC standard and the page is shrunk to fit the screen. With the viewport you control the page width and its scaling on different devices.
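    The standard declaration for responsive pages looks like this:

    ```html
    <!-- Match the layout width to the device and start at 1:1 scale -->
    <meta name="viewport" content="width=device-width, initial-scale=1">
    ```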

    Updated 07/03/2018 11:44
    Gzip compression
    Compression is enabled.

    Many web servers can compress files in GZIP format before sending them, either with built-in routines or with third-party modules. This allows faster loading of the resources needed to display the website.

    Compressing resources with gzip or deflate reduces the amount of data transferred over the network.
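    The effect is easy to demonstrate with Python's standard gzip module on a repetitive HTML payload (the markup is made up):

    ```python
    import gzip

    # Repetitive markup compresses extremely well
    html = b"<html><body>" + b"<p>Lorem ipsum dolor sit amet.</p>" * 200 + b"</body></html>"
    compressed = gzip.compress(html)
    print(len(html), "->", len(compressed), "bytes")
    ```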

    Updated 07/03/2018 11:44
    Image compression
    Images are optimized.

    Try to reduce image sizes to a minimum; this speeds up resource loading. The right format and compression settings can shrink images considerably. Perform basic and advanced optimization on all images: as part of basic optimization, trim unnecessary margins, reduce the color depth to the minimum acceptable value, remove comments, and save images in an appropriate format. Basic optimization can be done in any image editor.

    Screen area
    The entire page fits within the screen.

    Users of PCs and mobile devices are used to scrolling websites vertically, not horizontally. If viewing the content requires horizontal scrolling or zooming out, it causes inconvenience.

    When developing a mobile site with a meta viewport tag, position the content so that it fits into the specified viewport. For example, if an image is wider than the viewport, horizontal scrolling may be needed. To avoid this, adjust the content so that it fits entirely.

    Updated 07/03/2018 11:44
    The screenshot of the website on your smartphone

    Website design for mobile phones solves two problems: it gives users comfortable viewing of the site from any device, and it has a positive effect on the site's search ranking.

    Check that your site displays correctly on mobile devices.

    Updated 07/03/2018 11:44
    Font sizes
    The font size and line height on your website make the text comfortable to read.

    One of the most common problems when reading sites on mobile devices is a font size that is too small. Having to constantly zoom the page to read small text is very annoying. Even if a site has a mobile version or an adaptive design, poor readability caused by small fonts is not uncommon.

    Use legible font sizes to make your site more convenient.

    Updated 07/03/2018 11:44
    Your browser cache
    Caching is configured correctly.

    Thanks to the cache, users revisiting your website spend less time loading pages. Caching headers should be applied to all cacheable static resources.
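    On nginx, for instance, caching headers for static assets can be set like this (the 30-day lifetime is an arbitrary example):

    ```nginx
    # Cache static assets in the browser for 30 days
    location ~* \.(css|js|png|jpg|svg|woff2)$ {
        expires 30d;
        add_header Cache-Control "public";
    }
    ```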

    Updated 07/03/2018 11:44
    Minify Resources
    Static resources are not minified.


    The resource size can be reduced by removing unnecessary page elements such as extra spaces, line breaks, and indentation. By minifying HTML, CSS, and JavaScript, you can speed up page loading, parsing, and rendering.
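    A toy illustration of the idea in Python (real minifiers such as html-minifier or terser are far more careful):

    ```python
    import re

    def naive_minify(source):
        """Remove whitespace between tags, then collapse remaining runs."""
        source = re.sub(r">\s+<", "><", source)  # drop whitespace between tags
        return re.sub(r"\s+", " ", source).strip()

    page = """
    <html>
        <body>
            <h1>Title</h1>
        </body>
    </html>
    """
    small = naive_minify(page)
    print(len(page), "->", len(small), "characters")
    ```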

    Updated 07/03/2018 11:44