

Domain score

  • 10 successful tests
  • 4 errors
  • 9 warnings



    Site-level parameters

    Google Indexed
    History data is found for this test!
    Changes were found in the history over 1 day. First date: Sep 2021.


    It is impossible to know exactly how many pages of a site Google has correctly indexed: the search engine does not publish a per-URL database.

    The approximate number of pages in the search results is shown by the `site:` operator, which we rely on. The number can be distorted by pages that are banned from indexing in robots.txt but still reach the search results because of external links to them.

    The “Indexing status” section in Google Search Console shows a somewhat more accurate number, but that data can also be distorted by applied filters.

    Updated 09/03/2021 20:59
    Google Safe Browsing
    No data.

    Google scans websites to find infected resources, phishing pages, and other issues that degrade the search results and user experience. With this information, the search engine warns users about unsafe sites. If a site is deemed dangerous, Google might downgrade or remove it.
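    This kind of check can be automated against Google's Safe Browsing Lookup API (v4). The sketch below builds the request body; the client name and API key are placeholders, and the network call itself is only shown, not required:

```python
import json
import urllib.request

API_ENDPOINT = "https://safebrowsing.googleapis.com/v4/threatMatches:find"

def build_lookup_payload(url, client_id="my-seo-checker"):
    """Build a Safe Browsing v4 threatMatches.find request body.

    `client_id` is a placeholder name for your own application.
    """
    return {
        "client": {"clientId": client_id, "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": url}],
        },
    }

def check_url(url, api_key):
    """POST the lookup request; an empty response body means no threats found."""
    req = urllib.request.Request(
        f"{API_ENDPOINT}?key={api_key}",  # api_key must come from your Google Cloud project
        data=json.dumps(build_lookup_payload(url)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

    An empty JSON object in the response means Safe Browsing has no matches for the URL.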


    Updated 09/03/2021 20:59


    Technical SEO

    Analytics systems
    Not found.

    Analytics systems installed on a site track traffic, bounces, viewing depth, and many other metrics. They help measure the effectiveness of promotion and advertising campaigns.

    Updated 09/03/2021 21:00
    Tag History
    Not found.

    The test shows active and previously removed analytics counters and related sites. This information can be useful for competitor research: if a competitor has projects unknown to you whose statistics are managed from one account, you can find them.

    If something goes wrong with your own counters, you can discover that here too.

    Sometimes web studios install counters on clients' sites and manage them from the same account as their own site's counter. By analyzing the studio's website with this test, you can find out who its clients are.

    Updated 09/03/2021 21:00
    WWW redirect
    Redirection is configured.


    Technically, domains with www and without www are two different resources: search engines index and rank them separately, and links to them carry weight separately. This can lead to:

    • a drop in search rankings;
    • a duplicate-content filter, because a search engine may mistake one site for a copy of the other;
    • problems with authorization and other functionality that relies on cookies.

    The problem is solved with a 301 redirect and by indicating the main mirror to the search engines. From a website promotion point of view, a domain without www is better: it is not a third-level domain, and it is always shorter.
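    The canonicalization rule above can be sketched as a small helper. This is an illustrative Python sketch of the logic, not server configuration; in practice the 301 redirect is configured in the web server itself:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_redirect(url):
    """Return (status, target) for a www URL, or (200, url) if already canonical.

    Implements the "main mirror without www" policy with a 301 redirect.
    """
    parts = urlsplit(url)
    if parts.netloc.lower().startswith("www."):
        # Strip the leading "www." and send a permanent redirect.
        target = urlunsplit(parts._replace(netloc=parts.netloc[4:]))
        return 301, target
    return 200, url
```

    For example, `canonical_redirect("https://www.example.com/page")` yields a 301 pointing at `https://example.com/page`.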

    Updated 09/03/2021 20:59
    Technologies used on the site


    Page Encoding

    Encoding "ISO-8859-1"


    Due to incorrect encoding, site content may be displayed incorrectly. Besides annoying visitors, this can keep the site from being indexed or put it under a search engine filter. We recommend using UTF-8 so that text on the site's pages is displayed correctly. Some CMSs, for example WordPress, write their files in this encoding, and AJAX requests expect UTF-8 by default.

    Don't forget to include the encoding in your meta tags: <meta charset="UTF-8" />
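    The declared encoding can be checked with Python's standard html.parser module. This is a minimal sketch that looks only at meta tags, not HTTP headers:

```python
from html.parser import HTMLParser

class CharsetSniffer(HTMLParser):
    """Extract the encoding declared in <meta charset=...> or in
    <meta http-equiv="Content-Type" content="...; charset=...">."""

    def __init__(self):
        super().__init__()
        self.charset = None

    def handle_starttag(self, tag, attrs):
        if tag != "meta" or self.charset:
            return
        attrs = dict(attrs)
        if "charset" in attrs:
            self.charset = attrs["charset"]
        elif attrs.get("http-equiv", "").lower() == "content-type":
            content = attrs.get("content", "")
            if "charset=" in content:
                self.charset = content.split("charset=")[-1].strip()

def declared_charset(html):
    sniffer = CharsetSniffer()
    sniffer.feed(html)
    return sniffer.charset
```

    A page declaring `<meta charset="UTF-8" />` passes the check; a page declaring ISO-8859-1, as found above, is flagged for conversion.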

    Updated 09/03/2021 21:00
    Robots.txt
    The robots.txt file was found and the site is allowed for indexing, but the home page is disallowed for the following bots:

    • ahrefsbot
    • mj12bot


    A robots.txt file is a list of restrictions for the search robots (bots) that visit a site and crawl its content. Before crawling and indexing a site, robots request the robots.txt file and read its rules.

    The robots.txt file is located in the root directory of the site. It must be available at the URL:

    There are several reasons to use a robots.txt file on your site:

    1. to remove duplicate content;
    2. to hide unwanted information;
    3. to limit the indexing speed.
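    Rules like the ones found above can be checked programmatically with Python's standard urllib.robotparser. The rules below mirror the report's finding (AhrefsBot and MJ12Bot disallowed, everyone else allowed); example.com is a placeholder domain:

```python
from urllib import robotparser

# A robots.txt that blocks two crawlers but allows everyone else.
RULES = """\
User-agent: AhrefsBot
Disallow: /

User-agent: MJ12Bot
Disallow: /

User-agent: *
Disallow:
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(RULES)

print(parser.can_fetch("AhrefsBot", "https://example.com/"))  # False: blocked
print(parser.can_fetch("Googlebot", "https://example.com/"))  # True: allowed
```

    In production you would call `set_url(".../robots.txt")` and `read()` instead of feeding the rules inline.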
    Updated 09/03/2021 21:00
    XML sitemap
    Not found.


    A Sitemap file is a file with information about the pages on the site to be indexed. With this file you can:

    • tell search engines which pages of your site should be indexed;
    • tell them how often the information on the pages is updated;
    • indicate which pages are most important to index.
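    A minimal sitemap can be generated with Python's standard xml.etree module; the URLs, dates, and change frequencies below are illustrative placeholders:

```python
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """Render a list of page dicts into sitemap.xml markup.

    Each dict needs "loc" and may carry "lastmod" and "changefreq".
    """
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for page in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page["loc"]
        for field in ("lastmod", "changefreq"):
            if field in page:
                ET.SubElement(url, field).text = page[field]
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    {"loc": "https://example.com/", "changefreq": "daily"},
    {"loc": "https://example.com/about", "lastmod": "2021-09-03"},
])
```

    The resulting string is what you would save as `sitemap.xml` in the site root and reference from robots.txt.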

    Additional Information:

    Updated 09/03/2021 21:00
    Status Code

      200 OK: successful resource request.

    For successful indexing of the page by search bots, the HTTP response code of the server must be 200.

    Additional Information:

    • [Checking the server response of the internal pages of the site](/tools/http-response/)
    • List of status codes
    Updated 09/03/2021 20:59
    HTML loading speed
    10.19 sec, faster than 0% of tested sites.

    Loading speed directly affects behavioral factors: the faster the page loads, the fewer bounces. The Google robot visits slow sites less often, which hurts promotion, since such sites are indexed less frequently. Loading speed is one of the main Google ranking factors.

    Updated 09/03/2021 20:59
    IP History

    We found 1 site with the same IP address.


    The IP addresses ever observed for the site, as well as other sites that share the same IP address.

    Updated 09/03/2021 20:59
    Server location
    United States
    Search engines consider the country where the server is located. The ideal situation is when the server is in the same country as your target audience.
    Updated 09/03/2021 20:59
    404 page response code
    Great, received code 404.


    When a non-existent page is requested, the server should return a 404 ("page not found") error.

    If the server is set up incorrectly and returns a 200 code instead, the page appears to exist. In that case, search engines may index all of the site's error pages.

    Configure the site so that requests for non-existent pages return a 404 ("page not found") or 410 ("gone") response code.
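    The recommended behavior can be sketched as a routing rule: known pages return 200, pages deliberately removed return 410, and everything else returns 404. The paths here are illustrative placeholders:

```python
KNOWN_PATHS = {"/", "/about", "/contact"}  # pages that exist
DELETED_PATHS = {"/old-promo"}             # pages removed on purpose

def response_code(path):
    """Pick the HTTP status code a well-configured server should return."""
    if path in KNOWN_PATHS:
        return 200  # OK: page exists and may be indexed
    if path in DELETED_PATHS:
        return 410  # Gone: tell crawlers the page was deliberately removed
    return 404      # Not Found: never answer 200 for unknown URLs
```

    The key point is the last line: an unknown path must never fall through to a 200 response.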

    Updated 09/03/2021 20:59
    Link from 404 page
    The link from the 404 page was not found.


    When a non-existent page is requested, the server displays a standard 404 error page. For users' convenience, we recommend creating a custom 404 page and adding a link back to the site on it.

    Updated 09/03/2021 20:59


    In-page SEO

    Website Audit

    Website scan not started.

    An audit of the website's pages checks for:

    • page response errors;
    • broken links;
    • meta tag problems.
    Meta Title

    500 Internal Server Error

    Length: 25 characters.


    A webpage title is the headline of the page in search results and one of the main relevance signals considered by search engines. The title should contain keywords but need not contain the website name, since the crawler (search robot) already knows the domain.

    Updated 09/03/2021 21:00
    Meta Description

    Length: 0 characters.

    Description: (empty)

    Description is a meta tag used to describe a page to a search engine crawler; users do not see it on the page. It should describe the page's content accurately, because search engines often use the Description text to compose the snippet. Place keywords toward the beginning, and do not repeat the text elsewhere on the page. In the page code, the tag is placed between <head> and </head>.

    The optimal Description length is 150-300 characters for Google and up to 160 characters for Yandex. It does not influence SEO directly, but a good description improves CTR.
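    Extracting and length-checking the Description tag can be done with the standard html.parser module; the length bounds below follow the 150-300 character recommendation for Google:

```python
from html.parser import HTMLParser

class DescriptionExtractor(HTMLParser):
    """Capture the content of <meta name="description" content="...">."""

    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

def check_description(html, lo=150, hi=300):
    """Return (description, ok) where ok means the length fits the range."""
    extractor = DescriptionExtractor()
    extractor.feed(html)
    text = extractor.description or ""
    return text, lo <= len(text) <= hi
```

    A missing or empty Description, as found on this page, yields an empty string and fails the length check.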

    Updated 09/03/2021 21:00
    H1: 1
    H2: 0
    H3: 0
    H4: 0
    H5: 0
    H6: 0

    H1: Internal Server Error

    The H1-H6 headings define the structure of the page's content. Mark them up to help the reader navigate the text. Headings matter for search engine promotion because search engines use them to determine what a page is about and how relevant it is. Arrange headings hierarchically and do not place links inside them.

    Updated 09/03/2021 21:00
    Text length

    510 characters


    For search engines, content relevance matters more than text length. Choose the amount of text based on the topic and purpose, using competitors' materials as a reference. The optimal text length is 1000-2000 words for two or three promoted keywords/phrases.

    Updated 09/03/2021 21:00
    Number of words

    42 words

    For search engines, content relevance is more important than text length. Choose the amount of text depending on the topic and purpose, focus on the materials of competitors. The optimal text length is 1000-2000 words for two or three promoted keywords/phrases.

    Updated 09/03/2021 21:00
    Keywords density (without stop words)

    Keyword density is one of the qualitative indicators of a text: it shows how often words are repeated in a document, measured as the ratio of a repeated word's occurrences to the total length of the text.

    A keyword density of 8% is considered high ("keyword stuffing"). Such texts are often hard to read and look spammy, and the pages hosting them have high bounce rates. A site with many high-density texts may receive sanctions from search engines.

    Normal keyword density is 4-6%; almost all classical literature falls within this range.
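    Keyword density as described above can be computed in a few lines of Python; the stop-word list here is a tiny illustrative placeholder, not a complete one:

```python
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is"}  # illustrative

def keyword_density(text):
    """Return {word: share} for non-stop words, as a fraction of counted words."""
    words = [
        w for w in re.findall(r"[a-zA-Z']+", text.lower())
        if w not in STOP_WORDS
    ]
    total = len(words) or 1
    return {word: count / total for word, count in Counter(words).items()}

density = keyword_density(
    "SEO tips: good SEO needs good content, not keyword stuffing."
)
```

    Here "seo" and "good" each account for 20% of the counted words, well above the 4-6% norm, which is expected for such a short sample.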

    External links

    0 links found.


    External links are links from your site to other sites. Try not to link to resources with incorrect information or ones unrelated to your topic; choose useful, authoritative ones. Do not place too many outgoing external links, and avoid putting them on your homepage. Selling links hurts promotion.

    Updated 09/03/2021 21:00
    Internal Links

    0 links found.


    With internal links you can redistribute weight between individual pages of the site by linking to the more significant sections or articles. This weight redistribution is called interlinking and is part of internal site optimization.

    Internal links also influence behavioral factors: they simplify navigation and help users reach the section they need faster.

    Updated 09/03/2021 21:00
    Adult content

    Not found.


    The service searches sites for words that can be classified as pornography. Search engines restrict the display of 18+ content, so they remove sites with pornographic materials from the rankings for other queries.

    Even if you have not posted such materials on the site, they may appear as a result of a hacked site or in the comments.

    What to do:

    • do not use words, images and videos of a pornographic nature;
    • check ads on your site, switch off showing 18+ ads if they are shown by default;
    • check users’ reviews and comments;
    • if you link to another site, make sure it contains no adult content or links to such sites.
    Updated 09/03/2021 21:00
    Open Graph markup
    Not found.


    Open Graph was developed by Facebook so that links shared within the social network are displayed nicely and informatively. Open Graph is now supported by many social networks and messengers, for example Facebook, Twitter, Telegram, and Skype.

    Why use Open Graph?

    1. so that users see relevant text and an image in the link preview;
    2. to improve the site's behavioral factors: a properly designed link brings more conversions;
    3. to make the link snippet look like a standalone post on a personal page or in a community, with no need to add a description and pictures manually.

    Insert the Open Graph meta tags into the <head> section of the page code to get an attractive site snippet.
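    Generating the tags is straightforward; the sketch below renders Open Graph meta tags for the page <head>. The property values are placeholders:

```python
from html import escape

def og_meta_tags(properties):
    """Render a dict of Open Graph properties into <meta> tags for the <head>."""
    return "\n".join(
        f'<meta property="og:{name}" content="{escape(value, quote=True)}" />'
        for name, value in properties.items()
    )

tags = og_meta_tags({
    "title": "Example page",  # placeholder values
    "type": "website",
    "url": "https://example.com/",
    "image": "https://example.com/preview.png",
})
```

    The title, type, url, and image properties are the four that Open Graph requires for every page.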

    Updated 09/03/2021 21:00
    Schema.org markup

    Not found.


    Micro-markup is semantic markup of website pages that structures their data by adding special attributes to the HTML code of the document. Schema.org is a globally recognized standard supported by the most popular search engines, including Google, Yandex, Yahoo, and Bing.

    Pros of micro-markup:

    1. The logical structure of the information on the page helps search engines retrieve and process data.
    2. Enhanced snippets on the search results page improve click-through rates.
    Updated 09/03/2021 21:00
    The site does not have a favicon.


    To make your site stand out, use a favicon: a small icon displayed next to your site's address in search results and in the browser's address bar.

    To make browsers show your site’s icon, put it into your site’s root folder. You can assign different icons to individual pages.

    Updated 09/03/2021 21:00
    Twitter
    Not found.
    History data is found for this test!


    Search engines eagerly index tweets, including the links in them, even indirect ones (for example, links passed through intermediate services). Twitter is crawled by fast-indexing robots. For a link from Twitter to have an impact on a website's promotion in Google, it must be indexed by the search engines.

    Twitter helps to promote the site and speeds up its indexing.

    Updated 09/03/2021 21:00
    Facebook
    Not found.


    Search engines index links from Facebook. The most useful are the likes of the company's social page itself. The more likes the Facebook page of the promoted site gets, the more links from the pages of social network users the site will receive. The most valuable are the likes from authoritative accounts and popular pages. Sharing your Facebook page is similar in effect to backlinks.

    The check does not detect Facebook groups or personal profiles, so it is important to link to a business page on the social network. A link to your site should also be present on that page.

    Updated 09/03/2021 21:00

    Google PageSpeed Insights