sorry-about-the-mess.co.uk
Domain score
- 30 successful tests
- 3 errors
- 3 warnings
Site-level parameters
Description:
It is impossible to know exactly how many pages of the site Google has indexed: the search engine does not publish a per-URL database.
An approximate number of pages in the search results is given by the site: operator, which is what we rely on. The figure can be distorted by pages that are blocked from indexing in robots.txt but still appear in the results because of external links pointing to them.
The “Indexing Status” section in Google Search Console shows a somewhat more accurate number, but that data can also be skewed by the filters applied.
Google scans websites to find infected resources, phishing pages, and other issues that degrade the search results and user experience. With this information, the search engine warns users about unsafe sites. If a site is deemed dangerous, Google might downgrade or remove it.
Additional Information:
Rank — 50 / 100
This indicator shows the site's current PR-CY Rank and links to the full analysis of the website.
Description:
PR-CY Rank is a rating that assesses a site's prospects as a donor for link building. The rating is based on traffic and trust parameters as well as the site's link profile.
Influence - the potential impact of the site on promotion. If the influence is weak, both the negative effect (when the rating is low) and the positive effect (when the rating is high) will be weak, and vice versa. The influence potential is based on the size of the site's regular audience.
Link factor - calculated from the ratio of incoming and outgoing links, together with values such as Trust Rank and Domain Rank.
Traffic factor - calculated from the volume and dynamics of traffic (negative dynamics lower the rating, positive dynamics raise it).
Trust factor - analyzes many parameters, such as "ICS", the share of search traffic in total traffic, mobile-friendliness and many other factors that search engines treat as significant for ranking.
Traffic
Traffic sources
| Traffic source | Share |
| --- | --- |
| Direct traffic | ### |
| Email | ### |
| Referring sites | ### |
| Social networks | ### |
| Search engines | ### |
| Paid search | ### |
Social traffic
No data
Geography of visitors
Traffic Rank
Similar sites
Technical SEO
- http://sorry-about-the-mess.co.uk: 301 Moved Permanently
- https://sorry-about-the-mess.co.uk/: 200 OK
- Successful resource request.
For a page to be indexed by search bots, the server must return HTTP response code 200 (a quick way to check this is sketched below).
Additional Information:
- [Checking the server response of the internal pages of the site](/tools/http-response/)
- List of status codes
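A quick way to verify these response codes from the command line, assuming curl is available (any HTTP client will do):

```
# Should print 200 for the main mirror and 301 for the http:// version
curl -s -o /dev/null -w '%{http_code}\n' https://sorry-about-the-mess.co.uk/
curl -s -o /dev/null -w '%{http_code}\n' http://sorry-about-the-mess.co.uk/
```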
Description:
Pages that are closed from indexing cannot be ranked in search results, so it is important to make sure that your pages are accessible and open for indexing.
Description:
A robots.txt file is a list of restrictions for the search robots (bots) that visit a site and crawl its content. Before crawling and indexing a site, robots check the robots.txt file for rules.
The robots.txt file is located in the root directory of the site and must be available at a URL of the form pr-cy.io/robots.txt.
There are several reasons to use a robots.txt file on your site (a minimal example follows this list):
- to keep duplicate content out of the index;
- to hide unwanted information;
- to limit the crawl rate.
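A minimal illustrative robots.txt; the directives and paths here are examples only, not the site's actual rules:

```
# Allow all robots, but keep a service section out of the index (hypothetical path)
User-agent: *
Disallow: /private/
# Point crawlers to the sitemap (assumed location)
Sitemap: https://sorry-about-the-mess.co.uk/sitemap.xml
```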
At least one sitemap has been found and is available.
Description:
A Sitemap is a file with information about the pages of the site to be indexed (a minimal example is sketched after this list). With this file you can:
- tell search engines which pages of your site should be indexed;
- indicate how often the information on the pages is updated;
- show which pages are the most important to index.
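A minimal sitemap.xml sketch in the standard sitemaps.org format; the date and optional hints are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://sorry-about-the-mess.co.uk/</loc>
    <!-- lastmod, changefreq and priority are optional hints for crawlers -->
    <lastmod>2024-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```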
Additional Information:
For promoting commercial sites, the confidentiality of the information exchanged between the server and visitors is important. It increases the loyalty of potential customers and their trust in the resource, and it affects conversion and the growth of positions in the search results for almost all queries.
Description:
Technically, a domain with www and the same domain without www are two different resources: search engines index and rank them separately, and links to them carry different weight. This can lead to:
- a drop in search rankings;
- a filter, because a search engine may mistake one site for a duplicate of the other;
- problems with authorization on the site and other functionality that relies on cookies.
The problem is solved with a 301 redirect and by indicating the main mirror to the search engines (a sample redirect configuration is sketched below). From the website promotion point of view, the domain without www is preferable: it is not a third-level domain and is always shorter.
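A sample 301 redirect from the www mirror to the bare domain, assuming an nginx front end (Apache rules or a CMS plugin achieve the same result):

```nginx
# Redirect every request for the www host to the main mirror with a 301
server {
    server_name www.sorry-about-the-mess.co.uk;
    return 301 https://sorry-about-the-mess.co.uk$request_uri;
}
```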
Loading speed directly affects behavioral factors: the faster a page loads, the fewer bounces. The Google robot visits slow sites less often, which hurts promotion because such sites are crawled and indexed less frequently. Loading speed is one of the main ranking factors in Google.
Description:
When a non-existent page is requested, the server should return a 404 ("page not found") error.
If the server is configured incorrectly and returns a 200 code instead, search engines treat the page as existing and may index all such error pages.
Set up the site so that requests for non-existent pages return response code 404 "page not found" or 410 "page deleted".
Description:
When a non-existent page is requested, the server displays a standard 404 error page. For users' convenience, we recommend creating a custom 404 page with a link back to the site (a configuration sketch follows).
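A sketch of serving a custom 404 page, again assuming nginx; the /404.html path is hypothetical:

```nginx
server {
    # ... other directives for the main site ...
    # Return the 404 status code together with a custom error page
    error_page 404 /404.html;
    location = /404.html {
        internal;  # the page is only served for errors, not by direct request
    }
}
```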
Analytics counters installed on the site track traffic, bounces, browsing depth and many other indicators. They help to measure the effectiveness of promotion and advertising campaigns.
We found 1 site with the same tag.
The test shows active and previously removed analytics counters and the sites related to them. This information can be useful if a competitor has projects unknown to you whose statistics are managed from a single account: you can discover them here.
If something goes wrong with your own counters, you can also find that out here.
Sometimes web studios install counters on clients' sites and manage them from the same account as the counter on their own site. You can analyze the studio's website and, thanks to this test, find out who its clients are.
Description:
An incorrect encoding can cause site content to be displayed incorrectly. Besides annoying visitors, such a site may be indexed poorly or fall under a search engine filter. We recommend using UTF-8 encoding so that the text on the site's pages is displayed correctly. Some CMSs, for example WordPress, store their files in this encoding, and AJAX requests also assume UTF-8.
Don't forget to include the encoding in your meta tags: <meta charset="UTF-8" />
Don't forget to renew your domain name; it is best to enable auto-renewal with your registrar. Once the registration expires, you risk losing access to the domain.
We found 1 site with the same IP address.
This test lists the IP addresses detected for the site over time, as well as other sites that share the same IP address.
In-page SEO
Website scan not started.
Audit of website pages:
- page response errors;
- broken links;
- problems with meta tags.
Sorry About The Mess | A UK parenting and family travel blog
Length: 60 characters. Found: 1 tag.
Description:
The webpage title is the headline of the page in search results and one of the main relevance signals considered by search engines. The title should contain keywords, but it does not need to include the website name, since the crawler (search robot) already knows the domain.
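For reference, the title found on this page corresponds to a tag like the following in the page's <head>:

```html
<title>Sorry About The Mess | A UK parenting and family travel blog</title>
```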
Family travel, home, lifestyle, and parenting blog written by a London mum of four.
Length: 83 characters. Found: 1 tag.
Description:
Description is a meta tag used to describe a page for search engine crawlers; users do not see it on the page itself. It should describe the content of the page accurately, because search engines often use the Description text to compose a snippet. Place the keywords near the beginning and do not repeat the text in other parts of the page. In the page code, the tag is placed between <head> and </head>. The optimal Description length is 150-300 characters for Google and up to 160 characters for Yandex. It does not influence SEO directly, but CTR depends on a good description.
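The description found on this page (shown above) corresponds to a meta tag like this inside <head>:

```html
<meta name="description" content="Family travel, home, lifestyle, and parenting blog written by a London mum of four.">
```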
The H1-H6 headings define the structure of the page's content. Mark them up in the layout to help the reader navigate the text. Headings are important for search engine promotion because search engines use them to determine what is on the page and how relevant it is. Arrange headings according to their hierarchy and do not place links inside them.
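A schematic heading hierarchy as described above; the heading texts are placeholders:

```html
<h1>Main topic of the page (one per page)</h1>
<h2>First section</h2>
<h3>Subsection of the first section</h3>
<h2>Second section</h2>
```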
8,733 characters
Description:
For search engines, the relevance of the content is more important than the length of the text. Choose the amount of text based on the topic and purpose, and use competitors' materials as a reference. The optimal text length is 1000-2000 words for two or three promoted keywords/phrases.
795 words
Keyword density is one of the qualitative indicators of a text: it shows how often words are repeated in a document and is calculated as the proportion of repeated words in the total volume of the text.
A keyword density of around 8% is considered high. Such texts are often hard to read and look spammy, and the pages hosting them have a high bounce rate. A site with many high-density texts may receive sanctions from search engines.
A normal keyword density is 4-6%; almost all classical literature falls within this range.
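As a worked example with a hypothetical repetition count: if a keyword (including its word forms) appears 40 times in this page's 795-word text, the density is 40 / 795 ≈ 5%, which falls inside the normal 4-6% range.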
Description:
The optimal page size is considered to be up to 100 KB after compression. Delete unnecessary elements and use gzip compression to reduce the size.
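A sketch of enabling gzip compression for text resources, assuming an nginx server (other servers have equivalent settings):

```nginx
# Compress text-based responses before sending them to the browser
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;
gzip_min_length 1024;  # skip very small files where compression gains little
```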
Description:
External links are links from your site to other sites. Try not to link to resources with incorrect information or ones unrelated to your topic; choose useful and authoritative ones. Do not place too many outgoing external links, and do not put them on your homepage. Selling links negatively affects promotion.
Description:
With the help of internal links you can redistribute weight between individual pages of the resource by linking to the more significant sections or articles. This redistribution of weight is called internal linking and is part of on-site optimization.
Internal links also influence behavioral factors: they simplify navigation and help the user reach the section they need faster.
Not found.
Description:
The service searches the site for words that can be classified as pornographic. Search engines fight the display of 18+ content, so they remove sites with pornographic materials from the rankings for other queries.
Even if you have not posted such materials yourself, they may appear as a result of a hack or in the comments.
What to do:
- do not use words, images or videos of a pornographic nature;
- check the ads on your site and switch off 18+ ads if they are shown by default;
- check user reviews and comments;
- if you link to another site, make sure it contains no adult content or links to such sites.
Description:
Open Graph was developed by Facebook so that links to sites shared within the social network are displayed nicely and informatively. Open Graph is now supported by many social networks and messengers, including Facebook, Twitter, Telegram and Skype.
Why use Open Graph?
- so that users see a relevant text and image in the link preview;
- to improve the site's behavioral factors: a properly designed link gets more clicks;
- to make the link snippet look like a standalone post on a personal page or in a community, without having to add a description and pictures manually.
To get a nice site snippet, insert the Open Graph meta tags into the <head> section of the page code, as in the sketch below.
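A sketch of Open Graph tags for this site's head section, reusing the title and description found above; the og:image URL is hypothetical:

```html
<meta property="og:title" content="Sorry About The Mess | A UK parenting and family travel blog">
<meta property="og:description" content="Family travel, home, lifestyle, and parenting blog written by a London mum of four.">
<meta property="og:type" content="website">
<meta property="og:url" content="https://sorry-about-the-mess.co.uk/">
<meta property="og:image" content="https://sorry-about-the-mess.co.uk/og-image.jpg">
```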
Found.
Description:
Micro-markup is semantic markup of website pages that structures their data. It works by adding special attributes to the HTML code of the document.
Schema.org is a globally recognized standard supported by the most popular search engines, such as Google, Yandex, Yahoo and Bing.
Pros of micro-markup (an example follows this list):
- A logical structure of the information on the page helps search engines extract and process the data.
- Enhanced snippets on the search results page improve click-through rates.
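A minimal Schema.org markup sketch in JSON-LD form; the values are taken from the page data above, and the choice of the Blog type is an assumption:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Blog",
  "name": "Sorry About The Mess",
  "url": "https://sorry-about-the-mess.co.uk/",
  "description": "Family travel, home, lifestyle, and parenting blog written by a London mum of four."
}
</script>
```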
Description:
To make your site stand out, use a favicon: a small image that appears next to your site's address in search results and in the browser's address bar.
To make browsers show your site's icon, put it in the site's root folder. You can also assign different icons to individual pages.
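A typical way to declare the icon in the page head (the file names are illustrative; browsers also pick up /favicon.ico from the root automatically):

```html
<link rel="icon" href="/favicon.ico" sizes="any">
<link rel="icon" href="/favicon.svg" type="image/svg+xml">
```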
Found. — https://twitter.com/mostlychloe
Description:
Search engines eagerly index tweets. Links in tweets are also indexed, including indirect ones (for example, via goo.gl and similar services), and Twitter is crawled by fast robots. For a link from Twitter to affect website promotion in Google, it must be indexed by search engines.
Twitter helps to promote the site and speeds up its indexing.
Found.
Description:
Telegram has all the features of a social network for promoting a company. It allows you to create channels analogous to groups and communities, share content on behalf of the company, communicate with brand representatives, grow the channel and attract more subscribers.
The service searches the website for links to Telegram channels and groups. It will find them if the links are formatted correctly, for example https://t.me/prcynews. If you don't see your Telegram channel in the results, the link may be broken.
Google PageSpeed Insights
Description:
The service measures the speed of the site and analyzes the stages of the loading process. The stages correspond to metrics such as LCP, CLS and FID, which are part of Google's Core Web Vitals: content rendering, response time to the first user interaction, and layout shift caused by loading elements. By optimizing these stages, you can speed up loading and make your site more user-friendly.
Description:
Google is moving sites to mobile-first indexing, meaning it primarily evaluates the mobile version of the site. Loading speed should meet the standards on any device. The service analyzes the loading stages and compares the indicators with the Core Web Vitals thresholds, so the webmaster can work on each stage and thereby improve the overall speed.
Description:
If the viewport is not specified, mobile devices will try to display the page as on a PC, scaling it down in proportion to the screen size. Specify the viewport tag so your site displays correctly on all devices.
The viewport determines how a web page is displayed on a mobile device. If it is not specified, the page width is assumed to be the PC standard and the page is shrunk to fit the screen. With the viewport you can control the page width and its scaling on different devices.
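The standard viewport declaration for a responsive page:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```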
Try to reduce the size of images to a minimum; this will speed up resource loading. Choosing the correct format and compressing images can reduce their size. Perform basic and advanced optimization on all images. As part of basic optimization, crop unnecessary margins, reduce the color depth to the minimum acceptable value, remove comments and save images in an appropriate format. Basic optimization can be done in any image editor.
Website design for mobile phones solves two problems: it provides users with a comfortable viewing of the site from any device and has a positive effect on the search ranking of the site.
Check that your site displays correctly on mobile devices.
Thanks to caching, users revisiting your website spend less time loading pages. Caching headers should be applied to all cacheable static resources (a configuration sketch follows).
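A sketch of long-lived caching headers for static assets, assuming nginx; the file extensions and lifetime are illustrative:

```nginx
# Let browsers cache static files for 30 days
location ~* \.(css|js|png|jpg|jpeg|gif|svg|woff2)$ {
    expires 30d;
    add_header Cache-Control "public, max-age=2592000";
}
```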
Description:
The resource size can be reduced by removing unnecessary page elements such as extra spaces, line breaks, and indentation. By minifying HTML, CSS, and JavaScript, you can speed up page loading, parsing, and rendering.
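A before/after illustration of CSS minification; the selector and values are made up:

```css
/* Before minification */
.site-header {
  color: #333333;
  margin: 0 auto;
}
/* After minification: spaces, line breaks and redundant notation removed */
.site-header{color:#333;margin:0 auto}
```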