The tool will inform you if the X-Robots-Tag HTTP header of the page blocks it from being displayed in search results.
X-Robots-Tag gives instructions to search bots and indicates which pages of a site should or should not be indexed. Because it is an HTTP response header sent by the server, search engines prioritize it over the robots meta tag, which is placed in the <head> of the page.
The search engine doesn’t have to spend resources downloading the full document to access the content of the X-Robots-Tag header, which also helps save the crawl budget. For these reasons, webmasters have increasingly started to use X-Robots-Tag.
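To illustrate, the header can be read with an HTTP HEAD request, so only the response headers come back and the page body is never downloaded. Below is a minimal sketch in Python using only the standard library; the helper names are my own, not part of any tool mentioned here:

```python
from urllib.request import Request, urlopen

def fetch_x_robots_tag(url):
    """Issue a HEAD request and return the X-Robots-Tag header, if any."""
    req = Request(url, method="HEAD")  # HEAD: headers only, no body
    with urlopen(req) as resp:
        return resp.headers.get("X-Robots-Tag", "")

def is_indexing_blocked(header_value):
    """Return True if an X-Robots-Tag value forbids indexing."""
    directives = {d.strip().lower() for d in header_value.split(",")}
    # "noindex" and "none" (shorthand for noindex, nofollow) both block indexing
    return "noindex" in directives or "none" in directives

print(is_indexing_blocked("noindex, nofollow"))  # True
print(is_indexing_blocked("all"))                # False
```

In practice you would pass the value returned by fetch_x_robots_tag() into is_indexing_blocked(); the parsing function is kept separate so it can be tested without network access.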
Static files stored on the server, such as images and documents, are blocked via .htaccess, while dynamic pages generated by a CMS can be blocked only through PHP.
Here is an example of how to block PDF files in .htaccess with the help of X-Robots-Tag:

<FilesMatch "\.pdf$">
Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>

For dynamic pages, the same header can be sent from PHP before any output:

header("X-Robots-Tag: noindex, nofollow");