How to optimize your WordPress robots.txt file
The robots.txt file is a very powerful tool when you’re working on a website’s SEO – but it should be handled with care.
It allows you to deny search engines access to different files and folders, but often that’s not what you want to do these days.
Over the years, Google especially has changed a lot in how it crawls the web, so what used to be best practice a few years ago often doesn't work anymore.
This post outlines current best practice for your WordPress robots.txt file and explains why you should adopt it.
Google now fully renders your site
It fetches your CSS and JavaScript files along with your HTML and renders your pages completely, much like a browser does.
The previous best practice of blocking access to your wp-includes directory and your plugins directory via robots.txt is no longer valid, which is why, as of WordPress 4.0, the default robots.txt stopped blocking wp-includes.
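For illustration, this is the kind of outdated rule set in question – if your robots.txt still contains lines like these (the paths assume a default WordPress install), removing them lets Google fetch the files it needs to render your pages:

```
# Outdated – these rules block files Google needs for rendering.
# Remove them from your robots.txt.
User-agent: *
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
```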
Robots.txt denies links their value
There’s something else that’s very important to remember: if you use your site’s
robots.txt to block a URL, search engines won’t crawl it.
This also means that they can’t distribute the link value pointing at blocked URLs.
So if there's an area of your site that has a lot of links pointing at it, but you'd rather not have it appear in search results, don't block it via
robots.txt; use a robots meta tag with a value of noindex, follow instead.
This allows search engines to properly distribute the link value for those pages across your site.
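In HTML, such a robots meta tag goes in the page's head and looks like this:

```html
<!-- Keep this page out of search results, but let search
     engines follow its links so link value still flows -->
<meta name="robots" content="noindex, follow">
```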
Our WordPress robots.txt example
So, what should be in your WordPress robots.txt? Ours is very clean now – we block almost nothing! We don't block our wp-includes directory, we don't block our plugins directory, and we don't even block our wp-admin folder.
The reason is simple: if you block wp-admin in robots.txt but link to it somewhere by chance, people will still be able to find your site with a simple
[inurl:wp-admin] query in Google – just the kind of query malicious hackers love to run. Instead, WordPress sends an X-Robots-Tag HTTP header on the admin pages that prevents search engines from displaying them in search results, which is a much cleaner solution.
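For reference, an X-Robots-Tag response header of that kind looks roughly like this (the exact value WordPress sends may vary by version):

```
HTTP/1.1 200 OK
X-Robots-Tag: noindex
```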
What you should do with your robots.txt file
Log into Google Search Console and under Crawl → Fetch as Google, use the Fetch and Render option:
If the result doesn't look the same as your site does in a browser, or if it throws errors or notices, fix them by removing the lines in your
robots.txt file that block access to the URLs identified in those notices.
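If you want to check which URLs your rules actually block before editing, you can test robots.txt rules locally. This is a small sketch using Python's standard urllib.robotparser; the rule set, example.com, and the plugin path are placeholders – substitute your own file's contents:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules -- substitute the contents of your own robots.txt.
rules = """\
User-agent: *
Disallow: /wp-content/plugins/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A plugin stylesheet blocked like this can break Google's rendering:
print(parser.can_fetch("*", "https://example.com/wp-content/plugins/foo/style.css"))  # False

# Ordinary pages remain crawlable:
print(parser.can_fetch("*", "https://example.com/about/"))  # True
```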
Should you link to your XML sitemap from your robots.txt?
We've always felt it pointless to link to your XML sitemap from your robots.txt file, because you should add your sitemap manually to your Google Search Console and Bing Webmaster Tools accounts and look at their feedback about it. This is why the Yoast SEO plugin doesn't add it to your robots.txt.
Don't rely on search engines finding out about your XML sitemap through your robots.txt.