Google’s John Mueller said on Twitter that when you want to “unindex,” i.e. remove indexed pages from Google Search, you should not block Google with robots.txt, but rather use noindex. He said you should only unindex pages if you don’t care whether they rank; if you do want them to rank, improve the pages instead.
Here are those tweets:
Why would you need to unindex them? If they’re ranking for queries you care about, you should improve your other pages. If they’re not ranking, then ignore them. (Also, to unindex, don’t block with robots.txt, use noindex instead.)
— John (@JohnMu) March 8, 2021
So we see a few things from this tweet:
(1) Only unindex pages when they are not ranking for queries you care about.
(2) If you care about those queries, then improve the pages and do not unindex them.
(3) If you do decide to unindex them, then use noindex instead of robots.txt.
Yes, noindex is the right tool here, not robots.txt. In fact, if you block a page with robots.txt, Google cannot crawl it and therefore never sees the noindex directive, so the page can stay indexed.
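For readers who want the concrete mechanics, here is a minimal sketch of the two standard ways to apply noindex, assuming a generic HTML page and a generic web server (neither is specific to Mueller’s tweet):

```html
<!-- Option 1: a robots meta tag in the page's <head>.
     This tells crawlers not to index the page once they crawl it. -->
<meta name="robots" content="noindex">
```

For non-HTML resources (PDFs, images), the equivalent is the `X-Robots-Tag: noindex` HTTP response header sent by the server. Either way, the page must remain crawlable (not blocked in robots.txt) for Google to see the directive.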
Also, I don’t know if “unindex” is a word but hey.
Forum discussion at Twitter.