Google officially announced that Googlebot will no longer obey the noindex directive in robots.txt. Publishers relying on that directive have until September 1, 2019 to remove it and switch to a supported alternative.

Robots.txt Noindex Unofficial

The noindex robots.txt directive will no longer be supported because it was never an official directive.

Google has unofficially supported this robots.txt directive in the past, but that will no longer be the case. Take due notice thereof and govern yourself accordingly.
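
For context, the unofficial directive typically looked something like this inside a robots.txt file (a sketch; the user-agent and path are placeholders):

  User-agent: *
  Noindex: /example-page/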

Google Mostly Obeyed the Noindex Directive

StoneTemple published an article noting that Google mostly obeyed the robots.txt noindex directive.

Their conclusion at the time was:

“Ultimately, the NoIndex directive in Robots.txt is pretty effective. It worked in 11 out of 12 cases we tested. It might work for your site, and because of how it’s implemented it gives you a path to prevent crawling of a page AND also have it removed from the index.

That’s pretty useful in concept. However, our tests didn’t show 100 percent success, so it does not always work.”

That’s no longer the case. The noindex robots.txt directive is no longer supported.
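
For publishers who still need to keep pages out of Google's index, a supported alternative is the robots meta tag in the page's HTML, or the equivalent X-Robots-Tag HTTP response header, for example:

  <meta name="robots" content="noindex">
  X-Robots-Tag: noindex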