Google is emailing webmasters via Search Console, asking them to remove noindex statements from their robots.txt files.

The email reads:

“Google has identified that your site's robots.txt file contains the unsupported rule ‘noindex’. This rule was never officially supported by Google and will stop working on September 1, 2019. Visit our Help Center to learn how to block pages from the Google index.”

These emails started going out a few weeks after Google officially announced that it was ending support for the noindex rule in robots.txt.

Contact A One Sol today if you are looking for video marketing services.

For now, Googlebot continues to obey the noindex directive in robots.txt, but only until September 1. After that date, website owners will need to switch to alternatives.

Technically, as the email points out, Google never officially supported the noindex directive in robots.txt. The lack of a formal standard for robots.txt rules is a related problem that Google is now working to resolve.
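
For context, the unsupported rule these emails refer to typically looked something like this inside a robots.txt file (the path shown is a made-up example, not one Google cites):

    User-agent: Googlebot
    Noindex: /example-private-page/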

Read also: Local SEO: What You Should Know About Enterprise Websites

Until a standard set of robots.txt rules exists, it is better not to rely on unofficial directives.

These are the other options for keeping pages out of the index (a short sketch of the most common ones follows the list):

  • A noindex robots meta tag placed directly in the page's HTML
  • HTTP 404 and 410 status codes
  • Password protection
  • Disallow rules in the robots.txt file
  • The Search Console URL Removal Tool
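
As a rough sketch of the first and fourth options, here is what they typically look like; the page and the /private/ directory are purely illustrative:

    <!-- Option 1: a robots meta tag placed in the page's <head> -->
    <meta name="robots" content="noindex">

    # Option 4: a Disallow rule in robots.txt (stops Google from crawling the path)
    User-agent: *
    Disallow: /private/

Keep in mind that a Disallow rule blocks crawling rather than indexing itself, so the meta tag is the more direct replacement for noindex.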

 
