A robots.txt file tells web robots how to handle a website's pages. When a page is disallowed in robots.txt, the file instructs compliant crawlers to skip that page entirely. You just need to add the extension to Chrome and then open it while you're on https://asmlseo.com/seo-digital-marketing/.
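
If you want to see how a Disallow rule behaves, Python's standard-library urllib.robotparser can parse robots.txt rules and report whether a given URL may be crawled. This is a minimal sketch; the rules and URLs below are hypothetical examples for illustration, not taken from the site above.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, used purely for illustration.
rules = """
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A compliant crawler checks each URL against the rules before fetching it.
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False: disallowed
print(parser.can_fetch("*", "https://example.com/blog/post.html"))     # True: allowed
```

Running this prints False for the URL under the disallowed /private/ path and True for the other, which is exactly the decision a well-behaved crawler makes before requesting a page.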