Perfecto Client Joined: 06.08.2013 Messages: 94 Likes: 5 Points: 8 03.05.2020 #1 Hi, I need a script to test whether everything is allowed for Google indexing. The most complicated part is the robots.txt file.
VladZen Administrator Forum staff Joined: 05.11.2014 Messages: 22 453 Likes: 5 913 Points: 113 06.05.2020 #2 You can make such a script in ProjectMaker. How do you do it in a regular browser?
Perfecto Client Joined: 06.08.2013 Messages: 94 Likes: 5 Points: 8 06.05.2020 #3 In my browser I do it with this: https://chrome.google.com/webstore/detail/robots-exclusion-checker/lnadekhdikcpjfnlhnbingbkhkfkddkl For the robots.txt it seems a bit complex to me, because there are many different syntaxes for forbidding the indexing of a page.
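Besides robots.txt, one of the per-page checks an extension like that performs is looking for a `noindex` robots meta tag. A minimal sketch of that check (the HTML snippet and the regex are illustrative only, not the extension's actual logic):

```python
import re

# Illustrative page source; in practice this would come from the
# page you scraped in your project.
html = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'

# Match a robots meta tag and capture its content attribute.
# Note: this simple pattern assumes name= appears before content=;
# a real check should also handle the reversed attribute order.
meta = re.search(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']',
    html,
    re.IGNORECASE,
)

# Indexing is blocked if the tag exists and contains a noindex directive.
blocked = bool(meta and "noindex" in meta.group(1).lower())
print(blocked)  # True for this snippet
```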
lokiys Moderator Joined: 01.02.2012 Messages: 4 812 Likes: 1 187 Points: 113 06.05.2020 #4 Perfecto said: In my browser I do it with this: https://chrome.google.com/webstore/detail/robots-exclusion-checker/lnadekhdikcpjfnlhnbingbkhkfkddkl For the robots.txt it seems a bit complex to me, because there are many different syntaxes for forbidding the indexing of a page. Click to expand... You just define your own rules and a regex for what should be checked for each parameter: scrape robots.txt, read what is there, and build a regex to check whether Google is allowed or disallowed.
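Instead of hand-written regex for every robots.txt directive, Python's standard library already implements the matching rules. A sketch of the scrape-and-check idea using `urllib.robotparser` (the robots.txt content below is made up for illustration; in a real project you would fetch `https://yoursite.com/robots.txt` first):

```python
from urllib import robotparser

# Example robots.txt content; normally this string is the body of a
# GET request to the site's /robots.txt.
robots_txt = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /no-google/
Allow: /
"""

# Feed the scraped text to the parser line by line.
rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Ask whether Googlebot may crawl a given path; the parser picks the
# most specific matching user-agent group (here, the Googlebot one).
print(rp.can_fetch("Googlebot", "/page.html"))    # allowed
print(rp.can_fetch("Googlebot", "/no-google/x"))  # disallowed
```

This avoids re-implementing directive matching by hand, though note the stdlib parser follows the Robots Exclusion Protocol rather than every Google-specific extension.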