Should robots.txt be 301'd on an HTTPS-only site? - WebmasterWorld

No HTTP is allowed and we provide HSTS headers on all requests. Now we tried to look at backlinks in Majestic, and as you know this requires ...
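For context, sending HSTS on every response usually comes down to a single header. A minimal sketch, assuming Apache with mod_headers enabled (the max-age value is only an illustrative choice):

    Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"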

Can I use robots.txt to only allow https crawls and disallow http?

You can use robots.txt, but it's not as simple as adding a line of 'code' to the file. You would have to arrange for a different robots.txt to be served on http ...
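One common way to arrange that is an internal rewrite keyed on the request scheme, so HTTP and HTTPS requests for /robots.txt get different files. A minimal sketch, assuming Apache with mod_rewrite in the site root .htaccess; the file name robots_http.txt is a hypothetical placeholder:

    RewriteEngine On
    # When the request arrives over plain HTTP, answer /robots.txt with an alternate file
    RewriteCond %{HTTPS} off
    RewriteRule ^robots\.txt$ /robots_http.txt [L]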


Is there a way to disallow crawling of only HTTPS in robots.txt?

There is no way to do it in robots.txt itself as served over HTTP. You could serve a different robots file entirely for secure HTTPS ...
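The file served for plain HTTP would typically block all crawling, so crawlers only work through the HTTPS version of the site. A minimal sketch of what that HTTP-only robots.txt might contain:

    User-agent: *
    Disallow: /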

301 and robots.txt - Apache Web Server forum at WebmasterWorld

Despite the fact that my files have moved temporarily, I still implemented a 301 redirect because 302 redirects seem to be very bad for SE.

Robots.txt in page with 301 redirect | SEO Forum - Moz

We currently have a series of help pages that we would like to disallow via our robots.txt. The thing is that these help pages are ...
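For reference, disallowing a group of help pages is a couple of lines in robots.txt; /help/ below is a hypothetical path prefix standing in for the pages in question:

    User-agent: *
    Disallow: /help/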

How to setup robots.txt when moving from http to https - Quora

You should take care of a few things, like making sure the SSL certificate is properly installed on your website and that the 301 redirection is working fine. You will have ...
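One way to get the 301 redirection working while still allowing robots.txt to be answered on either protocol is to redirect everything except robots.txt. A minimal sketch, assuming Apache with mod_rewrite; adjust to your own server setup:

    RewriteEngine On
    RewriteCond %{HTTPS} off
    # Leave robots.txt alone so it can still be served directly over HTTP if desired
    RewriteCond %{REQUEST_URI} !^/robots\.txt$
    # Permanently redirect everything else to the HTTPS equivalent
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]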

Forum Questions My Q&A Users Ask the Community - Moz

Hi SEOmoz, I have a problem with my Joomla site (yeah - me too!). I get a large number of /index.php/ URLs despite using a program to handle ...

Does Google Respect Robots.txt NoIndex and Should You Use It?

Ultimately, the NoIndex directive in Robots.txt is pretty effective. It worked in 11 out of 12 cases we tested. It might work for your site, and ...
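For reference, the directive tested there looks like a Disallow line with Noindex as the keyword; /example-page/ below is a hypothetical path. Note that this was never part of the official robots.txt specification, and Google has since stopped supporting it:

    User-agent: *
    Noindex: /example-page/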

How Google Interprets the robots.txt Specification

A robots.txt with an IP address as the host name is only valid for crawling of that IP address as the host name. It isn't automatically valid for all websites hosted on that ...

Issue - googlebot being blocked by 301 - Plesk Forum

I'm having a strange issue where Google can see that robots.txt exists but cannot always read it. I've tried all of the obvious things like encoding, ...