My site is small and I have very few visitors, yet my page view limit has been exceeded for the second time. It was suggested I switch from "don't throttle" to standard throttling, and that worked for a few months. Now it has gone over again.

I'm not very knowledgeable in this area, so if someone could please advise me on how to write a robots.txt file and where to place it on my boards, I would be extremely grateful.

Thanking you in advance.

gail

Original Post

Replies/Updates (5)

Hi there!
You'll want to go to https://empoweringcaregivers.h...la/cp/billable-usage to edit your robots.txt.   

It is currently set to custom throttling, but with nothing entered in the box. Since it is blank, this is the same as having no robots.txt and no throttling at all. Has it been like this the entire time? If so, you're getting no throttling and would be better off setting it to standard.

If you do still want to keep it on custom, here are some things you might try:

User-agent: *
Crawl-delay: 60

User-agent: *
Disallow: */search?
Disallow: */advancedSearch?

This should slow them down some. You can even increase the 60-second delay a little.
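
Combined into a single custom robots.txt, the two pieces above would look roughly like this (a sketch only; adjust the paths and the delay value to whatever your board actually uses):

User-agent: *
Crawl-delay: 60
Disallow: */search?
Disallow: */advancedSearch?

Keep in mind that Crawl-delay is honored by some crawlers (Bing, for example) but ignored by others, including Google, so it slows down polite bots rather than blocking anything outright.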

Preventing them entirely would be something like

User-agent: *
Disallow: /


Of course, doing that would keep Google and Bing from indexing the site altogether, which might not be what you want.
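
If the extra page views are coming from one particular crawler, a middle ground is to block just that bot and leave the major search engines alone. A rough sketch (ExampleBot is a placeholder; substitute the actual user-agent name you see in your traffic logs):

User-agent: ExampleBot
Disallow: /

User-agent: *
Disallow:

An empty Disallow line means everything is allowed for that group, so Google and Bing keep indexing while the named bot is told to stay out.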

More and better information can be found at
https://en.wikipedia.org/wiki/...s_exclusion_standard
http://www.robotstxt.org/robotstxt.html

Dave Dreezer