Good morning lads and lasses!
We are a development house building websites in WP. 9 times out of 10 the customer already has an older website, which forces us to create a lot of temporary URLs.
Currently we have a single cPanel account (I have WHM access) where we store our dev sites. The URLs always follow this template: customer1.ourdomain.dev.
We're running the LiteSpeed web server and use Softaculous to install WP.
Now personally I'm (1 of 2 sysadmins) having a lot of problems with our devs being lazy or unable to follow simple instructions: they never add a robots.txt file to the subdomain's root. As a result our dev pages get indexed, and most of the time our SEO beats the customer's "old" page. I guess you can see where the issue becomes critical.
We work with people all around the world, some of whom have dynamic IPs, so unless I want to whitelist IPs all day we can't use an IP whitelist to block everyone else.
Is there any way to block search engine crawling on absolutely all subdomains of our root domain? I am open to plugins etc. We also store files for email signatures etc. on the dev account, and therefore can't password-protect everything either. The ideal solution would only block crawlers and keep the pages accessible to absolutely everyone, but I'm open to solutions that change that.
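The closest idea I've come up with so far is setting an X-Robots-Tag header for the whole account via an .htaccess high up in the account's home/public_html. This is only a sketch: it assumes our subdomain docroots actually sit below that directory (which I'd have to verify) and that LiteSpeed honours mod_headers syntax in .htaccess the way Apache does:

```
# Sketch: ask crawlers not to index anything served from this account,
# while leaving the pages themselves publicly reachable.
# Assumes all subdomain docroots live below the directory holding this .htaccess.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

Even if that works, it would only take effect as pages get recrawled, and I'm not sure it's the cleanest way to do it.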
Yes, I know I could just tell my devs to always add a robots.txt, but they simply don't listen and I don't have enough power to "punish" them for their mistakes.
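I could also script the robots.txt part myself from cron so it doesn't depend on the devs at all. A rough sketch of what I have in mind is below; the /home/devaccount/public_html path is a placeholder, not our real layout, and it naively treats every subdirectory as a subdomain docroot:

```python
#!/usr/bin/env python3
"""Drop a disallow-all robots.txt into every subdomain docroot that lacks one.

Sketch only: /home/devaccount/public_html is a placeholder for wherever the
dev account's subdomain docroots actually live.
"""
from pathlib import Path

ROBOTS = "User-agent: *\nDisallow: /\n"
BASE = Path("/home/devaccount/public_html")

for docroot in BASE.iterdir():
    # Naive assumption: each subdirectory here is a subdomain's document root.
    if not docroot.is_dir():
        continue
    robots = docroot / "robots.txt"
    if not robots.exists():
        robots.write_text(ROBOTS)
        print(f"wrote {robots}")
```

But that still feels like a band-aid compared to blocking crawlers at the server level.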
Appreciate all feedback.
Best regards, disgruntled sysadmin.