Request: Proxy Tags to stop bans

1. The problem

Google scraping and individual websites have different proxy needs. The Google scraper needs a large quantity of proxies (100+) to work reliably because Google bans an IP address very quickly. Fortunately, datacenter proxies are fine for Google, but you need a lot of them so you can rotate to a fresh one for each request. Luckily, they are cheap and easy to find in bulk.

Individual websites are the opposite: you only need a few ISP/residential proxies. Datacenter proxies no longer work on many individual websites because of Cloudflare bot protection, which is easy to add via a WordPress plugin. So lately, many calls to websites coming from datacenter proxies are "blocked", and the Google scraper cannot pick up the logo/email/socials etc. Unfortunately, ISP/residential proxies are expensive and hard to come by, but you don't need many, since each request goes to a different website. You only need a few to stay under Cloudflare's flood detection.

2. Simple solution

Create proxy pools (or proxy tags, in the format IP:PORT:USER:PASS:TAG, with the tag optional) and allow pools/tags to be assigned to "SERPs" vs. "Individual websites".
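For illustration, here is a minimal Python sketch of how tagged proxy lines in that format could be parsed into pools. The tag names ("serp", "website", "default") and the helper itself are just assumptions for the example, not anything the app currently does:

```python
from collections import defaultdict

def parse_proxy_list(lines):
    """Parse IP:PORT:USER:PASS:TAG lines into pools keyed by tag.

    The trailing TAG is optional; untagged proxies go into a "default" pool.
    """
    pools = defaultdict(list)
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        parts = line.split(":")
        if len(parts) == 5:
            ip, port, user, password, tag = parts
        elif len(parts) == 4:
            ip, port, user, password = parts
            tag = "default"
        else:
            continue  # skip malformed entries
        pools[tag].append({"ip": ip, "port": int(port), "user": user, "pass": password})
    return pools

# Example: many cheap datacenter proxies tagged "serp",
# a few residential proxies tagged "website".
pools = parse_proxy_list([
    "203.0.113.10:8080:user:secret:serp",
    "203.0.113.11:8080:user:secret:serp",
    "198.51.100.7:9000:user:secret:website",
])
```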

For example, the GSA proxy module (built into just about every GSA product) allows tagging so proxies can be assigned to different targets, for this very purpose (this is where I got the idea from).
But the point is: it works, and works well.

If you were to add a similar feature, we could easily (and cheaply) work around most bans. If you want to keep the implementation simple, just add an "Allowed Proxy Tags" setting for each target type: Google, Bing, and Websites.
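A hypothetical sketch of what that "Allowed Proxy Tags" setting could look like in code; the setting name, tag values, and pick_proxy helper are illustrative assumptions only:

```python
import random

# Hypothetical per-target setting: which proxy tags each target type may use.
ALLOWED_PROXY_TAGS = {
    "google":   ["serp"],
    "bing":     ["serp"],
    "websites": ["website"],
}

def pick_proxy(pools, target_type):
    """Pick a random proxy whose tag is allowed for the given target type."""
    allowed = ALLOWED_PROXY_TAGS.get(target_type, ["default"])
    candidates = [proxy for tag in allowed for proxy in pools.get(tag, [])]
    if not candidates:
        raise RuntimeError(f"No proxies tagged for target type '{target_type}'")
    return random.choice(candidates)

# Example pools: many datacenter proxies for SERPs, a few residential for websites.
pools = {
    "serp":    [{"ip": "203.0.113.10", "port": 8080}, {"ip": "203.0.113.11", "port": 8080}],
    "website": [{"ip": "198.51.100.7", "port": 9000}],
}
print(pick_proxy(pools, "google"))    # drawn from the "serp" pool
print(pick_proxy(pools, "websites"))  # drawn from the "website" pool
```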

3. Screenshot examples

So 2 different proxy pools.

One for search engines and one for everything else, correct?

Could you also attach a log of that screenshot (or similar) so I can target the correct calls and make sure the proxies go the right way?

Yes, 2 would be just fine at the moment. I figured tagging would be easier to program and would keep it flexible if different needs arise later.

I don't know what log you need, but the point is that when you contact the SERPs vs. individual websites, the proxy needs are different. For example, in the Google Maps scraper you go to Google Maps, grab 100 businesses, then go to each of their websites to grab socials, logo, email, meta, etc. That's what I'm talking about: a pool or tag for each of those two stages (not a pool for each website). Just 2 pools: one for the SERPs, and one pool for ALL regular websites (which are often Cloudflare-protected via a WordPress plugin).
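To make that two-stage flow concrete, here is a rough sketch assuming a plain requests-based fetcher; the pool contents, URLs, and rotation scheme are placeholders, not the scraper's actual internals:

```python
import itertools
import requests

# Large, cheap datacenter pool rotated on every SERP request;
# small residential/ISP pool reused across individual websites.
SERP_POOL = [f"http://user:secret@203.0.113.{i}:8080" for i in range(10, 110)]
WEBSITE_POOL = [
    "http://user:secret@198.51.100.7:9000",
    "http://user:secret@198.51.100.8:9000",
]

serp_cycle = itertools.cycle(SERP_POOL)
site_cycle = itertools.cycle(WEBSITE_POOL)

def fetch(url, proxy):
    """Fetch a URL through the given proxy."""
    return requests.get(url, proxies={"http": proxy, "https": proxy}, timeout=30)

# Stage 1: hit Google Maps through a fresh datacenter proxy each time.
maps_html = fetch("https://www.google.com/maps/search/plumbers+in+denver",
                  next(serp_cycle)).text

# Stage 2: visit each business website through the small residential pool
# to grab logo/email/socials without tripping Cloudflare.
business_sites = ["https://example.com", "https://example.org"]  # parsed from stage 1
for site in business_sites:
    page = fetch(site, next(site_cycle)).text
```

The key point is the asymmetry: the SERP pool is large and rotated aggressively, while the website pool stays small because each request lands on a different domain.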

Here is the implementation of Proxy Tags for GSA Proxy

As you can see, it gives you options to tag which proxies should be used for which targets.

Hopefully this gives some ideas/inspiration.