Thanks for putting this together.
One thing, though: you might want to send a specific User-Agent string, or a random one from a pool. Not doing so causes every request from your checkers to send
`python-requests/2.xx.x`
to the servers you are querying. While this works, it's poor opsec and will eventually increase the risk of being banned, of endpoints being hardened, or of useful methods being removed, since it is quite noisy.
Of course, users can monkey-patch this globally, e.g. by inserting something like
`requests.utils.default_user_agent = lambda: 'Mozilla Firefox....'`
but IMHO the tool should either provide proper opsec out of the box or clearly state in the README that it broadcasts itself as a crawler.
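A minimal sketch of the pool approach, as an alternative to the global monkey-patch: keep a small list of browser User-Agent strings and pick one at random per request (or per session). The strings and the `random_headers` helper below are illustrative assumptions, not part of the tool.

```python
import random

# Hypothetical pool of browser User-Agent strings; in a real checker
# you would keep these current with actual browser releases.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:126.0) Gecko/20100101 Firefox/126.0",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/125.0.0.0 Safari/537.36",
    "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/125.0.0.0 Safari/537.36",
]

def random_headers() -> dict:
    """Return request headers with a randomly chosen User-Agent."""
    return {"User-Agent": random.choice(USER_AGENTS)}
```

Usage would then be `requests.get(url, headers=random_headers())` per request, or `session.headers.update(random_headers())` once per `requests.Session` so every request it makes reuses the same randomly chosen string.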