hi gang. I’m having trouble with the network egress settings in Claude Cowork. I add the sites I’d like the AI to access (they are sites I own), but Claude still can’t reach them. Could it be because of where those sites are hosted? (blot.im).
Could you share the URL and the error message? It’s easier to answer when we can see the precise details.
hey there.
Message is " Access to this website is blocked by your network egress settings. You can adjust this in Settings.".
The site is skellis.net – and yes that is added to the capabilities.
It seemed to have trouble picking up the new egress settings even after a restart, but it resolved itself when I started a new task.
So, it seems to be working (currently!).
thanks all the same.
I would contact your hosting company.
Insight here from somebody in the business. Especially in the early days of AI, pretty much none of the bots respected things like robots.txt, and they were positively abusive in terms of traffic. I’m talking about the bots accounting for 90%+ of a hosting company’s bandwidth. And that’s not delivering value to the hosting company’s customers; it’s training the AI models so those same customers can effectively be cut out of the loop.
So a number of hosting companies understandably throttled, restricted, or just outright blocked traffic coming from those bots/IP addresses, because there wasn’t (and, AFAIK, still isn’t) a reliably bot-respected way to say “hey, slow it down, buddy.” (robots.txt has a non-standard Crawl-delay directive, but honoring it is voluntary, and several major crawlers ignore it.)
Earlier this year, Anthropic did something that’s actually useful: they split their declared user agents into two categories. One is for the main Claude crawler; the other is for user-initiated requests.
In theory, this lets hosting companies pass the user-initiated requests through while still blocking the crawler.
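As an illustration, a robots.txt along these lines expresses that split. The user-agent tokens here (ClaudeBot for the crawler, Claude-User for user-initiated fetches) match Anthropic’s published names as I understand them, so check their docs before relying on the exact strings; and remember robots.txt is purely advisory, so a bot has to choose to honor it.

```
# Disallow Anthropic's main crawler site-wide
User-agent: ClaudeBot
Disallow: /

# But let user-initiated fetches through
User-agent: Claude-User
Allow: /
```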
But it wouldn’t shock me if your hosting company just has them blocked outright.
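For what “blocked outright” tends to look like on the hosting side, here’s a rough nginx sketch (the pattern is hypothetical, not anyone’s actual rule). Matching this broadly catches the user-initiated agent too, which is exactly the failure mode you’re seeing.

```
# Blunt rule inside an nginx server block: refuse anything whose
# User-Agent mentions Claude, crawler and user requests alike.
if ($http_user_agent ~* "claude") {
    return 403;
}
```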