this post was submitted on 20 Aug 2024
133 points (98.5% liked)
Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ
you are viewing a single comment's thread
How come those big hosters get away with such infringements? I guess they must be less popular than Megaupload and such
I don't think they are less popular.
But their whole system works differently. There isn't a single file there called Inception.h265.HDR.mkv, for example.
It's all just billions of files named things like g24hg54j2k7j6nb2n1n5b5j, with absolute gibberish as content. So you need the NZB files to actually get anything out of it.
But the NZB files also don't contain any copyright-infringing material in and of themselves.
So copyright holders have to fight two things at once.
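To make that concrete, here's a minimal sketch of what an NZB file looks like and how little it actually contains. All names, message-IDs, and the newsgroup in it are made up for illustration; the point is that an NZB is just an XML index of article message-IDs, not the content itself.

```python
import xml.etree.ElementTree as ET

# A made-up, minimal NZB document. Note what it holds: a gibberish
# subject, a newsgroup name, and per-segment message-IDs. No file data,
# no recognizable title.
NZB_XML = """<?xml version="1.0" encoding="UTF-8"?>
<nzb xmlns="http://www.newzbin.com/DTD/2003/nzb">
  <file poster="anon@example.invalid" date="1724112000"
        subject="g24hg54j2k7j6nb2n1n5b5j [01/50]">
    <groups>
      <group>alt.binaries.example</group>
    </groups>
    <segments>
      <segment bytes="768000" number="1">abc123@example.invalid</segment>
      <segment bytes="768000" number="2">def456@example.invalid</segment>
    </segments>
  </file>
</nzb>"""

NS = {"nzb": "http://www.newzbin.com/DTD/2003/nzb"}
root = ET.fromstring(NZB_XML)
for f in root.findall("nzb:file", NS):
    # A downloader would fetch each message-ID from the newsgroup and
    # reassemble the segments; the NZB itself is only the map.
    ids = [seg.text for seg in f.findall("nzb:segments/nzb:segment", NS)]
    print(f.get("subject"), "->", ids)
```

So a takedown aimed at the NZB removes the map, while the gibberish-named articles stay on the servers, and a takedown aimed at the articles has to name every message-ID.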
No matter what encoding is used to store the data, the hoster is still responsible for it. On Mega, the data is encrypted, yet Mega is still held responsible for removing content reported by copyright holders (the decryption keys are included in the reports).
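A quick sketch of why the keys end up in the reports, using a made-up Mega-style link (the domain, file ID, and key below are all fabricated): the decryption key lives in the URL fragment, which browsers never send to the server, so the host only ever stores ciphertext until a report hands it the key.

```python
from urllib.parse import urlsplit

# Made-up link in the Mega style: everything after '#' is the key.
link = "https://mega.example/file/AbCd1234#FAKEKEY_FAKEKEY_FAKEKEY"

parts = urlsplit(link)
file_id = parts.path.rsplit("/", 1)[-1]  # what the server knows
key = parts.fragment                     # what only link holders know
print(file_id, key)
```

That's why a report that quotes the full shared link (including the fragment) lets the hoster decrypt and match the file.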
They get DMCA'd regularly, and content gets removed on Usenet as well. But the fact that they have to report literally thousands of individual files every time makes it slow and inefficient. People just re-upload the same item many times, so it's always there.
Each copy, and each single file into which a copy is split, needs to be identified and requested for removal. Compared to torrents, it's a long and complex task.
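A back-of-the-envelope sketch of the scale involved (the article size and release size are assumptions, not figures from any real notice): a release is split into many small articles, each with its own message-ID, and a takedown has to name every one.

```python
# Assumed numbers for illustration: ~750 kB per Usenet article is a
# common posting size; a 40 GB release is plausible for a full remux.
ARTICLE_SIZE = 750_000
release_bytes = 40 * 1024**3

# Ceiling division: every partial article still needs its own message-ID.
articles = -(-release_bytes // ARTICLE_SIZE)
print(articles)  # -> 57267 message-IDs to list, for ONE copy
```

Multiply that by every re-upload of the same release and it's clear why enumerating individual files is so much slower than delisting a single torrent.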