I check whether a user agent contains gptbot, and if it does I 302 it to web.sp.am.
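For anyone curious, a minimal sketch of that kind of check (here in Flask, purely illustrative; the app and route names are placeholders, not the poster's actual setup) could look like:

```python
# Sketch only: 302 any request whose User-Agent mentions "gptbot" off to
# web.sp.am, as described above.
from flask import Flask, request, redirect

app = Flask(__name__)

@app.before_request
def bounce_gptbot():
    ua = request.headers.get("User-Agent", "").lower()
    if "gptbot" in ua:
        # 302 = temporary redirect, so nothing gets cached as permanent
        return redirect("https://web.sp.am/", code=302)

@app.route("/")
def index():
    return "normal content for everyone else"
```

In practice you'd probably do the same match in the web server or CDN config before the request ever reaches the app.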
When I was a kid I thought computers would be useful.
They are. It's important to remember that in a capitalist society what is useful and efficient is not the same as what is profitable.
I'm pretty sure no one knows my blog and wiki exist, but they sure are popular, getting multiple hits per second, 24/7, on a tangle of wiki articles I autogenerated to tell me trivia like whether the Great Fire of London started on a Sunday or a Thursday.
Funny that they’re calling them AI haters when they’re specifically poisoning AI that ignores the do not enter sign. FAFO.
First Albatross, First Out
Fluffy Animal's Fecal Orifice.
AI is the "most aggressive" example of "technologies that are not done 'for us' but 'to us.'"
Well said.
Deploying Nepenthes and also Anubis (both described as "the nuclear option") is not hate. It's self-defense against pure selfish evil; projects are being sucked dry, and some, like ScummVM, could only freakin' survive thanks to these tools.
Those AI companies and data scraper/broker companies shall perish, and whoever wrote this headline at Ars Technica shall step on Lego each morning for the next 6 months.
Feels good to be on an instance with Anubis
One of the United Nations websites deployed Anubis.
Do you have a link to a story of what happened to ScummVM? I love that project and I’d be really upset if it was lost!
It's so sad we're burning coal and oil to generate heat and electricity for dumb shit like this.
Wait till you realize this project's purpose IS to force AI to waste even more resources.
I mean, the long-term goal would be to discourage AI companies from engaging in this behavior by making it useless.
This might explain why newer AI models are going nuts. Good jorb 👍
It absolutely doesn’t. The only model that has “gone nuts” is Grok, and that’s because of malicious code pushed specifically for the purpose of spreading propaganda.
The Ars Technica article: AI haters build tarpits to trap and trick AI scrapers that ignore robots.txt
AI tarpit 1: Nepenthes
AI tarpit 2: Iocaine
Nice..... I look forward to the next generation of AI counter-countermeasures that will make the internet an even more unbearable mess in order to funnel as much money and control as possible to a small set of idiots who think they can become masters of the universe and own every single penny on the planet.
All the while, we roast to death because all of this will take more resources than the entire energy output of a medium-sized country.
I suppose this will become an arms race, just like with ad-blockers and ad-blocker detection/circumvention measures.
There will be solutions for scraper-blockers/traps. Then those become more sophisticated. Then the scrapers become better again and so on.
I don't really see an end to this madness. Such a huge waste of resources.
There is an end: you legislate it out of existence. Unfortunately, US politicians are instead trying to outlaw any regulation of AI. I'm sure it's not about the money.
Could you imagine a world where word of mouth became the norm again? Your friends would tell you about websites, and those sites would never show up in search results because crawlers get stuck.
No they wouldn't. I'm guessing you're not old enough to remember a time before search engines. The public web dies without crawling. Corporations will own it all; you'll never hear about anything other than Amazon or Walmart dot com again.
Nope. That isn't how it worked. You joined message boards that had lists of web links. There were still search engines, but they were pretty localized. Google was also amazing when their slogan was "don't be evil" and they meant it.
I was there. People carried physical notepads with URLs and shared them on BBSes or other forums. It was wild.
This is surely trivial to detect. If the number of pages on a site is greater than some insanely high number, just drop all data from that site from the training data.
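A bare-bones version of that heuristic (the threshold and the data layout below are made-up assumptions, just to show the idea) might be:

```python
# Hypothetical pre-training filter: drop every domain whose crawled page count
# exceeds some "insanely high" threshold, on the theory that it's a tarpit.
PAGE_LIMIT = 5_000_000  # arbitrary assumption, pick to taste

def drop_suspected_tarpits(pages_by_domain):
    """pages_by_domain: dict mapping domain -> list of crawled pages."""
    return {
        domain: pages
        for domain, pages in pages_by_domain.items()
        if len(pages) <= PAGE_LIMIT
    }
```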
It's not like I can afford to compete with OpenAI on bandwidth, and they're burning through money with no cares already.
Yeah sure, but when do you stop gathering regularly constructed data, when your goal is to grab as much as possible?
Markov chains are an amazingly simple way to generate data like this, and with a little bit of stacked logic it's going to be indistinguishable from real large data sets.
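In case anyone wants to see how little code that takes, here's a rough word-level sketch (the order and output length are arbitrary assumptions):

```python
# Rough sketch of a word-level Markov chain: learn transitions from some seed
# text, then emit arbitrary amounts of plausible-looking filler.
import random
from collections import defaultdict

def build_chain(text, order=2):
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=200):
    state = random.choice(list(chain.keys()))
    out = list(state)
    while len(out) < length:
        choices = chain.get(state)
        if not choices:                      # dead end: reseed from a random state
            state = random.choice(list(chain.keys()))
            out.extend(state)
            continue
        out.append(random.choice(choices))
        state = tuple(out[-len(state):])
    return " ".join(out)
```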
Imagine the staff meeting:
You: we didn't gather any data because it was poisoned
Corposhill: we collected 120TB only from harry-potter-fantasy-club.il !!
Boss: hmm who am I going to keep...
Some details. One of the major players doing the tarpit strategy is Cloudflare. They're a giant in networking and infrastructure, and they use AI (more traditional, not LLMs) ubiquitously to detect bots. So it is an arms race, but one where both sides have massive incentives.
Making nonsense is indeed detectable, but that misunderstands the purpose: economics. Scraping bots are used because they're a cheap way to get training data. If you make a non-zero portion of training data poisonous, you'd have to spend increasingly many resources to filter it out. The better the nonsense, the harder it is to detect. Cloudflare is known to use small LLMs to generate the nonsense, hence requiring systems at least that complex to differentiate it.
So, in short, the tarpit with garbage data actually decreases the average value of scraped data for bots that ignore do-not-scrape instructions.
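A back-of-the-envelope way to see it (every number below is a made-up assumption, only meant to show the shape of the argument):

```python
# Illustrative only: if a fraction of scraped pages is generated garbage, the
# scraper either trains on poison or pays extra per page to classify it out.
pages       = 10_000_000  # pages scraped (assumption)
poison_rate = 0.05        # fraction that is tarpit garbage (assumption)
fetch_cost  = 0.0001      # $ per page just to fetch it (assumption)
filter_cost = 0.001       # $ per page for a classifier good enough to catch it (assumption)

clean_pages = pages * (1 - poison_rate)
print(f"$/clean page, no filtering:   {pages * fetch_cost / clean_pages:.6f} (and the poison still gets in)")
print(f"$/clean page, with filtering: {pages * (fetch_cost + filter_cost) / clean_pages:.6f}")
```

Either way, the cost per usable page goes up as soon as the poison fraction is non-zero.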
Such a stupid title, great software!
Wait… I just had an idea.
Make a tarpit out of subtly-reprocessed copies of classified material from Wikileaks. (And don’t host it in the US.)
Btw, how about limiting clicks per second/minute to counter distributed scraping? A user who clicks more than 3 links per second is not a person, and neither is one who clicks 50 in a minute. And if they get blocked and switch to the next address, each one is still limited in the bandwidth it can occupy.
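A small sketch of that kind of limit (per-client, in-memory, thresholds copied from the comment above; everything else is an assumption) could be:

```python
# Sliding-window rate limiter: at most 3 requests per second and 50 per minute
# per client key (e.g. IP). Illustrative only; real deployments would use the
# web server, a reverse proxy, or a shared store instead of process memory.
import time
from collections import defaultdict, deque

LIMITS = [(1.0, 3), (60.0, 50)]   # (window in seconds, max requests)
history = defaultdict(deque)      # client key -> timestamps of recent requests

def allow(client, now=None):
    """Return False if this request pushes the client over either limit."""
    now = time.monotonic() if now is None else now
    q = history[client]
    q.append(now)
    longest = max(window for window, _ in LIMITS)
    while q and now - q[0] > longest:   # forget anything older than the longest window
        q.popleft()
    for window, limit in LIMITS:
        if sum(1 for t in q if now - t <= window) > limit:
            return False
    return True
```

The catch with distributed scraping is that each bot stays under the per-address limit, so the cap mostly bounds how much any single address can pull, as the comment above says.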