this post was submitted on 30 Mar 2025
411 points (98.6% liked)

Technology


BDSM, LGBTQ+, and sugar dating apps have been found exposing users' private images, with some of them even leaking photos shared in private messages.

all 49 comments
[–] [email protected] 118 points 2 days ago (2 children)

Brace yourselves, because this is only going to get worse with the current “vibe coding” trend.

[–] [email protected] 27 points 2 days ago (1 children)
[–] [email protected] 84 points 2 days ago (3 children)

Vibe coding is the current trend of having an LLM build your codebase for you, then shipping it without ever attempting to understand it.

Most developers are using LLMs to some extent to speed up their coding, as Cursor and Claude are really good at removing toil. But vibe coders have the LLM build the entire thing and don't even know how it works.

[–] [email protected] 44 points 2 days ago (1 children)

In other words, vibe coders are today's technologically accelerated script kiddie.

That's arguably worse, since the produced scripts may largely work while demanding even less understanding than a script kiddie's cobbling together of code ever did.

[–] [email protected] 4 points 2 days ago

100% accurate.

[–] [email protected] 9 points 2 days ago (4 children)
[–] [email protected] 24 points 2 days ago

Basically, think ChatGPT

[–] [email protected] 18 points 2 days ago (2 children)

Large language models (LLMs) are the product of neural networks, a relatively recent innovation in the field of computer intelligence.

Since these systems are surprisingly adept at producing natural-sounding language and good at creating answers that sound correct (and sometimes actually are), marketers have seized on this as an innovation, called it AI (a term with a complicated history), and started slapping it onto every product.

[–] [email protected] 9 points 2 days ago

...neural networks, a relatively recent innovation in the field of computer intelligence.

Neural networks have been around for quite some time. Their simplest forms have actually existed since around 1795.

[–] [email protected] 5 points 2 days ago

Ahhhhhh... that's a really simple explanation thanks

[–] [email protected] 4 points 2 days ago* (last edited 2 days ago)

A machine learning model that can generate text.

It works by converting pieces of text into "tokens", which are mapped to numbers in a way that reflects their association with other pieces of text. The model is fed input tokens and predicts the next tokens from them, which are then converted back into text.
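
To make that concrete, here's a toy sketch of the tokens-to-numbers-to-prediction loop described above (a hypothetical hand-made lookup table stands in for the neural network; real models use learned subword vocabularies and condition on far more than the previous token):

```typescript
// Toy next-token predictor: tokens are word strings mapped to numbers.
const vocab = ["<end>", "the", "cat", "sat", "on", "a", "mat"];
const toId = new Map(vocab.map((word, i) => [word, i]));

// next[i][j] = score that token j follows token i (stand-in for learned weights)
const next: number[][] = [
  [0, 0, 0, 0, 0, 0, 0], // <end>
  [0, 0, 5, 0, 0, 0, 0], // the -> cat
  [0, 0, 0, 5, 0, 0, 0], // cat -> sat
  [0, 0, 0, 0, 5, 0, 0], // sat -> on
  [0, 0, 0, 0, 0, 5, 0], // on  -> a
  [0, 0, 0, 0, 0, 0, 5], // a   -> mat
  [5, 0, 0, 0, 0, 0, 0], // mat -> <end>
];

function generate(prompt: string): string {
  // Convert text to token ids, then repeatedly predict the next id.
  const out = prompt.split(" ").map((w) => toId.get(w) ?? 0);
  for (let step = 0; step < 10; step++) {
    const scores = next[out[out.length - 1]];
    const best = scores.indexOf(Math.max(...scores)); // greedy pick
    if (best === 0) break; // model predicted <end>
    out.push(best);
  }
  return out.map((id) => vocab[id]).join(" "); // ids back to text
}

console.log(generate("the cat")); // "the cat sat on a mat"
```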

[–] [email protected] 3 points 2 days ago* (last edited 2 days ago) (1 children)

Large Language Model

To the extent of my understanding, it's a slightly more sophisticated bot, as in an automated response algorithm, trained over a set of data so that it "understands" the mechanics that make such a set cohesive to us humans.

With that background, it is supposed to produce similar new outputs when given new raw data to run through the mechanics it acquired during training.

[–] [email protected] 1 points 2 days ago

Clever, thanks 😊

[–] [email protected] 3 points 2 days ago (1 children)

What is toil in this context?

[–] [email protected] 8 points 2 days ago

Boring/repetitive work. For example, I regularly use an AI coding assistant to block out our basic loop templates with the variables filled in, or have it quickly finish multiple case statements or assign values to an object with a bunch of properties.

In little things like that, it's great. But once you get past a medium-sized function, it goes off the rails. I've had it make up parameters for stock library functions based on what I asked it for.
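
For illustration (hypothetical types, not the commenter's actual code), this is the kind of mechanical field-by-field boilerplate an assistant autocompletes well:

```typescript
// A database row and the API shape it has to be mapped into.
interface UserRow {
  first_name: string;
  last_name: string;
  email_addr: string;
  created_at: string;
}

interface UserDto {
  firstName: string;
  lastName: string;
  email: string;
  createdAt: Date;
}

// The shape is obvious, the typing is tedious: exactly the toil
// an assistant fills in while the human just reviews it.
function toDto(row: UserRow): UserDto {
  return {
    firstName: row.first_name,
    lastName: row.last_name,
    email: row.email_addr,
    createdAt: new Date(row.created_at),
  };
}
```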

[–] [email protected] 3 points 2 days ago (2 children)

So we are moving away from >1GB node_modules finally? Or is it too soon?

[–] [email protected] 7 points 2 days ago (1 children)

It's going to be 1 GB of node_modules handled by garbage AI code.
AI is only good at smaller scripts; it loses the connections and the understanding in larger codebases. Combined with people who can't program well (I mean not only coding but debugging... as well), also called vibe programmers, it's going to be a mess.

If a product claims it has vibe coding: find an alternative!

[–] [email protected] 9 points 2 days ago (1 children)

I'm losing my will to live lately at an alarming rate.

I used to love IT, way back at the start of the '00s.

Soon after the '10s started, I noticed bullshit trends replacing one another... like crypto or clouds or SaaS... but now, with AI, I just feel alienated. Like we're all just going to hell, and I hate the front-row seating.

[–] [email protected] 4 points 2 days ago (1 children)

At this point, I think it's necessary to have a sort of alternate identity online and to keep anything private, like photos of yourself and other personal information, offline. Except for government stuff, which requires your real identity.

[–] [email protected] 2 points 1 day ago

I mean yeah, I self-host everything, but I hate that I have to learn and support the most useless shit ever just to earn a living.

It used to be fun being a dev, now I'm just repeating the same warning phrases about technologies.

[–] [email protected] 2 points 2 days ago

I love feeding my bloated node_modules

[–] [email protected] 168 points 2 days ago* (last edited 2 days ago) (5 children)

Cybernews researchers have found that BDSM People, CHICA, TRANSLOVE, PINK, and BRISH apps had publicly accessible secrets published together with the apps’ code.

All of the affected apps are developed by M.A.D Mobile Apps Developers Limited. Their identical architecture explains why the same type of sensitive data was exposed.

What secrets were leaked?

  • API Key
  • Client ID
  • Google App ID
  • Project ID
  • Reversed Client ID
  • Storage Bucket
  • GAD Application Identifier
  • Database URL

[...] threat actors can easily abuse them to gain access to systems. In this case, the most dangerous of leaked secrets granted access to user photos located in Google Cloud Storage buckets, which had no passwords set up.

In total, nearly 1.5 million user-uploaded images, including profile photos, public posts, profile verification images, photos removed for rule violations, and private photos sent through direct messages, were left publicly accessible to anyone.

So the devs were inexperienced in secure architecture and put a bunch of stuff on the client side that should probably have been on the server side. That leaves their API open for anyone to access every picture they have on their servers. They then made multiple dating apps with this faulty infrastructure by copy-pasting it everywhere.
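
As a sketch of why that's so bad (bucket name hypothetical): once the storage bucket name ships inside the client, anyone can replay the app's own requests, and if the bucket has no access control, one unauthenticated call to the Google Cloud Storage JSON API lists its contents:

```typescript
// Bucket name as it would appear in the shipped app bundle (hypothetical).
const bucket = "example-dating-app.appspot.com";

async function listBucket(): Promise<void> {
  // GCS JSON API object listing; succeeds without any credentials
  // only if the bucket is publicly readable/listable.
  const res = await fetch(
    `https://storage.googleapis.com/storage/v1/b/${bucket}/o?maxResults=10`
  );
  if (!res.ok) {
    console.log(`Locked down (HTTP ${res.status}) - as it should be.`);
    return;
  }
  const { items } = (await res.json()) as { items?: { name: string }[] };
  for (const obj of items ?? []) console.log(obj.name); // every user photo, enumerable
}

listBucket().catch(console.error);
```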

I hope they're registered in a country with strong data privacy laws, so they have to feel the consequences of their mismanagement.

[–] [email protected] 33 points 2 days ago (1 children)

Inexperienced? This is not-giving-a-fuck level.

[–] [email protected] 10 points 2 days ago (1 children)

No, it's lack of experience. When I was a junior dev, I had a hard enough time understanding how things worked, much less understanding how they could be compromised by an attacker.

Junior devs need senior devs to learn that kind of stuff.

[–] [email protected] 2 points 1 day ago

It does help if services that generate or store secrets and keys display a large warning that they should be kept secret every time they're viewed, no matter the experience level of the viewer. But yeah, understanding why and how isn't something that should be assumed for new devs.

[–] [email protected] 20 points 2 days ago

I've met the type who run businesses like that, and they likely do deserve punishment for it. My own experience involved someone running gray-legality betting apps; the owner was a cheapskate who got unpaid interns and outsourced Filipino workers to build his app. Guy didn't even pay 'em sometimes.

Granted, you could also hire inexperienced people if you're a good person with no financial investor, but I've mostly seen that with education apps and other low-profit endeavors. Sex stuff is definitely someone trying to score cash.

[–] [email protected] 25 points 2 days ago (1 children)

Do you reckon this app could have been vibe coded / a product of AI? Or built with massive use of AI in development? I'd have known not to do this as a teenager when I was beginning to tinker with making apps, never mind as an actual business.

[–] [email protected] 69 points 2 days ago (1 children)

I know for a fact that a lot of applications made these mistakes before AI was around, so while AI is a possibility, it is absolutely not necessary.

[–] [email protected] 40 points 2 days ago (2 children)

I had a test engineer demand an admin password be admin/admin in production. I said absolutely not and had one of my team members change it to a 64-character password generated in a password manager. Dumbass immediately logs in and changes it to admin again. We found out when part of the pipeline broke.

So, we generated another new one, and he immediately changed it back to admin again. We were waiting for it the second time and immediately called him out at the next stand-up. He said he needed it to be admin so he wouldn't have to change his scripts. picard_facepalm.jpg

[–] [email protected] 22 points 2 days ago (2 children)

How is he not fired? Incompetence and ignorance are one thing, but when you combine them with what is effectively insubordination... well, you'd better be right. And he is not.

[–] [email protected] 8 points 2 days ago* (last edited 2 days ago) (1 children)

Firmly agree. I don't believe he should have had access to change these passwords in the first place, unless I'm misunderstanding their definition of test engineer. But if OP had the authority and permission to change the password, and that person deliberately changed it back to the insecure one again, management would be involved and there would be some sort of reprimand, because that's past ignorance; that's negligence.

[–] [email protected] 3 points 2 days ago (1 children)

It was an admin account to do regression testing for the admin interface and functions before prod releases.

I had my guys enable/disable the account during the testing pipeline so people can't log in anymore.

[–] [email protected] 1 points 2 days ago (1 children)

Why would you have regression on prod? Or why would you care what the password is on staging environments?

We have our lower environments (where all testing happens) on a VPN completely separated from prod, and testing engineers only ever touch those lower environments. Our Ops team manages all admin prod accounts, and those are completely separate from lower environment accounts.

So I guess I'm struggling to understand the issue here. Surely you could keep a crappy password for pre-prod testing? We even create a copy of prod as needed and change the admin accounts if there's something prod-specific.

[–] [email protected] 2 points 2 days ago (1 children)

The production database gets down-synced to the lower environments on demand, so they can test on actual production datasets. That would require us to manually remake this user account every time a dev down-syncs the database to a lower environment.

The customer is paranoid, as the project is their public-facing website, so they want testing against the actual prod environment.

We don't manage the SSO, as that is controlled by the customer. The only local (application-specific) account is this account for testing.

[–] [email protected] 1 points 2 days ago (1 children)

That would require us to manually remake this user account

That sounds fine? Just add it to the script when down-syncing. Or keep auth details in a separate DB and only sync that as needed (that's what we do).

The customer is paranoid, as the project is their public-facing website, so they want testing against the actual prod environment.

That's the main problem then, not this testing engineer. We do test directly on prod, but it's not our QA engineers doing the testing, but our support staff and product owners (i.e. people who already have prod access). They verify that the new functionality works as expected and do a quick smoke test to make sure critical flows aren't totally busted. This covers the "paranoid customer" issue while also keeping engineers away from prod.

Maybe you're doing something like that now, idk, but I highly recommend that flow.

[–] [email protected] 1 points 2 days ago (1 children)

We resolved it by making him use pipeline vars for his scripts. Like we told him to do in the beginning.

He fought it because he wanted his scripts to be the same for all projects, including hard-coded usernames and passwords. So, it was mostly his fault.
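
A minimal sketch of that fix (variable names are assumptions): the script reads credentials the pipeline injects and fails loudly instead of falling back to a default password:

```typescript
// Pipeline (or a local .env loader) sets these per project/environment.
const adminUser = process.env.TEST_ADMIN_USER;
const adminPass = process.env.TEST_ADMIN_PASS;

if (!adminUser || !adminPass) {
  // Fail loudly rather than silently defaulting to admin/admin.
  throw new Error("TEST_ADMIN_USER / TEST_ADMIN_PASS must be set by the pipeline");
}

// ...the regression script logs in with adminUser/adminPass from here on,
// so the same script runs unchanged across projects.
```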

[–] [email protected] 1 points 2 days ago (1 children)

Ah, makes a ton of sense. We do the same: basically use a .env file for local dev, and Ops overrides the vars with whatever makes sense for the environment.

[–] [email protected] 3 points 2 days ago

Yeah. Since he was a subcontractor, he wanted all his scripts to be the same, no matter who the customer was.

I was like, Jesus Christ, I'm lazy too and want to automate everything, but edit your stupid scripts to use env vars.

[–] [email protected] 5 points 2 days ago

He was a subcontractor, so technically, he's not our employee.

I bubbled it up the chain on our side, and it hasn't happened since.

[–] [email protected] 6 points 2 days ago (1 children)

My main question in this is: why does a test engineer have the credentials to change an admin password in production? Like, I get that he needs to test things, but I doubt he needs access to change profile/account settings.

[–] [email protected] 2 points 2 days ago

He had to do admin functionality regression tests before prod releases to make sure nothing broke.

The system uses SSO for logins for everything else.

He is a subcontractor who was using scripts for all his projects. I told him he really needs to use env vars for creds.

[–] [email protected] 6 points 2 days ago

The illusion of choice

A lot of "normal" dating apps are also owned by the same companies

[–] [email protected] 1 points 2 days ago

Every single one of those "secrets" is publicly available information for every single Firebase project. The real issue is that the developers didn't have proper access-control checks.
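
Right: the web config is handed to every client by design, so the protection has to be a server-side authorization check. A minimal sketch with firebase-admin (the endpoint and the ownership rule are assumptions, not the apps' actual code):

```typescript
import express from "express";
import admin from "firebase-admin";

admin.initializeApp(); // server-side, uses service-account credentials

const app = express();

// Hypothetical photo endpoint: verify the caller before touching storage.
app.get("/photos/:userId/:photoId", async (req, res) => {
  try {
    const idToken = (req.headers.authorization ?? "").replace("Bearer ", "");
    const decoded = await admin.auth().verifyIdToken(idToken); // rejects forged/expired tokens
    if (decoded.uid !== req.params.userId) {
      res.status(403).send("Not your photo"); // the missing access-control check
      return;
    }
    // ...only now hand out a signed, short-lived URL for the object.
    res.send("ok");
  } catch {
    res.status(401).send("Unauthenticated");
  }
});

app.listen(3000);
```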

[–] [email protected] 68 points 2 days ago

This is devastating. People in the LGBT community are often hiding their true selves because of family, colleagues, culture, etc. People will be destroyed.

[–] [email protected] 39 points 2 days ago

I wonder how many conservative politicians they’ll find.

[–] [email protected] 13 points 2 days ago

Anyone who uses Grindr, please be aware that any photos you send are cached and stored unencrypted in plain old folders on the receiver's phone, regardless of whether they were expiring or in an album that you later revoked. It's nearly trivial to grab any photo someone sends you, with no watermark or screenshot notification.

[–] [email protected] 17 points 2 days ago* (last edited 2 days ago)

Use Signal or SimpleX for more private stuff like this 👀

[–] [email protected] -4 points 1 day ago

Just don't send nudes... Why do people think other people won't figure out how to screenshot or just keep photos forever? Even if you trust the person, they could get hacked... the pwned guy got pwned, for Jehovah's sake. Just stop sending that ~~shit~~.