this post was submitted on 17 Jul 2024
237 points (98.8% liked)

Technology

[–] [email protected] 109 points 4 months ago (1 children)

Good, the Impreza WRX STI is a great car. Haters gonna hate.

[–] [email protected] 80 points 4 months ago

One Los Angeles Times investigation found that Calmara couldn’t even discern inanimate objects and failed to identify “textbook images” of STIs.

So this app works by you sending it photos of your junk, and it tells you whether or not you have an STI. And I assume you have to submit photos to get matched with others, so it can "weed out" people. No privacy concerns there, nope.

[–] [email protected] 59 points 4 months ago* (last edited 4 months ago) (4 children)

How do bad ideas like this ever get off the ground?

[–] [email protected] 36 points 4 months ago (1 children)

Because idiots salivate any time something novel promises to confirm their biases.

[–] [email protected] 6 points 4 months ago (2 children)

The bias being what in this case?

[–] [email protected] 5 points 4 months ago

The bias is believing any nonsense about "AI". It's widespread and hegemonic at this point.

[–] [email protected] 10 points 4 months ago

Tech bro: AI 😎
Investors: 😱💵

[–] [email protected] 9 points 4 months ago

Yet another AI scam is never a bad idea if you want to raise funding.

[–] [email protected] 7 points 4 months ago

Stupid money

[–] [email protected] 37 points 4 months ago (1 children)

How was it supposed to work? Was it supposed to scan received dick pics for anything gross? Because people do have eyes they could use...

[–] [email protected] 18 points 4 months ago

It scans the photo for a soul patch

[–] [email protected] 34 points 4 months ago

How will we ever figure out who has an STI without predictive A.I.? If only there were tests.

[–] [email protected] 11 points 4 months ago

I read "daters" as "dealers" and I ran the whole gamut of emotions in about a half second.

[–] [email protected] 8 points 4 months ago

I think this is a valuable app... Not the app itself, but an API that other dating apps could link to, letting you filter out anyone with poor enough judgement to have sent photos of their crotch to this company.

[–] [email protected] 2 points 4 months ago* (last edited 4 months ago)

I have to admit it was a solid idea, though. Dick pics should be one of the best training sets you can find on the internet, and you can assume that the most prolific senders are the ones with the lowest chance of having an STI (or any real-life sexual activity).

[–] [email protected] 2 points 4 months ago

This is the best summary I could come up with:


HeHealth’s AI-powered Calmara app claimed, “Our innovative AI technology offers rapid, confidential, and scientifically validated sexual health screening, giving you peace of mind before diving into intimate encounters,” but now it’s shut down after an inquiry by the Federal Trade Commission (FTC).

The letter lays out some of the agency’s concerns with the information HeHealth relied on for its claims, including the claim that it could detect more than 10 sexually transmitted infections with up to 94 percent accuracy.

Given that most STIs are asymptomatic, according to the World Health Organization, medical professionals have questioned the reliability of the app’s tactics.

One Los Angeles Times investigation found that Calmara couldn’t even discern inanimate objects and failed to identify “textbook images” of STIs.

The FTC issued a civil investigative demand (similar to a subpoena) seeking information about Calmara’s advertising claims and privacy practices and put HeHealth on notice that it’s illegal to make health benefit claims without “reliable scientific evidence.”

The FTC said it would not pursue the investigation further since HeHealth agreed to those terms and because of “the small number of Calmara users and sales in the U.S.” But, it warned, “The Commission reserves the right to take such further action as the public interest may require.”


The original article contains 523 words, the summary contains 207 words. Saved 60%. I'm a bot and I'm open source!