Friends don’t let friends use an AI STI test


Picture the scene: Your date has gone well, and you and your partner might sleep together. Like any responsible adult, you expect a conversation about STI status and the use of protection. Now imagine how you would feel if they instead asked to take a photo of your penis and upload it to a website you’ve never heard of. That’s the future of intimacy, as imagined by Calmara, a new service launched by “men’s health” startup HeHealth.

[Image: the HeHealth website]

Its press release suggests users take a picture of their partner’s penis so it can be run through a deep learning model that checks for visual signs of sexually transmitted infections. And while the website suggests users should still wear protection, a banner atop the HeHealth site describes the app as “Your intimate bestie for unprotected sex.” Mixed messages aside, you may notice some major issues with the pitch: it only covers infections that present visually, and it’s only designed to work with penises.

But even if that use case applies to you, you might not trust its conclusions once you’ve looked at the data. The Calmara website claims its scans are up to 90 percent accurate, saying its AI has been “battle-tested by over 40,000 users.” That figure doesn’t match its press release, which says accuracy reaches 94.4 percent (a figure cited in this NSFW preprint paper submitted a week ago), while its FAQ says accuracy ranges “from 65 percent to 96 percent across various conditions.” We’ve reached out to the company to learn more about the apparent discrepancy.

[Image: the Calmara website showing its accuracy claim. Credit: Calmara]

It’s not impossible for models to categorize visual information — I reported on how systems like these look at images of cells to aid drug discovery. But there are plenty of reasons why visual information isn’t going to be as reliable for an STI test. After all, plenty of conditions don’t have visual symptoms, and carriers can often be asymptomatic long after infection. The company admits as much in its FAQ, saying the app is a “first line of defense, not a full-on fortress.” Not to mention that other factors, like the “lighting, the particular health quirks you’re scouting for and a rainbow of skin tones might tweak those [accuracy] numbers a bit.” Even more alarming, the unpublished paper (which is riddled with typos) admits that a full 40 percent of its training dataset comprises “augmented” images, for instance “extracting specific visually recognizable disease patterns from the existing clinical image dataset and layering those patterns on top of images of health (sic) penises.”

[Image: the Calmara FAQ highlighting the variability of its tests. Credit: Calmara]

The Calmara website’s disclaimer says that its tools are for the purpose of “promoting and supporting general wellness and a healthy lifestyle and are not to be used to diagnose, cure, treat, manage or prevent any disease or condition.” Of course, if it really were intended as a general wellness tool, it probably wouldn’t describe itself as “Your intimate bestie for unprotected sex,” would it?

It doesn’t help that this is a system asking users to send pictures of their own, or their partner’s, genitalia. Issues around consent and, as writer Ella Dawson raised on Bluesky, age verification don’t seem to have been considered. The company’s promise that the data is locked in a “digital stronghold” lacks specifics about its security approach or how the data it obtains may be shared. But that hasn’t stopped the company from suggesting that it could, in the future, be integrated “directly into dating apps.”

Fundamentally, there are so many red flags, potential vectors for abuse and ways this could give users a false sense of confidence that nobody should use it.


