UK to “encourage” Apple and Google to put nudity-blocking systems on phones


The push for device-level blocking comes after the UK implemented the Online Safety Act, a law requiring porn platforms and social media firms to verify users’ ages before letting them view adult content. The law can’t fully prevent minors from viewing porn, as many people use VPN services to get around the UK age checks. Government officials may view device-level detection of nudity as a solution to that problem, but such systems would raise concerns about user rights and the accuracy of the nudity detection.

Age-verification battles in multiple countries

Apple and Google both provide optional tools that let parents control what content their children can access. The companies could object to mandates on privacy grounds, as they have in other jurisdictions.

When Texas enacted an age-verification law for app stores, Apple and Google said they would comply but warned of risks to user privacy. A lobby group that represents Apple, Google, and other tech firms then sued Texas in an attempt to prevent the law from taking effect, saying it “imposes a broad censorship regime on the entire universe of mobile apps.”

There’s another age-verification battle in Australia, where the government decided to ban social media for users under 16. Companies said they would comply, although Reddit sued Australia on Friday in a bid to overturn the law.

Apple this year also fought a UK demand that it create a backdoor for government security officials to access encrypted data. The Trump administration claimed it convinced the UK to drop its demand, but the UK is reportedly still seeking an Apple backdoor.

In another case, the image-sharing website Imgur blocked access for UK users starting in September while under investigation over its age-verification practices.

Apple faced a backlash in 2021 over potential privacy violations when it announced a plan to have iPhones scan photos for child sexual abuse material (CSAM). Apple ultimately dropped the plan.
