Can ChatGPT-4o Be Trusted With Your Private Data?

OpenAI says this data is used to train the AI model and improve its responses, but the terms allow the firm to share your personal information with affiliates, vendors, service providers, and law enforcement. “So it’s hard to know where your data will end up,” says Love.

OpenAI’s privacy policy states that ChatGPT collects information when users create an account or communicate with the business, says Bharath Thota, a data scientist and chief solutions officer of the analytics practice at management consulting firm Kearney, which advises firms on managing and using AI data to power new revenue streams.

Part of this data collection includes full names, account credentials, payment card information, and transaction history, he says. “Personal information can also be stored, particularly if images are uploaded as part of prompts. Likewise, if a user decides to connect with any of the company’s social media pages like Facebook, LinkedIn, or Instagram, personal information may be collected if they’ve shared their contact details.”

OpenAI uses consumer data like other big tech and social media companies, but it does not sell advertising. Instead, it provides tools—an important difference, says Jeff Schwartzentruber, senior machine learning scientist at security firm eSentire. “The user input data is not used directly as a commodity. Instead, it is used to improve the services that benefit the user—but it also increases the value of OpenAI’s intellectual property.”

Privacy Controls

Since its launch in 2020 and amid criticism and privacy scandals, OpenAI has introduced tools and controls you can use to lock down your data. OpenAI says it is “committed to protecting people’s privacy.”

For ChatGPT specifically, OpenAI says it understands users may not want their information used to improve its models and therefore provides ways for them to manage their data. “ChatGPT Free and Plus users can easily control whether they contribute to future model improvements in their settings,” the firm writes on its website, adding that it does not train on API, ChatGPT Enterprise, and ChatGPT Team customer data by default.

“We provide ChatGPT users with a number of privacy controls, including giving them an easy way to opt out of training our AI models and a temporary chat mode that automatically deletes chats on a regular basis,” OpenAI spokesperson Taya Christianson tells WIRED.

The firm says it does not seek out personal information to train its models, and it does not use public information on the internet to build profiles about people, advertise to them, or target them—or to sell user data.

OpenAI does not train its models on audio clips from voice chats—unless you choose to share your audio “to improve voice chats for everyone,” the Voice Chat FAQ on OpenAI’s website notes.

“If you share your audio with us, then we may use audio from your voice chats to train our models,” OpenAI says in its Voice Chat FAQ. Meanwhile, transcribed chats may be used to train models depending on your choices and plan.

