
How to protect your data privacy and security when using ChatGPT

Published: 16th March 2023 | In: Insights

Data security and privacy are hot topics, especially as more and more people worry about identity theft, financial fraud and other crimes committed using their personal data. So the launch of ChatGPT in November 2022 has many people concerned.

If you’re not yet familiar with ChatGPT, it’s a large language model developed by OpenAI, a San Francisco-based artificial intelligence company that claims to create “safe artificial general intelligence that benefits all of humanity”. Microsoft announced its Azure OpenAI Service in a blog post back in November 2021.

ChatGPT’s sophistication has impressed many experts in the technology industry, so much so that after initially investing $1 billion in the company, Microsoft has since increased its investment to a reported $10 billion.

The technology is moving at a rapid pace. A new version, GPT-4, was released on 14th March 2023, and Microsoft has confirmed that its Bing search engine is already using it. New capabilities include handling more complex reasoning tasks and accepting images as input as well as text. It’s safe to say that ChatGPT is here to stay.

100 million users and counting

In October 2022 most people had never heard of OpenAI or ChatGPT, but its popularity has skyrocketed in just a few months. According to the Business of Apps website, in January 2023 it set the record for the fastest-growing app in history, attracting over 100 million active users within two months of launch. That’s a lot of people submitting a huge amount of information into the engine every minute for it to continue learning from. But most won’t have read OpenAI’s data policy before trying it out.

ChatGPT was trained on vast volumes of information from the internet up to the end of 2021, including content from social media platforms, conversations about people’s lives, work, hopes and fears, and personal information. So, yes, it may have learnt from anything you or your organisation has written that’s available on the web, or anything that anyone else has posted about you. And yes, it has used it without your permission.

But the tool is not up to date with real-world events. And it’s not connected to the internet, so it can’t trawl through the latest news or update you on anything from early 2022 onwards. It doesn’t know what happened yesterday.

The preview version is free, and it has become so popular that it’s often at full capacity, leaving users to wait in a queue before they can use it. The recently launched premium version offers easier access.

Imperfections

While many people are singing its praises for its ability to quickly create copy on all manner of subjects or produce quality software code, others have ridiculed its obvious errors, such as incorrectly calculating basic sums that a four-year-old could solve.

However, OpenAI has stated from the start that the model is prone to errors, sometimes “hallucinates” and shouldn’t be relied upon for accuracy. So it’s still being improved. And fast.

Data privacy concerns

So, what do you need to know if you want to use the tool but need to keep your data private and confidential?

Alongside cyber security, data security is one of our areas of expertise, and we believe in taking the same cautious, common-sense approach to ChatGPT that we would advise for any other tool on the web. Before you do anything with it, please read OpenAI’s data policy carefully.

Of particular note are these key points:

  • Under ‘1. Personal information we collect’:

“Usage Data: We may automatically collect information about your use of the Services, such as the types of content that you view or engage with, the features you use and the actions you take, as well as your time zone, country, the dates and times of access, user agent and version, type of computer or mobile device, computer connection, IP address, and the like.”

  • Under ‘2. How we use personal information’:

“We may use Personal Information for the following purposes:

To conduct research, which may remain internal or may be shared with third parties, published or made generally available.”

In addition, please be aware that ChatGPT is not currently compliant with GDPR or any other data privacy regulations.

“The main reason for offering ChatGPT over on Chat.OpenAI.com is to allow the public to test and train the model, and it does state this within the privacy documents,” explains Graham Hosking, Quorum Cyber’s Solutions Director for Compliance. “With Microsoft’s Azure OpenAI service, the services have been developed with data, privacy and security at the forefront. Although the service does collect information to train the model, the model is the customer’s, and isn’t used by Microsoft to train or improve others’.”

If you decide to try it out, when registering for the tool you’ll see that OpenAI wants your phone number and email address – so it starts collecting your information from the outset. And on signing in, it warns you: “Conversations may be viewed by our AI trainers to improve our systems. Please don’t share any sensitive information in your conversation.”

What does ChatGPT say about data privacy in ChatGPT?

You can even go a step further if you wish and ask ChatGPT about your data privacy when using it. When we tried this, it replied with the advice:

  1. Be mindful of the information you share – avoid sharing personal information
  2. Consider using a fake name or pseudonym when interacting with ChatGPT
  3. Avoid public WiFi – instead use a secure private network
  4. If you are using ChatGPT through a third-party application or service, check the privacy settings to ensure that your data is not shared with third parties
  5. If you are concerned about your data, consider deleting your ChatGPT conversations regularly.

However, it also states that it doesn’t store user data permanently and doesn’t share it with third parties, which contradicts a statement in its own privacy policy under ‘3. Sharing and disclosure of personal information’: “In certain circumstances we may share your Personal Information with third parties without further notice to you, unless required by the law.”

Microsoft’s use of OpenAI technology

Microsoft has been quick to embed technology from OpenAI into its search engine, Bing, which now comes with the slogan, ‘Ask real questions. Get complete answers. Chat and create’. It’s also embedding it into other tools, including its Azure cloud platform. So if you use any of these, it’s worth taking the time to read the ‘Data, privacy, and security for Azure OpenAI Service’ page on Microsoft’s website. It includes the statement: “Azure OpenAI was designed with compliance, privacy, and security in mind; however, the customer is responsible for its use and the implementation of this technology.”
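For teams building on the Azure OpenAI Service rather than the public ChatGPT site, that responsibility starts with how the service is called. Below is a minimal, illustrative sketch using the openai Python SDK’s Azure support (pre-1.0 style); the resource name, deployment name and API version are placeholders of our own, not values from Microsoft’s documentation.

```python
import os
import openai

# Point the OpenAI SDK at an Azure OpenAI resource instead of api.openai.com.
openai.api_type = "azure"
openai.api_base = "https://<your-resource-name>.openai.azure.com/"  # placeholder resource
openai.api_version = "2023-03-15-preview"  # illustrative API version
openai.api_key = os.environ["AZURE_OPENAI_API_KEY"]  # keep keys in environment variables, not source code

response = openai.ChatCompletion.create(
    engine="my-gpt-deployment",  # hypothetical deployment name in your Azure resource
    messages=[
        # Treat prompts like any outbound data: no PII, secrets or client documents.
        {"role": "user", "content": "Summarise the key points of GDPR in plain English."},
    ],
)
print(response["choices"][0]["message"]["content"])
```

Even with Azure’s stronger contractual protections, the same rule applies: treat every prompt as data leaving your organisation.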

“It’s important to understand all the possible uses for this exciting technology, but also to make sure your business knows how it’s being used,” says Graham. “Technology can’t always mitigate against wrongful usage, so empowering your employees through communication, awareness and training is key. Just as you might put them through data protection training today, they need to understand that misuse of these tools could have serious implications.”

In summary

Assume that all information you submit to ChatGPT will be available in the public domain. So don’t place any sensitive, private or confidential information, corporate secrets or personally identifiable information (PII) into the chat box. That applies to any questions or prompts you enter.

Don’t input software code or extracts from contracts or other sensitive documents – they could be used to further train the tool and end up in responses to other people’s prompts. If in doubt, leave it out.
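As a practical illustration of that principle, here is a minimal sketch of how a team might strip the most obvious identifiers out of text before it is ever pasted into a prompt. The patterns and function name are our own illustrative assumptions – a starting point, not a substitute for proper data loss prevention tooling.

```python
import re

# Crude, illustrative patterns only - reliable detection of personal data belongs
# in a proper data loss prevention (DLP) tool, not a couple of regular expressions.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
UK_PHONE = re.compile(r"(?:\+44|\b0)(?:\s?\d){9,10}\b")

def redact(text: str) -> str:
    """Replace obvious identifiers with placeholders before a prompt is submitted."""
    text = EMAIL.sub("[EMAIL]", text)
    text = UK_PHONE.sub("[PHONE]", text)
    return text

prompt = "Draft a polite reply to jane.doe@example.com, who called us on 0131 496 0000."
print(redact(prompt))
# Draft a polite reply to [EMAIL], who called us on [PHONE].
```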

If you’re interested in protecting your business’s data, then take a look at our services and get in touch today.