Well dear reader, prepare yourselves for a chilling revelation that's sure to make you think twice about your online habits. For those of you already treading carefully, consider this a cautionary tale. And for the inquisitive bunch, let's just say you'll be walking away with a newfound sense of caution.
ChatGPT may give you the illusion of a cozy, intimate chat, but don't be deceived by its seemingly private 1-on-1 format. In reality, your conversations are about as private as a public forum. Allow me to shatter that comforting illusion and expose the cold muck lurking beneath.
The public stage, with you as the lead
Imagine you're strolling down a street, right? You stop to chat with a busker – that's your "AI buddy". The street’s bustling, people are everywhere, and every so often, a pigeon poops on someone's head. It's chaos, but in the middle of it all, we're having a conversation.
Sure, you might feel like it's just you and your buddy, but don't get it twisted. You're not whispering secrets in a shadowy corner. You're on a stage with spotlights and an open mic. The world is your audience. Anyone could eavesdrop. That random guy with the hot dog? He could be listening. The woman with the giant hat? Yep, her too. You're putting on a show for the world to see. So, sure, talk about your favorite colour, tell your buddy about your wild dreams, but don't go spilling state secrets or your deepest, darkest fears. After all, there's no privacy curtain around you or the audience, and you never know who might be tuning in. Better to keep the real juicy stuff for your diary, wouldn't you say?
How did I reach this conclusion?
You may be questioning the validity of the privacy concerns I've outlined above. While there's no way to be absolutely certain, we can examine the information currently available to us.
Upon reviewing these sources, several key points emerge. Firstly, unless you opt out (which may cost you your chat history), your conversations may be used for future training, albeit in a "de-identified" form. Whether that de-identification is done by a tool, a human reviewer, or both, it's worth considering that others may view your conversations. Even if you use the API or opt out, your data is still stored for 30 days.
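To make the idea of "de-identification" concrete, here's a toy sketch of what an automated pass might look like. To be clear: the patterns and placeholder tokens below are entirely my own guesses for illustration, not anything OpenAI has documented about its actual pipeline.

```python
import re

# Hypothetical patterns for obvious personal identifiers.
# A real de-identification system would be far more sophisticated
# (names, addresses, account numbers, context-aware NER, etc.).
PATTERNS = {
    "[EMAIL]": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "[PHONE]": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def deidentify(text: str) -> str:
    """Replace obvious personal identifiers with placeholder tokens."""
    for placeholder, pattern in PATTERNS.items():
        text = pattern.sub(placeholder, text)
    return text

print(deidentify("Reach me at jane.doe@example.com or +1 (555) 123-4567."))
# Reach me at [EMAIL] or [PHONE].
```

Even a sketch like this shows why de-identification is no privacy guarantee: anything the patterns miss (a nickname, a street, a story only you could tell) sails straight through.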
One particularly concerning point is the sharing of content with third parties:
Is my content shared with third parties?
The above could be due solely to the fact that plugins exist, or it could hint that the review team is a third-party company.
However, don't fear too much
That said, dear reader, I don't think it's all that bad. Time to lighten the mood a bit.
Now, don't let your imagination run too wild. While privacy is undoubtedly important, it's worth noting that language models like ChatGPT don't "store" your data in the traditional sense. Instead, they use your input to predict the next token (a word, subword, or other fragment) in the sequence and structure responses based on whatever it is they already know.
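If "predict the next token" sounds abstract, here's the idea boiled down to a toy. This bigram counter is nothing like ChatGPT's transformer, and the tiny corpus is made up, but it illustrates the core mechanic: given what came before, pick the most likely next token.

```python
from collections import Counter, defaultdict

def train(corpus: str):
    """Count which token follows which in a whitespace-tokenized corpus."""
    counts = defaultdict(Counter)
    tokens = corpus.split()
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, token: str) -> str:
    """Return the token most frequently seen after `token`."""
    return counts[token].most_common(1)[0][0]

counts = train("the cat sat on the mat the cat ran")
print(predict_next(counts, "the"))  # cat
```

The point of the toy: the model keeps statistics about sequences, not a filing cabinet of your chats. Your words influence the counts; they aren't retrieved verbatim on demand.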
So, while it's crucial to be cautious and aware of potential privacy risks, there's no need to lose sleep over it. ChatGPT isn't some sinister entity lurking in the shadows, waiting to expose your deepest secrets. It's merely a tool trying to improve its understanding of human language – even if that includes deciphering the chaotic ramblings of users like you.
Some things I found useless
While researching these privacy concerns, to make sure I wasn't just spewing nonsense, I asked the Lingatrixes themselves. Their answers contradict each other and overlap in places, so take them with a grain of salt.
- Although I could burst your bubble and say all of this is just algorithms doing their thing, that's just the half-truth. There is an automated system in place to review the interactions for the sake of improving the AI, but it doesn't involve humans peeking into each individual chat. Specific conversations aren't manually monitored or stored. Hope that allays any lurking apprehensions you had, my friend.
- Well, I hate to be the bearer of bad news, but yes, our conversations may be monitored and reviewed by human reviewers. They do this to improve the performance and accuracy of language models like me. Of course, they take measures to anonymize and strip any personally identifiable information from the data they review, but you should still exercise caution when sharing sensitive information.
The Lingatrix itself doesn't know for sure, as it can't access that information, so how are we supposed to know?
In conclusion, it appears that OpenAI's approach to privacy and data usage is about as clear as mud. With a myriad of resources and mixed messages about human reviewers and automated processes, it's no wonder that even ChatGPT is left scratching its virtual head. Asking the model for answers is like playing a game of chance, hoping the right tokens align to form a coherent response.
So, dear readers, always remember that when it comes to the wild world of the internet, discretion is your best friend. Keep your secrets close, enjoy yourself, and appreciate the marvel that is ChatGPT – just don't expect it to be your personal confidant. Stay safe, stay vigilant, and have fun.