“There are things known and there are things unknown and in between are the doors of perception.” — Aldous Huxley
I’m Huxley Westemeier (’26), and welcome to “The Sift,” a weekly opinion column focused on the impacts and implications of new technologies.
______________________________________________________
If something is free, you’re the product.
That adage has long been attributed to American sculptor Richard Serra, who voiced it in 1973, and fifty-two years later it’s more relevant than ever.
It’s also a perfect introduction to Meta’s updated (and evil) Privacy Policy.
On June 27, the tech journalism site TechCrunch published screenshots of a Facebook pop-up notification. Users posting a Story were prompted to opt into “Cloud Processing” of their camera roll in exchange for AI-generated restyling and collage suggestions. Once opted in, Meta could scan every image you’ve ever taken, analyze facial features, and keep uploading those images to Meta’s servers on an “ongoing basis.” The worst part? Meta noted that it would “retain and use” any personal information shared to improve its AI models. And its definition of personal information was extremely vague: images, feedback, and interactions could all (legally) fall under it.
It wouldn’t be a security disaster IF Meta ensured every user received the notification; then it would be up to you whether to grant Meta access. But that’s not what happened. According to the UK publication The Standard, various users noticed in the last two weeks of August that Meta had switched on two toggles tied to Cloud Processing features without their permission. If you’re curious whether the setting is enabled, go to Menu > Settings and Privacy > Settings > Camera Roll Sharing Suggestions. Additionally, using features like Apple’s Limited Access and Ask App Not To Track permissions will reduce the likelihood of your personal information being shared without your consent.
By this point, we should all be terrified. I’m sure everyone knows that Facebook collects some data by default: content you interact with, messages you send EXCLUDING encrypted chats, interactions with chatbots, purchases you make, and even the amount of time spent on each post. Expanding the privacy policy to cover personal camera roll processing is entirely irresponsible. Any data fed into an AI model has a chance of being reproduced later. For example, if you upload images of you and your dog, and later prompt a model trained on those images to create a similar picture, there might be an uncanny resemblance. And while Meta claims that it isn’t using collected photos to improve its AI models, according to Futurism, it makes no promise that it won’t do so in the future.
My takeaway? Be aware when you upload your personal content to any website or app. Read through privacy policies (as convoluted as they are) to be mindful of the rights you willingly waive. And, perhaps most importantly, don’t openly share sensitive information with AI services. OpenAI, for example, has stated that ChatGPT conversations suggesting a plan to harm others may be escalated to human reviewers and, in serious cases, referred to law enforcement. Such a feature is well-intentioned, but it raises crucial ethical questions about true privacy on the internet.
Here’s the tradeoff: would you rather have privacy OR fake, gimmicky AI collages that exist solely as an excuse to sell advertisements based on the locations and objects in your camera roll? Meta is getting far more value from processing your camera roll than you are.
Facebook is free, so by definition, YOU are the product.