Why Meta is getting sued over its beauty filters
I have interviewed a lot of cosmetic surgeons and cosmetic injectors over the years who tell me that patients use the filters and photo-editing tools that are popular on Instagram, though not necessarily owned by Meta or Instagram, to alter their own images. They then bring those images to a plastic surgeon or an injector consult and say, “This is what I want to look like.”
I tell this story all the time because it was so shocking to me, and such a strong example of what’s happening in the medical world in response to Instagram filters: I was interviewing a cosmetic injector, a dermatologist named Anna Guanche, at an event that Allergan, the maker of Botox Cosmetic, hosted for a small group of journalists.
She said, “One of the biggest things I tell my patients is, ‘You want to look more like your filtered photos—what can we do to make you look more like them, so people don’t see you in real life and go, what?’”
So that is a medical opinion being given by an actual doctor to clients. And of course, all of these behaviors and the surgeries being performed in response to Instagram filters come with a host of potential side effects and risks, including death.
One thing that was specifically named in the lawsuit is that Meta promotes platform features, such as visual filters, that are known to promote eating disorders and body dysmorphia in youth. Do we know that this is true?
We do know that this is true, I would say, because these platforms are engineered by people, and it’s very well documented that these biases exist in people. There are also well-documented cases of these biases showing up in some of the filter technology.
For instance, filters that are literally called “beauty filters” will automatically give somebody a smaller nose, slightly lighten and brighten their skin, and widen their eyes. These are beauty preferences passed down from systems of patriarchy, white supremacy, colonialism, and capitalism, and they end up in our lives, our systems, our corporations, and in the engineers and the filters they create.
These issues are often talked about in the context of women and teen girls being insecure about their bodies, rather than being framed as a problem with untested, mass-deployed, sophisticated consumer-facing augmented-reality tech. Have you seen that dynamic play out?
Issues [that affect] teen girls have culturally, historically, been swept under the rug and dismissed. Things like beauty are seen as frivolous interests. And because they’re dismissed, we end up not getting enough studies or enough data about the harms of beauty culture, when in reality it has huge and harmful cultural implications.
It recently came out that period products had never been scientifically tested using actual blood, even though periods have been around since the beginning of time. If periods, which have affected teen girls and women for literal millennia, are understudied, it does not surprise me that this relatively new phenomenon of beauty filters and beauty standards affecting the mental health of teen girls does not yet have a robust set of data.