It's annoying, but at least this is an independent, worker-owned, four-person outfit that got its start when Vice went bankrupt.
Here is the article:
For the past two years, an algorithmic artist who goes by Ada Ada Ada has been testing the boundaries of human and automated moderation systems on various social media platforms by documenting her own transition.
Every week she uploads a shirtless self-portrait to Instagram alongside another image showing whether a number of AI-powered gender classification tools from big tech companies like Amazon and Microsoft read her as male or female. Each image also includes a sequential number, the year, and the number of weeks since Ada Ada Ada started hormone therapy.
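For concreteness, the per-post record could be modeled like this minimal sketch; the field names and structure are my own invention, the article only lists what each image includes:

```python
from dataclasses import dataclass, field

# Hypothetical model of the metadata shown on each weekly post; the
# article names these pieces of information, but this structure is a sketch.
@dataclass
class WeeklyPost:
    sequence_number: int      # sequential number of the image in the series
    year: int                 # year of the post
    weeks_on_hormones: int    # weeks since starting hormone therapy
    classifier_readings: dict[str, str] = field(default_factory=dict)
    # e.g. {"Amazon Rekognition": "Female", "Microsoft Azure": "Male"}
```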
In 2023, more than a year into the project, which she named In Transitu, Instagram removed one of Ada Ada Ada’s self-portraits for violating its Community Guidelines against posting nudity. We can’t say for certain why Instagram deleted that image specifically, or whether a human or an automated system flagged it, because Meta’s moderation systems remain opaque. But it was at that moment that Instagram first decided that Ada Ada Ada’s nipples were female, and therefore nudity, which isn’t allowed on the platform. On Instagram, shirtless men are allowed, and shirtless women are allowed as long as they don’t show nipples, so what constitutes nudity online often comes down to the perceived gender of an areola.
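Reduced to code, the rule as the article describes it would look something like this deliberately simplistic sketch; the function and its inputs are hypothetical, and Meta's actual systems are opaque and certainly more complex:

```python
# A deliberately reductive sketch of the "nipple rule" described above.
# The function name and inputs are hypothetical; Meta's real moderation
# pipeline is opaque and far more complex.
def violates_nudity_rule(perceived_gender: str, nipples_visible: bool) -> bool:
    # Shirtless men pass; shirtless women pass only with nipples covered.
    return perceived_gender == "female" and nipples_visible
```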
“I'm really interested in algorithmic enforcement and generally understanding the impact that algorithms have on our lives,” Ada Ada Ada told me in an interview. “It seemed like the nipple rule is one of the simplest ways that you can start talking about this because it's set up as a very binary idea: female nipples, no; male nipples, yes. But then it prompts a lot of questions: what is a male nipple? What is a female nipple?”
In Transitu highlights the inherent absurdity in how Instagram and other big tech companies try to answer that question.
“A lot of artists have been challenging this in various ways, but I felt like I had started my transition at the end of 2021 and I also started my art practice. And I was like, well, I'm actually in a unique position to dive deep into this by using my own body,” Ada Ada Ada said. “And so I wanted to see how Instagram and the gender classification algorithms actually understand gender. What are the rules? And is there any way that we can sort of reverse engineer this?”
While we can’t know exactly why any one of Ada Ada Ada’s images is removed, she is collecting as much data as she can in a spreadsheet: which images were removed, why Instagram said they were removed, and, to the best of her knowledge, whether the images’ reach was limited.
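A rough sketch of what such a removal log might look like; the column names and values are hypothetical, since the article only describes the categories she tracks:

```python
import csv

# Hypothetical schema for the removal log described above.
FIELDS = ["image_id", "removed", "instagram_stated_reason", "reach_limited"]

with open("removals.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    # Example row; the values are illustrative, not from her actual data.
    writer.writerow({
        "image_id": 1,
        "removed": True,
        "instagram_stated_reason": "nudity",
        "reach_limited": "unknown",
    })
```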

That data shows that more images were removed further into her transition, but there are other possible clues as well. In the first image that was removed, for example, Ada Ada Ada was making a “kissy face” and squeezing her breasts together, which could have read as more female or sexual. Ada Ada Ada was also able to reupload that same image with the nipples censored out. In another image that was removed, she said, she was wearing a lingerie bra where her nipples were still visible.
“But then again, you have this one where I'm wearing nipple clamps, and that didn't do anything,” she said. “I would have expected that to be removed. I've also had another picture where I'm holding up a book, Nevada by the trans author Imogen Binnie. I’m just holding a book and that was removed.”
Ada Ada Ada also maintains a spreadsheet where she tracks whether a number of AI-powered gender classifiers (Face++, face-api.js, Microsoft Azure’s Image Analysis, Amazon Rekognition, and Clarifai) read her as male or female.
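As a minimal sketch of what querying one of these classifiers involves, here is how Amazon Rekognition's face-detection endpoint reports a binary gender guess. The filename and AWS setup are assumptions, and the other services expose analogous APIs:

```python
import boto3  # AWS SDK; assumes credentials are configured locally

client = boto3.client("rekognition")

# "self_portrait.jpg" is a placeholder filename.
with open("self_portrait.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include gender, age range, etc.
    )

# Rekognition returns a binary "Gender" guess with a confidence score.
for face in response["FaceDetails"]:
    gender = face["Gender"]
    print(gender["Value"], round(gender["Confidence"], 1))
```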

Experts have criticized such gender classifiers for often being wrong and particularly harmful for transgender people. “You can’t actually tell someone’s gender from their physical appearance,” Os Keyes, a researcher at the University of Washington who has written a lot about automated gender recognition (AGR), wrote for Logic in 2019. “If you try, you’ll just end up hurting trans and gender non-conforming people when we invariably don’t stack up to your normative idea of what gender ‘looks like.’”
“I've definitely learned that gender classifiers are an unreliable and flawed technology, especially when it comes to trans people's gender expression,” Ada Ada Ada said. “I regularly see my algorithmic gender swing back and forth from week to week. In extension to that, it's also fascinating to see how the different algorithms often disagree on my gender. Face++ (which is a Chinese company) tends to disagree more with the others, which seems to suggest that it's also a culturally dependent technology (as is gender).”
As Ada Ada Ada told me, and as I wrote in another story published today, continually testing these classifiers also reveals how they work in reality versus how the companies that own them say they work. In 2022, well into her project, Microsoft said it would retire its gender classifier following criticism that the technology can be used for discrimination. But Ada Ada Ada was able to keep using the classifier long after that announcement. It was only after I reached out to Microsoft for comment that the company learned that she, along with what it said was a “very small number” of other users, could still access the tool because of an error. Microsoft cut off that access after my inquiry.
In Transitu also reveals the gap between Instagram’s nudity policy as written and as enforced. On paper, the policy is plain. It states:
“We know that there are times when people might want to share nude images that are artistic or creative in nature, but for a variety of reasons, we don’t allow nudity on Instagram. This includes photos, videos, and some digitally-created content that show sexual intercourse, genitals, and close-ups of fully-nude buttocks. It also includes some photos of female nipples, but photos in the context of breastfeeding, birth giving and after-birth moments, health-related situations (for example, post-mastectomy, breast cancer awareness or gender confirmation surgery) or an act of protest are allowed.”
But in reality, Instagram ends up removing content and accounts belonging to adult content creators, sex educators, and gender nonconforming people who are trying to follow its stated rules, while people who steal adult content or create nonconsensual content game the system and post freely. As 404 Media has shown many times, nonconsensual content is also advertised on Instagram, meaning the platform is getting paid to show it to users. It’s not surprising that the rules are hard to follow when users have to reverse engineer how they are actually enforced, and that they read as nonsensical to people who don’t fit into old, binary conceptions of gender.