by Silver Adept, practitioner of the Dark Library Arts and author of Sense, Nonsense, and Not-Sense.
So this happened. The second installment of the video series Tropes vs. Women was briefly taken down from YouTube after a significant influx of "inappropriate" flags was lodged against it. Forty-five minutes after the creator of the video appealed, YouTube restored it.
There are two salient issues here, one philosophical, one technological.
Philosophically, most creators with a female-coded name can tell you about the harassment that comes with appearing to be a woman with opinions or stories to tell: publishing houses that believe your story won't sell if it's written by a woman (Jo Rowling, Ursula Le Guin, Elaine Konigsburg, for example), trolls on your blog, the glass ceiling, the stark difference between male and female Halloween costumes, the color-coding of toys, other bloggers accusing you of not being a true fan by virtue of your gender (clearly you're just there to tease the boys, don't'cha'know). Ah, and media that generally believes the proper female roles are wife, mother, girlfriend, or dead, and that innovation means combining those roles and their associated storylines rather than attempting to break new ground and add different ones.
Tropes vs. Women, then, is not just an effective tagline, but a succinct summary of both society and the vast majority of media choices available in it, whether for passive viewing or interactive gaming. Gaming, in particular, is laden with tropes that put women in ineffective or subservient roles, as objects to be rescued or avenged. That's the point of the first two Damsel in Distress videos.
Incidentally, this series of videos was funded through Kickstarter, meaning that, in this case, more than six thousand people pledged money to see it made. Now, comparing the number of funders to the number of game players in a "target demographic" for popular games makes the funders look like a tiny minority that can be safely ignored. Doing so, though, mirrors what game companies are already doing - shunning what could be a very loyal, lucrative, and "bigger on the inside" fanbase in favor of playing to the "known quantity".
With that in mind, let's talk about the technological issues. Most social sites have a marker that someone can use to report problematic content. For example, if you are looking through your favorite blogs and something provokes the response "HOLY SHIT WHAT IS THIS PEDO-GURO STUFF DOING ON A MY LITTLE PONY FANVID SITE", there is usually a "flag" button or icon you can use to indicate that it is inappropriate for the site.
On smaller blogs and websites, a single flag can draw the attention of the blog owner or a moderator, who examines the content and decides to edit, delete, ban, leave it up intact, or take other modly action. For a site such as YouTube, however, where several days' worth of footage is uploaded every hour, even employing all of Amazon's Mechanical Turk wouldn't really be enough to pre-screen everything. Instead, Google relies on algorithms to decide what to do about flagged content. One flag may not provoke a response, or it may get sent to the bottom of a moderating queue. One hundred flags, spread out over time, might push the video up the moderating queue so that it gets dealt with sooner rather than later. A thousand flags in an hour will probably trigger an automatic takedown. Hopefully, it still gets sent to a moderator to examine and decide about, but there are no guarantees.
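To make the tiered response concrete, here is a minimal sketch of the kind of flag-triage heuristic described above. Everything in it - the thresholds, the one-hour window, the class and method names - is a hypothetical illustration of the general technique, not YouTube's actual system:

```python
# Hypothetical flag-triage heuristic. Thresholds and names are
# illustrative assumptions, not anything YouTube has published.
from collections import deque
import time

AUTO_TAKEDOWN_FLAGS = 1000   # assumed: flags within the window that force a takedown
PRIORITY_FLAGS = 100         # assumed: lifetime flags that bump review priority
WINDOW_SECONDS = 3600        # assumed: one-hour burst window

class FlaggedVideo:
    def __init__(self, video_id):
        self.video_id = video_id
        self.flag_times = deque()  # timestamps of flags inside the burst window
        self.total_flags = 0       # lifetime flag count, spread out or not

    def record_flag(self, now=None):
        """Record one flag and return the triage decision for this video."""
        now = time.time() if now is None else now
        self.flag_times.append(now)
        self.total_flags += 1
        # Expire flags that fall outside the burst window.
        while self.flag_times and now - self.flag_times[0] > WINDOW_SECONDS:
            self.flag_times.popleft()
        if len(self.flag_times) >= AUTO_TAKEDOWN_FLAGS:
            return "auto_takedown"    # burst of flags: yank it, review later
        if self.total_flags >= PRIORITY_FLAGS:
            return "priority_review"  # moved up the moderating queue
        return "queue_bottom"         # a lone flag waits its turn
```

The key design point, for good and ill, is that the burst check looks only at volume over time, not at who is flagging or why - which is exactly the property a coordinated group can exploit.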
Give yourself a gold star if you've spotted the problem with a reliance on algorithms. As Slashdot, reddit, anyone hit by Anonymous, and anyone doing or being on the receiving end of a Denial of Service attack (Distributed or otherwise) knows, if you get a lot of attention in a short amount of time, most computer systems will pull the plug on whatever is receiving the attention. (Actually, so will most societal systems, now that I think about it.)
So what brought on this field of flags? Most likely, a decently large group of people we would euphemistically call "haters" - other terms exist, but their use would be either profane or problematic. When confronted with evidence that the philosophical underpinning of their favorite game or series is rooted in very female-unfriendly tropes, and that this underpinning extends all the way from Mario to Duke Nukem, their reaction is both to deny that it happens and to attempt to silence the person espousing such heterodox beliefs, with whichever justification seems to work best. (These can usually be picked from the stock of justifications used to try to silence any other minority or disliked group. Just change the names.) Enough of them flagged the video as inappropriate that the YouTube algorithm kicked in and yanked it.
Here's the part that's really problematic. From what I gather from the article and its linked tweet, the algorithm did institute a review by a person to determine whether the content was mistakenly or maliciously flagged. The review apparently determined that the content was in violation of YouTube's guidelines, although no specifics were given. Furthermore, such a violation placed the account in a warning state, where further flag attacks could get the account suspended or removed. The content creator, Anita Sarkeesian, had to go through YouTube's appeals process to get the content restored and the warning removed. The process was apparently swift, and once re-approved, the content returned within the hour. But if the haters are determined enough, they can re-trigger the takedown by re-flagging, forcing another appeal, and so on. The onus is on the content creator to keep putting their material back up - which is tiring and discouraging, as any creator beset by trolls will tell you.
Having people monitoring everything all the time would be cost-prohibitive for YouTube, not to mention the possible legal issues. That said, if the algorithm pushes a flagging incident up to a person to review, it's incumbent on that person to be thorough in both the review and the explanation of the decision. The system failed this time, forcing the creator into the appeal process.
This incident, though, is a microcosm of the bigger problem, the one that series like Tropes vs. Women are looking to address - privileging men as agents of narrative and subordinating women makes for problematic works, no matter how popular, and effectively reinforces cultural narratives about appropriate roles for men and women. With video games, the interactivity means an even greater missed opportunity to subvert or avert those tropes. If six thousand plus people can fund a Kickstarter to talk about the issue, how many thousands more are waiting for a game that actually deals with it? And how do we guard against people misusing helper tools to malicious ends?
I have no answers, fortunately. Instead, let's keep having the conversation and developing the answers together.