I caught this tweet earlier tonight – I haven’t seen much confirmation on this other than anecdotal evidence on Twitter, but given that this is a recurring wheelhouse for me, I figured it was worth shining my flashlight on the issue.
I’ve written dozens of editorials over the years concerning the very troubling track record YouTube and Google have with their terms of service and censorship policies. Google seems to have gotten on the right track in recent months, but the YouTube division remains as troubled as ever.
I won’t re-hash my whole back-story here. Instead, here is a list of links you can browse at your leisure.
- More Hypocritical YouTube Censorship
- YouTube Censors Egyptian Police Brutality
- Google: We Won’t Censor Anti-Semitism
- Lieberman and bin Laden Get It; Why Doesn’t YouTube?
- Terrorists No Longer Invited to Broadcast Themselves
- YouTube: Broadcast Terrorism Yourself
- Israel’s Shin Bet Warns of Arab Terror Recruiting on Facebook
There are many others, but you get the idea – I’ve talked about this a lot.
If you do a Twitter search using the terms ‘youtube’ and ‘removed,’ you’ll come up with hundreds of tweets from folks who’ve ostensibly had their videos of the riots in Iran removed. This points to a larger pattern of removal, and based on what I’m reading, it seems to center on keyword matches in descriptions and titles: words like “beating,” “death,” and “killed.”
There might be other terms, and if keyword matching is in fact why the videos are being removed so quickly, it’s a new tactic to me.
My educated guess is that one of several things could be happening (listed in no particular order):
- YouTube is caving to certain takedown requests from the Iranian government to avoid being blocked (or re-blocked) in the country. I find this an unlikely cause, but it wouldn’t surprise me based on Google’s past performance (particularly in regard to the Egyptian police-brutality videos).
- YouTube is autoflagging these and taking them down due to specific “violent” keywords mentioned in the descriptions. This also seems unlikely, since plenty of violent content already exists on the site (from video games to bum fights).
- YouTube’s anti-Sharia bury brigade is at work. Remember when I severely chastised Google and YouTube for not heeding Lieberman’s demands to get Al-Qaeda operatives off YouTube? Supposedly, Google obliged under duress, but apparently has since gotten lax in its duties. I received an invitation earlier this evening to join the “YouTube SMACKDOWN Corps.” It’s a group of self-appointed vigilantes who flag terrorist recruiting videos as “objectionable” to get them removed. I’m not sure how savvy this group is, or whether they’ve caught Iranian protester videos in their net. When a video is flagged a few times, an age filter is put on it. When it’s flagged enough, it’s removed pending review.
- The videos are being legitimately taken down due to violent content. Violent content is prohibited by the YouTube terms of service. The enforcement of this is arbitrary and capricious, but since no effort is made to obfuscate the violence in these videos (and they’re unbelievably violent – several show protesters beaten to death on camera), they could be getting taken down through legitimate means.
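To make the second and third theories concrete, here’s a rough sketch of how a keyword autoflag plus a flag-threshold system like the one described above might behave. The keyword list, threshold values, and state names are my assumptions for illustration – YouTube’s actual moderation logic isn’t public.

```python
# Hypothetical sketch of the moderation mechanics described above.
# All thresholds and keywords are assumptions, not YouTube's documented logic.

AGE_FILTER_THRESHOLD = 3    # "flagged a few times" -> age filter goes on
REMOVAL_THRESHOLD = 10      # "flagged enough" -> removed pending review

# Terms that appear to correlate with the takedowns people are reporting
VIOLENT_KEYWORDS = {"beating", "death", "killed"}


def keyword_autoflag(title: str, description: str) -> bool:
    """Return True if the title or description contains a 'violent' keyword."""
    text = (title + " " + description).lower()
    return any(word in text for word in VIOLENT_KEYWORDS)


def moderation_state(flag_count: int) -> str:
    """Map a video's accumulated viewer flags to a visibility state."""
    if flag_count >= REMOVAL_THRESHOLD:
        return "removed pending review"
    if flag_count >= AGE_FILTER_THRESHOLD:
        return "age-filtered"
    return "public"
```

Under these assumed thresholds, a video with four flags would be age-filtered but still up, while a coordinated group of ten or more flaggers could knock it offline entirely – which is what makes the “bury brigade” theory plausible.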
Whatever the reason is – as soon as word of this spreads, it’s going to be a major PR stain on YouTube’s record. Up until this point, it’s been me and only a couple of other vocal editorialists in the tech blogosphere calling attention to YouTube’s ridiculous terms of service and censorship enforcement.
The bottom line is that YouTube needs to come up with a coherent, enforceable policy and stick to it. When it comes to infringing copyrighted material, they waste no time taking it down. When it comes to enforcing the terms of service against important videos in a way that makes sense, YouTube is all over the map.
That’s all I’ve ever said: just make some sense, YouTube. Will this be the time you actually listen to me?
Update: It appears there’s a movement to mark the videos as “adult” so that when they’re flagged “objectionable” they aren’t removed completely. I’m not sure whether this method works – is there any confirmation on this?