Apple’s Problem with Vine Isn’t Porn, It’s Hypocrisy

2013-01-28 08:53:53

Just last week Apple pulled the 500px iOS apps from the App Store because it was too easy for kids to find nude images. Okay, that’s fine. Sort of. It would take several steps (including going to a desktop browser) to turn safe search off, but no matter. Then, over the weekend, the hot new app/social network Vine made a six-second porn video an “editor’s pick”. A quick search for “#porn” in Vine yields all the nudity you’d care to see. So why is Vine still in the App Store?

Or Twitter, or Safari, or Chrome, or…

As The Verge (and others) gleefully reported, a Vine clip with the (as advertised) tag #dildoplay was chosen as an Editor’s Pick on Vine. Jeez, no wonder my clips of making coffee were ignored! The clip was, of course, reported and taken down. The Editor’s Pick list on my Home feed this morning (the home feed seems to be different for different users) is just normal and boring by comparison. Which is fine for the app, but not fine for Apple.

Because, as another post on The Verge points out, Apple has set itself up for the Mother of All Hypocritical Dilemmas. You boot out 500px because it’s too easy for kids to find nude images (reportedly many of which are more art than porn), while Twitter’s apps (Twitter and Vine), where easy hashtag searches will get you lots and lots and lots of real porn in just a few clicks, are apparently okay. Is it because no one complained? Or, since Apple and Twitter have forged close ties, are the complaints simply ignored?

Regardless of the answers to those questions, Apple has some questions to answer on its App Store policies. Fire up any app that can search the wider Internet, and you can find porn. So why single out 500px?

There is also the pragmatic point that Apple probably shouldn’t have pulled the 500px app in the first place. The logic that 500px could monitor every image posted makes about as much sense as expecting Twitter or Vine to do the same. There is simply too much content being posted to monitor it all.

I’m fine with Apple working to keep full-on porn apps out of the App Store. No issue here. The problem is that Apple can’t just decide one app violates the rules when another app, which happens to be from a partner (Twitter and Vine) or Apple’s own (Safari), is just as guilty of the same offense. Apple needs to allow 500px back into the App Store and, frankly, to start being more realistic and fair in applying App Store policies to apps.

I know, realistically, that won’t happen. It’s up to Apple, which is beholden to no one, what apps are and aren’t allowed in the App Store, but this particular incident just makes Apple look even more out of touch than usual.


  • Pacomacman

    Apple will no doubt work with the developer to find a solution. One of my apps allowed users to save named songs to a central server, and Apple rejected my app because song names could have rude words in them. This was no problem: I inserted a word filter and Apple were fine with it. Apple aren’t unreasonable, they do work with developers to resolve these issues; it’s just that people like to make a big deal out of nothing. In the end it’s all free publicity…

  • Alex

    There are an enormous number of legal issues here. In the end, sadly, we’ll need regulatory solutions, as self-governing yields far too many discrepancies, as you’ve noted. Sad.

  • http://twitter.com/JMOBILEHITE352 JAN J*MOBILE E. HITE

    STUPID/DUMB I’D SAY

  • Kimk69

    Mother of All Hypocritical Dilemmas? Did you really think that they wouldn’t shut the app down after finding out?

  • SadPanda

    Dang. And I got excited thinking you posted the dildoplay video. Scroll down and find a video about coffee :(