500px’s Apps Removed from the App Store Over Nude Images in App Search Results [Updated]


500px is a newcomer to the photo-sharing scene, but one that has received wide acclaim among photographers. Their mobile apps have also received praise for their design and overall awesomeness. Unfortunately, if you haven’t already downloaded the app, you won’t be able to for a while: Apple pulled the iOS app from the App Store overnight, feeling it was too easy for kids to search for and find nude images through the app.

The Verge has some updates on this with word from Apple.

This isn’t the first time, of course, that an app has run afoul of Apple’s guidelines (nor will it be the last). In this case, what happened (as is often the case) is that Apple flagged the updated iPhone and iPad apps when they were submitted for review. 500px COO Evgeny Tchebotarev told TechCrunch:

The Apple reviewer told the company that the update couldn’t be approved because it allowed users to search for nude photos in the app. This is correct to some extent, but 500px had actually made it tough to do so, explains Tchebotarev. New users couldn’t just launch the app and locate the nude images, he says, the way you can today on other social photo-sharing services like Instagram or Tumblr, for instance. Instead, the app defaulted to a “safe search” mode where these types of photos were hidden. To shut off safe search, 500px actually required its users to visit their desktop website and make an explicit change.

[…]

More importantly, perhaps, is the fact that the “nude” photos on 500px aren’t necessarily the same types of nude images users may find on other photo-sharing communities. That is, they’re not pornographic in nature. “We don’t allow pornographic images. If something is purely pornographic, it’s against our terms and it’s deleted,” Tchebotarev notes.

Also, 500px was more than happy to make immediate changes to the app, but Apple pulled it regardless.

Heavy handed? Sure.

Problematic? Yes and no.

I think you have to give Apple credit for trying to make sure that a kid who downloads an app won’t see inappropriate pictures. Not that a school-sponsored trip to the art museum wouldn’t expose a child to the same kinds of nudity. For 500px’s part, it sounds like they are trying both to make it pretty hard to turn safe search off and to keep pornography off their site. Maybe Apple should have given 500px some time to update the app and let it stay in the store. Maybe Apple already had a stack of complaints about the app and needed to do something about the situation (none of the reports mention this, but since Apple doesn’t generally go into why an app is given the boot, that could certainly be the case).

We don’t know. We’ll probably never know. Let’s just hope that 500px makes the changes quickly and the app can be back in the App Store toot sweet.

As for Apple’s App Store policies, I know this post is going to kick off a flurry of angry comments and vitriol about Apple banning apps, but we’ve bought into this system. And whether we like it or not, Apple tends the app garden pretty darn well. There aren’t too many weeds, and little to no malware in the App Store to speak of. The same cannot be said of Google Play or Android apps.

Once in a while a popular app runs afoul of Apple and gets pulled, and people get up in arms. Some apps, like Camera+, take a while to come back, but they generally come back. What we don’t read about too often are the truly bad apps that never make it into the App Store in the first place, or the fake apps that are found and pulled (with the money generally refunded). There aren’t complaints then, so I think we have to take the overall good with the occasional bad when it comes to how Apple maintains the App Store.

Thoughts?

Updates:

According to The Verge, Apple had this to say about why the app was pulled:

The app was removed from the App Store for featuring pornographic images and material, a clear violation of our guidelines. We also received customer complaints about possible child pornography. We’ve asked the developer to put safeguards in place to prevent pornographic images and material in their app.

500px COO Evgeny Tchebotarev responded: “We’ve never ever, since the beginning of the company, received a single complaint about child pornography. If something like that ever happened, it would be reported right away to enforcement agencies.”

The Verge also reported that ISO500 (an app made by 500px-owned Pulpfingers that uses the 500px API) has been pulled from the App Store as well.

Let’s hope everything gets sorted out and both apps are back in the App Store soon.

Via: TechCrunch

  • DonQuixote

    If a kid is going to search for nude pics he will find them one way or another. Punishing a developer for this is irrational. Anyone can download any of the hundreds and hundreds of RSS reader apps, add some NSFW feeds (like reddit gone wild), and boom, you don’t even need to search for it, it comes to you daily. Visn is a great example of what I’m talking about.

    • http://trishussey.com Tris Hussey

      No argument here on that point.

  • filthyjason

    so by that rationale they need to remove Safari, since someone could easily go to Yahoo or Google and search for nude pics via that app

    • http://trishussey.com Tris Hussey

      Agreed.

    • http://www.facebook.com/liamsagooch Liam Googolplex Merlyn

      I was gonna say the exact same thing

  • hohopig

    The problem is the uneven way the apps are handled. If they did this to 500px, why did they not do the same to Instagram and other similar apps that allow users to post their own photos that can be searched?
