Apple removes 500px from App Store over nude photo concerns
The removal of photo-sharing startup 500px's iOS app from the App Store is creating a lot of buzz, reminding us once again of Apple's strict watch over the App Store.
According to Apple, 500px was removed from its App Store for providing access to nude photos.
With close to 1 million downloads, the app disappeared by noon on 22 January. The night before, the 500px team held discussions with a member of the App Store review team after submitting an updated version of the app, and was told the access to nude photos was breaking the rules.
Apple issued a detailed statement on the matter saying: "The app was removed from the App Store for featuring pornographic images and material, a clear violation of our guidelines. We also received customer complaints about possible child pornography. We’ve asked the developer to put safeguards in place to prevent pornographic images and material in their app."
500px COO Evgeny Tchebotarev explained to TechCrunch that finding nude photos in the app was a difficult process, and one that couldn't be accomplished simply by launching it.
Like Google Search, 500px had enabled a safe search mode, which could be turned off only by accessing the service's website. He further explained that pornographic images aren't allowed on the photo-sharing service and that the nude photos were meant to be artistic.
The 500px team said it would make changes, but Apple still removed the app Tuesday morning. Presumably the company will be able to submit the changes to the App Store; we'll know in the next few days.
For those unfamiliar, the 500px application first hit the App Store in 2011, providing a way for users to browse a collection of photo portfolios. The purpose of the service -- the web and Android apps are still available -- is to provide a way to "discover, share, buy and sell inspiring photographs".
As Rene Ritchie of iMore points out, this is yet another instance of an inconsistent human review decision, especially striking given that the developer was submitting a simple update. It's worth noting that the nude photo search in question had been in the app for some time and wasn't part of the update 500px had just submitted.
In the App Store rule book, the official rule is stated: "Apps containing pornographic material, defined by Webster’s Dictionary as 'explicit descriptions or displays of sexual organs or activities intended to stimulate erotic rather than aesthetic or emotional feelings', will be rejected."
The rules continue: "Apps that contain user-generated content that is frequently pornographic (ex 'Chat Roulette' Apps) will be rejected."
But by that definition, shouldn't apps like Instagram, Snapchat, Tumblr and even Apple's own Safari be pulled, since they have the potential to display nude images?
With the safeguards that 500px had in place, it's a little odd that Apple took the route of deleting the app. The kicker for 500px seems to be the "customer complaints about possible child pornography", as that raises serious legal implications for both 500px and Apple. It's not clear whether that is actually the case.
What do you think? Should this app have been removed?