Social network Tumblr makes new changes to stay on Apple’s App Store


San Francisco: Microblogging and social networking website Tumblr, which has faced a years-long struggle over approval on the iOS App Store, said it has made new changes in order to stay on Apple's App Store.

In 2018, Tumblr's iOS app was removed from the App Store over child sexual abuse material (CSAM) found on the platform.

A month later, the platform responded by banning all pornographic and other sexually explicit content, resulting in a 29% drop in monthly traffic.

Since then, the platform’s web traffic has remained relatively stagnant, reports The Verge.

“So that we stay in Apple‘s App Store and for our iOS Tumblr app to be available, we needed to make changes that would help us be more compliant with their policies regarding sensitive content,” Tumblr said in a blog post.

Many Tumblr users come to the platform to talk anonymously about their experiences.

The platform said that “for those of you who access Tumblr through our iOS app, we wanted to share that as of today you may see differences for search terms and recommended content, which may contain specific types of sensitive content”.

“In order to comply with Apple App Store guidelines, we need to adjust, in the short term, what you can access with respect to potentially sensitive content when using the iOS app,” the platform said.

To remain available in the Apple App Store, the company has had to expand its definition of sensitive content and change how its users access it in order to comply with Apple's guidelines.

“We understand that for some of you these changes can be very frustrating – we understand this frustration and we apologize for any disruption these changes may cause,” Tumblr said.

Apple’s CSAM feature is intended to protect children from predators who use communication tools to recruit and exploit them.

It is part of a set of features that includes scanning users’ iCloud photo libraries for child sexual abuse material (CSAM), communication safety alerts for children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.
