Apple expands app removals as it braces for tighter global regulation

Under new rules, Apple can remove anonymous and random chat apps from the App Store without prior notice or a grace period, a move seen as preparation for intensifying global regulatory scrutiny over child safety and harmful content.

Apple is tightening its App Store entry rules, warning that it now reserves the right to remove even “random or anonymous chat” apps without prior notice.
The change expands the existing safety section of Apple's guidelines for developers publishing apps in the App Store, and it represents a significant shift in the company's approach to the random and anonymous nature of online connections.
Until now, the list of app categories that Apple could remove immediately from the store included pornography, physical threats and "chat roulette" services (apps that enable video conversations with random users).
Under the updated policy, the list now more broadly includes anonymous and “random” chat apps. According to the new guidelines, apps that allow anonymous conversations, prank calls or anonymous SMS and MMS messages “may be removed without prior notice,” unlike previous review processes that typically allowed a grace period to fix issues.
Apple’s standing policy states that user-generated content “poses unique challenges, from intellectual property violations to anonymous bullying.” To mitigate such risks, the company requires these apps to include mechanisms for reporting abusive content and filtering harmful material, and it explicitly details which types of apps may be removed without warning.
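Apple's guidelines name these requirements (reporting, filtering, blocking) but do not prescribe an API, so the following Swift sketch is purely illustrative: AbuseReport, ModerationService and their methods are hypothetical types invented here, not Apple frameworks, showing one minimal shape a report-and-block mechanism might take.

```swift
import Foundation

// Hypothetical sketch only: these types are not Apple APIs. The guideline
// requires a reporting mechanism, content filtering and user blocking,
// but leaves the implementation to the developer.
enum AbuseCategory: String, CaseIterable {
    case harassment, explicitContent, spam, impersonation
}

struct AbuseReport {
    let reporterID: UUID
    let offendingAuthorID: UUID
    let category: AbuseCategory
    let submittedAt = Date()
}

final class ModerationService {
    private var blockedUsers: Set<UUID> = []
    private var pendingReports: [AbuseReport] = []

    // Queue a report for automated or human review.
    func submit(_ report: AbuseReport) {
        pendingReports.append(report)
    }

    // Block a user so their content no longer reaches the reporter.
    func block(userID: UUID) {
        blockedUsers.insert(userID)
    }

    // Filter out content from blocked users before it is displayed.
    func visibleMessages(_ messages: [(authorID: UUID, text: String)]) -> [String] {
        messages
            .filter { !blockedUsers.contains($0.authorID) }
            .map { $0.text }
    }
}
```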
Apple has not disclosed what prompted the new addition, particularly since the previous guidelines already covered “chat roulette-style experiences.” The prevailing assumption is that the company is seeking to get ahead of expected enforcement measures from regulators worldwide, giving itself clearer discretion to remove apps preemptively.
Last October, Apple was forced to remove an app called ICEBlock from its App Store. The app functioned as a kind of “Waze for immigration agents,” allowing users to report and view in real time the locations of US Immigration and Customs Enforcement agents. The US administration argued that the app endangered law enforcement officers’ lives. After facing widespread criticism for the removal, Apple now appears to be working to establish clearer rules to justify similar actions in the future.
Last year, Apple and Google both removed the chat roulette app OmeTV from their app stores following claims, particularly from Australian authorities, that such apps endanger children. OmeTV enabled video chats with strangers worldwide, including interactions between adults and minors, without filtering, registration or clear identification of who was on the other side.
The OmeTV case set a precedent with global implications: responsibility lies with the platform itself, meaning the app store operator, even when the app’s developer does not respond to regulatory demands. Platform providers are required to proactively ensure that apps available in their stores include adequate safeguards against illegal and harmful content before damage occurs, not only in response to complaints. Regulators in the European Union and the UK are closely monitoring Australia’s enforcement model, raising the prospect that hundreds of apps could now face renewed scrutiny.
For developers, the new guidelines signal the end of the “move fast and break things” era in social app development. Strong content moderation, user verification systems and reliable age-verification mechanisms are no longer optional; they are mandatory. Developers are now expected to build strict safety measures from day one, rather than adding them after problems arise. Smaller developers and startups will need to factor additional costs into their business models.
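To make the age-verification point concrete, here is a deliberately minimal Swift sketch. It is an assumption, not a prescribed mechanism: neither Apple nor regulators mandate a specific method, and a self-declared date of birth like the one below would likely need to be backed by a stronger signal (ID checks, payment data) to count as "reliable" in practice.

```swift
import Foundation

// Illustrative age gate based on a self-reported date of birth, failing
// closed on malformed input. Real "reliable" verification would require
// a stronger signal than self-declaration.
struct AgeGate {
    let minimumAge: Int

    func isAllowed(dateOfBirth: DateComponents, calendar: Calendar = .current) -> Bool {
        guard let birthDate = calendar.date(from: dateOfBirth),
              let age = calendar.dateComponents([.year], from: birthDate, to: Date()).year
        else { return false } // fail closed
        return age >= minimumAge
    }
}

let gate = AgeGate(minimumAge: 18)
print(gate.isAllowed(dateOfBirth: DateComponents(year: 2010, month: 5, day: 1))) // false: a minor
```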
The message is clear: design with safety as a first principle, implement meaningful age verification and content management, and understand that regulation in one country can quickly shape policy worldwide.