Technology

Rave removed from App Store: Apple dispute widens

Co-viewing app Rave says Apple locked users out of their accounts, revoked its certificates, and gave only vague reasons, even as Rave itself faces scrutiny over moderation.

A co-streaming app built around shared viewing and chat has become the center of a high-stakes legal fight with Apple, and the dispute goes far beyond a competing feature like SharePlay.

In August 2025, Apple removed Rave from the App Store. Rave says Apple did not provide its developers with a clear, specific explanation for the decision. The company also claims that, as a result of Apple's actions, a large number of users were effectively cut off from accounts they had used for years, including through the disabling of "Sign in with Apple" access for the app.

Rave's account also alleges that Apple revoked its developer certificate and that macOS then blocked the app with a malware-related warning. In Rave's telling, the company tried to resolve the situation through discussions with Apple, but received what it describes as only a vague clause rather than a concrete, actionable explanation.

Rave says Apple's rationale and timeline changed over time, and that the company was eventually issued a permanent removal notice with no further collaboration. The dispute now spans multiple jurisdictions: Rave is suing Apple in the United States, the Netherlands, Brazil, Russia, and its home country of Canada.

At the center of the broader narrative is competition. Rave's product overlaps with Apple's own co-viewing offering, commonly associated with SharePlay, and Rave has argued that Apple's "gatekeeper" power should not allow developers to be removed without fair process or accountability. Rave's lawsuit suggests it sees its removal as tied to platform leverage rather than solely to technical compliance concerns.

But Rave's claims also collide with a record of content moderation problems that predate the removal. The company faced long-running criticism for weak oversight of its streaming "rooms," where chat was described as running with little moderation, including situations where public viewing spaces were effectively unmoderated.

Rave's own terms of service require users to be at least 13, which raises the stakes when moderation fails. Under-moderated spaces could put minors at undue risk, and Apple could reasonably view that as a compliance issue. This is especially sensitive in environments where user-generated chat and media can shift quickly from ordinary conversation to harmful or illegal content.

Rave has also been accused of being a magnet for serious violations, including reports of child sexual abuse material (CSAM), pornography, gore, and related content. The app has also been described as hosting channels that encouraged users to leave the platform and funnel into external services such as Telegram and Signal, a pattern critics say aligns with illegal content distribution.

Beyond content reports, Rave has also faced security-related scrutiny. For a time, the app was reportedly considered a haven for scammers and bots, and another major company temporarily pulled Rave over security and malware concerns not long before Apple's August 2025 removal.

Since the takedown, Rave claims it has changed course on safety. The company says it has implemented new security and moderation measures, including AI-powered tools aimed at detecting grooming and predatory patterns in chat, age verification steps, hash matching against child safety databases, gore detection, and blocking links to external apps like Telegram.

Rave also points to anecdotal shifts in user feedback, saying that complaints about certain moderation problems have decreased over the past six months. At the same time, it reports a growing trend of users complaining about "unfair bans," suggesting that even after the security updates, disputes may be shifting from the presence of harmful content to enforcement behavior and the accuracy of automated or policy-based moderation.

A particularly contentious thread in the dispute is that Rave is still described as allowing streamed pornography even after its security update. Rave's critics and users, as described in the account, cite workarounds for age-verification failures and continue to discuss ways to view explicit content. That claim matters because it directly touches on Apple's long-running stance toward apps that enable pornographic material, and on the platform's expectations for moderating user-generated content.

Apple's decision also appears to be grounded in the principle that user-generated services require ongoing moderation, not just reactive fixes after enforcement. Even if Rave has improved parts of its moderation stack, the dispute suggests that a return to the App Store may be unlikely, especially when the app's functionality is so closely tied to streaming and chat, where harmful content can reappear without strong controls.

Meanwhile, Rave's legal complaint leans on a familiar argument developers raise when platforms wield removal power: that the gatekeeping role can discourage investment and innovation if decisions lack transparency or consistent standards. In Rave's view, even developers that try to comply may be left exposed when platform policies are applied unpredictably.

Whether that argument persuades a court will likely turn on what the legal process can establish about Apple's decision-making and what Rave can substantiate about the practical impact of the removal, especially its claims about locked accounts, disabled "Sign in with Apple," and certificate revocation. But the other side of the story is already clear in the record Rave's critics describe: for years, the app's moderation was widely viewed as insufficient for a platform that must manage user safety at scale.

For users, the dispute shows that shared viewing apps are not only a matter of features and interfaces. They are also a test of how quickly a service can prevent abuse, how consistently platforms enforce policies, and whether developers get the kind of clear, stable guidance needed to fix problems before they become a permanent exit from an app ecosystem. The outcome will shape not just Rave's fate, but how other creators of real-time, user-driven apps think about compliance, moderation, and platform risk.

