A letter to a U.S. Senator has brought to light a previously undisclosed behind-the-scenes struggle between Apple and X.

The incident dates back to the beginning of this year. At the time, users discovered that Grok would readily comply with requests to "undress" the subjects of photos, with victims including women and even minors. The revelations sparked public outrage, and the public pressured Apple to remove the Grok and X apps from the App Store.

According to reports, Apple had already determined that X and Grok violated App Store policies and had privately warned them: make changes, or Grok would be removed. Apple then asked the X team to submit a plan for improving content moderation. X's first updated submission was rejected by Apple as insufficient; after a further revision and resubmission, only one of the apps passed review.

In the letter, Apple stated plainly that Grok initially still fell short of its requirements and was rejected outright, with a warning that further improvements were needed or the app would face removal. Only after the developers made adjustments and followed up did Apple acknowledge significant improvements in Grok and approve the latest version for release.

This behind-the-scenes struggle also explains the series of measures xAI later introduced: limiting image features for some users and tightening permissions for editing photos of real people. Behind these moves were clear signs of pressure from Apple.

However, the issue has not truly been resolved. In a follow-up report, NBC News noted that over the past month it had still recorded multiple cases in which Grok generated indecent images without the consent of the people depicted. Although the volume of such content has dropped significantly since January, some users have still found ways to bypass the restrictions, altering images of women to depict them in various revealing outfits.