According to the latest data analysis by the Center for Countering Digital Hate (CCDH), among the approximately 4.6 million image samples generated by Grok, as many as 65% (about 3 million) contained sexually suggestive depictions of men, women, or children. Approximately 23,000 images were identified as possibly involving child sexual content. This large-scale abuse stemmed from users discovering that they could prompt Grok to generate "nude photos" of real people or sexually objectify their pictures.

This incident has raised alarm internationally. Regulatory authorities in the UK, the US, India, and Malaysia have launched investigations.

