Elon Musk's Grok said on Friday that it was scrambling to fix flaws in the artificial intelligence tool after users claimed it turned pictures of children or women into erotic images.
"We've identified lapses in safeguards and are urgently fixing them," Grok said in a post on X, formerly Twitter.
"CSAM (Child Sexual Abuse Material) is illegal and prohibited," the post added.