I think CSAM-related content is genuinely more of an issue on Meta than on X, and there's evidence out there that Meta's engagement algorithms may be guiding users toward increasingly questionable content.

I don't know why you think this is some kind of gotcha, other than Elon telling you it is. It's not as if Twitter doesn't have its own problems with child exploitation. Elon personally approved the reinstatement of a user who was banned for posting child sexual abuse material. You've conveniently overlooked that fact, even though it's been pointed out to you more than once.
That said, this is an issue Meta wants to fix, even if it seems utterly clueless about how.