Ministers warn platform could be blocked after Grok AI used to create sexual images without consent

  • FreedomAdvocate · 22 hours ago

    No chatbot has ever caused a death.

    There’s no such thing as child porn

    There definitely is, and it falls under CSAM. Pornographic material isn’t, by definition, limited to adults.